AVAssetWriterInputPixelBufferAdaptor and CMTime

I'm writing some frames to a video with AVAssetWriterInputPixelBufferAdaptor, and the timing behavior isn't what I expect.

If I write just one frame:

    [videoWriter startSessionAtSourceTime:kCMTimeZero];
    [adaptor appendPixelBuffer:pxBuffer withPresentationTime:kCMTimeZero];

this gets me a video of length zero, which is what I expect.

But if I then add a second frame:

    // 3000/600 = 5 sec, right?
    CMTime nextFrame = CMTimeMake(3000, 600);
    [adaptor appendPixelBuffer:pxBuffer withPresentationTime:nextFrame];

I get ten seconds of video, where I'm expecting five.

What's going on here? Does withPresentationTime somehow set both the start and the duration of the frame?

Note that I'm not calling endSessionAtSourceTime, just finishWriting.
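A CMTime is just a rational number: value/timescale seconds, so CMTimeMake(3000, 600) really is 5 seconds. One plausible reading of the 10-second result (an assumption, not documented behavior) is that each appended frame lasts until the next frame's presentation time, and the final frame's duration is inferred from the previous inter-frame gap; a frame at 0 s followed by one at 5 s would then produce a 5 s + 5 s = 10 s movie. A minimal plain-C sketch of that rule, with a hypothetical `Time` struct and `movie_duration` helper standing in for CoreMedia:

```c
/* Hypothetical mirror of CMTime: value/timescale seconds. */
typedef struct {
    long long value;
    int timescale;
} Time;

static double time_seconds(Time t)
{
    return (double)t.value / t.timescale;
}

/* Sketch of the duration rule that would explain the question:
 * each frame lasts until the next frame's PTS, and the final frame
 * is assumed to last as long as the previous inter-frame gap. */
static double movie_duration(const Time *pts, int n)
{
    if (n < 2)
        return 0.0; /* a single frame yields a zero-length movie */

    double last_gap = time_seconds(pts[n - 1]) - time_seconds(pts[n - 2]);
    return time_seconds(pts[n - 1]) + last_gap; /* last PTS + inferred duration */
}
```

Under this rule, frames at {0/600, 3000/600} give a 10.0 s movie, matching what the question observes.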

Try taking a look at this example; reverse-engineering it, it adds 1 frame after 5 seconds…

Here is the sample code link: git@github.com:RudyAramayo/AVAssetWriterInputPixelBufferAdaptorSample.git

Here is the code you need:

    - (void)testCompressionSession
    {
        CGSize size = CGSizeMake(480, 320);
        NSString *betaCompressionDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
        NSError *error = nil;

        unlink([betaCompressionDirectory UTF8String]);

        //----initialize compression engine
        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
                                                               fileType:AVFileTypeQuickTimeMovie
                                                                  error:&error];
        NSParameterAssert(videoWriter);
        if (error)
            NSLog(@"error = %@", [error localizedDescription]);

        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                       [NSNumber numberWithInt:size.height], AVVideoHeightKey, nil];
        AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                             outputSettings:videoSettings];

        NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                               [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
        AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                                                                        sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
        NSParameterAssert(writerInput);
        NSParameterAssert([videoWriter canAddInput:writerInput]);

        if ([videoWriter canAddInput:writerInput])
            NSLog(@"I can add this input");
        else
            NSLog(@"i can't add this input");

        [videoWriter addInput:writerInput];

        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:kCMTimeZero];

        //---
        // insert demo debugging code to write the same image repeated as a movie

        CGImageRef theImage = [[UIImage imageNamed:@"Lotus.png"] CGImage];

        dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
        int __block frame = 0;

        [writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
            while ([writerInput isReadyForMoreMediaData])
            {
                if (++frame >= 120)
                {
                    [writerInput markAsFinished];
                    [videoWriter finishWriting];
                    [videoWriter release];
                    break;
                }

                CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:theImage size:size];
                if (buffer)
                {
                    if (![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, 20)])
                        NSLog(@"FAIL");
                    else
                        NSLog(@"Success:%d", frame);

                    CFRelease(buffer);
                }
            }
        }];
        NSLog(@"outside for loop");
    }

    - (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
    {
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
        CVPixelBufferRef pxbuffer = NULL;
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                                              kCVPixelFormatType_32ARGB, (CFDictionaryRef)options, &pxbuffer);
        // CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pxbuffer);
        NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
        NSParameterAssert(pxdata != NULL);

        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                     4 * size.width, rgbColorSpace,
                                                     kCGImageAlphaPremultipliedFirst);
        NSParameterAssert(context);
        CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);

        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

        return pxbuffer;
    }
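In the loop above, each frame is stamped CMTimeMake(frame, 20), i.e. a fixed 20 fps schedule; note also that the pre-increment `++frame` means frame 0 is never appended, so the first appended frame lands at 1/20 s and the last one (119, since the block stops at 120) at about 5.95 s. A quick plain-C check of that timeline, using a hypothetical `frame_pts_seconds` helper rather than CoreMedia:

```c
/* Sketch of the CMTimeMake(frame, 20) schedule used in the block above:
 * presentation time in seconds = frame / timescale (here, 20 fps). */
static double frame_pts_seconds(int frame, int timescale)
{
    return (double)frame / timescale;
}
```

For example, frame 20 lands at exactly 1 second and frame 40 at exactly 2 seconds.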

Have you tried using this as your first call?

    CMTime t = CMTimeMake(0, 600);
    [videoWriter startSessionAtSourceTime:t];
    [adaptor appendPixelBuffer:pxBuffer withPresentationTime:t];