AVAssetWriter woes

I'm trying to use AVAssetWriter to write CGImages to a file, to build a video out of still images.

I've gotten this working three different ways in the simulator, but every method fails on an iPhone 4 running iOS 4.3.

It all comes down to pixel buffers.

My first method was to create pixel buffers as needed, without using a pool. That works, but uses too much memory to run on the device.
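For reference, this is roughly what the pool-free approach looks like: one `CVPixelBufferCreate` per frame (a sketch, with illustrative dimensions; the exact attributes I used appear in the full code below):

    // Allocate a fresh pixel buffer for each frame -- simple, but every frame
    // pays a full allocation, which is what exhausts memory on the device.
    NSDictionary *attrs = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
        [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
        nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, 480, 320,
                                          kCVPixelFormatType_32ARGB,
                                          (CFDictionaryRef)attrs, &pxbuffer);
    if (status == kCVReturnSuccess) {
        // ...draw the CGImage into the buffer and append it, then:
        CVPixelBufferRelease(pxbuffer);
    }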

My second method was to use the recommended AVAssetWriterInputPixelBufferAdaptor, and then pull pixel buffers from the adaptor's pixelBufferPool with CVPixelBufferPoolCreatePixelBuffer.

That also works in the simulator, but fails on the device because the adaptor's pixel buffer pool is never allocated. I get no error message.

Finally, I tried creating my own pixel buffer pool with CVPixelBufferPoolCreate. That also works in the simulator, but on the device everything works fine until I try to append the pixel buffer with appendPixelBuffer, which fails every time.

I've found very little information on this on the web. I've based my code on the examples I did find, but no luck for days now. If anyone has experience doing this successfully with AVAssetWriter, please take a look and let me know if you see anything out of place.

Note: you'll see commented-out blocks of attempts.

First, the setup:

    - (BOOL)openVideoFile:(NSString *)path withSize:(CGSize)imageSize {
        size = CGSizeMake(480.0, 320.0); //imageSize;

        NSError *error = nil;
        videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                fileType:AVFileTypeQuickTimeMovie
                                                   error:&error];
        if (error != nil)
            return NO;

        NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithDouble:size.width], AVVideoCleanApertureWidthKey,
            [NSNumber numberWithDouble:size.height], AVVideoCleanApertureHeightKey,
            [NSNumber numberWithInt:10], AVVideoCleanApertureHorizontalOffsetKey,
            [NSNumber numberWithInt:10], AVVideoCleanApertureVerticalOffsetKey,
            nil];

        NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:1], AVVideoPixelAspectRatioHorizontalSpacingKey,
            [NSNumber numberWithInt:1], AVVideoPixelAspectRatioVerticalSpacingKey,
            nil];

        NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            //[NSNumber numberWithInt:960000], AVVideoAverageBitRateKey,
            //[NSNumber numberWithInt:1], AVVideoMaxKeyFrameIntervalKey,
            videoCleanApertureSettings, AVVideoCleanApertureKey,
            videoAspectRatioSettings, AVVideoPixelAspectRatioKey,
            //AVVideoProfileLevelH264Main31, AVVideoProfileLevelKey,
            nil];

        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
            AVVideoCodecH264, AVVideoCodecKey,
            codecSettings, AVVideoCompressionPropertiesKey,
            [NSNumber numberWithDouble:size.width], AVVideoWidthKey,
            [NSNumber numberWithDouble:size.height], AVVideoHeightKey,
            nil];

        writerInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:videoSettings] retain];

        NSMutableDictionary *bufferAttributes = [[NSMutableDictionary alloc] init];
        [bufferAttributes setObject:[NSNumber numberWithInt:kCVPixelFormatType_32ARGB]
                             forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
        [bufferAttributes setObject:[NSNumber numberWithInt:480]
                             forKey:(NSString *)kCVPixelBufferWidthKey];
        [bufferAttributes setObject:[NSNumber numberWithInt:320]
                             forKey:(NSString *)kCVPixelBufferHeightKey];

        //NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
        //    [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
        //[bufferAttributes setObject:[NSNumber numberWithInt:640]
        //                     forKey:(NSString *)kCVPixelBufferWidthKey];
        //[bufferAttributes setObject:[NSNumber numberWithInt:480]
        //                     forKey:(NSString *)kCVPixelBufferHeightKey];

        adaptor = [[AVAssetWriterInputPixelBufferAdaptor
                       assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                  sourcePixelBufferAttributes:nil] retain];

        //CVPixelBufferPoolCreate(kCFAllocatorSystemDefault, NULL,
        //                        (CFDictionaryRef)bufferAttributes, &pixelBufferPool);

        // Create buffer pool
        NSMutableDictionary *attributes = [NSMutableDictionary dictionary];
        int width = 480;
        int height = 320;
        [attributes setObject:[NSNumber numberWithInt:kCVPixelFormatType_32ARGB]
                       forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
        [attributes setObject:[NSNumber numberWithInt:width]
                       forKey:(NSString *)kCVPixelBufferWidthKey];
        [attributes setObject:[NSNumber numberWithInt:height]
                       forKey:(NSString *)kCVPixelBufferHeightKey];
        CVReturn theError = CVPixelBufferPoolCreate(kCFAllocatorDefault, NULL,
                                                    (CFDictionaryRef)attributes,
                                                    &pixelBufferPool);

        NSParameterAssert(writerInput);
        NSParameterAssert([videoWriter canAddInput:writerInput]);
        [videoWriter addInput:writerInput];
        writerInput.expectsMediaDataInRealTime = YES;

        // Start a session:
        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:kCMTimeZero];

        buffer = NULL;
        lastTime = kCMTimeZero;
        presentTime = kCMTimeZero;

        return YES;
    }

Next, the method that appends to the writer and the method that creates the pixel buffers to append:

    - (void)writeImageToMovie:(CGImageRef)image {
        if ([writerInput isReadyForMoreMediaData]) {
            //CMTime frameTime = CMTimeMake(1, 20);
            //CMTime lastTime = CMTimeMake(i, 20); //i is from 0 to 24 of the loop above
            //CMTime presentTime = CMTimeAdd(lastTime, frameTime);

            buffer = [self pixelBufferFromCGImage:image];
            BOOL success = [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
            if (!success)
                NSLog(@"Failed to appendPixelBuffer");
            CVPixelBufferRelease(buffer);

            presentTime = CMTimeAdd(lastTime, CMTimeMake(5, 1000));
            lastTime = presentTime;
        } else {
            NSLog(@"error - writerInput not ready");
        }
    }

    - (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image {
        CVPixelBufferRef pxbuffer;
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
            [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
            nil];

        if (pixelBufferPool == NULL)
            NSLog(@"pixelBufferPool is null!");

        CVReturn status = CVPixelBufferPoolCreatePixelBuffer(NULL, pixelBufferPool, &pxbuffer);
        /*if (pxbuffer == NULL) {
            CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                                  size.width, size.height,
                                                  kCVPixelFormatType_32ARGB,
                                                  (CFDictionaryRef)options, &pxbuffer);
        }*/
        //NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
        //NSParameterAssert(pxdata != NULL);

        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height,
                                                     8, 4 * size.width, rgbColorSpace,
                                                     kCGImageAlphaNoneSkipFirst);
        //NSParameterAssert(context);
        CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
        CGContextDrawImage(context,
                           CGRectMake(90, 10, CGImageGetWidth(image), CGImageGetHeight(image)),
                           image);
        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);

        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
        return pxbuffer;
    }

I found the solution to this problem.

If you want AVAudioPlayer and AVAssetWriter to behave correctly together, you must have an audio session category that is "mixable."

You can use a category that is mixable by default, like AVAudioSessionCategoryAmbient.
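Setting that category is a one-liner (a sketch using the standard AVAudioSession Objective-C API; error handling here is my addition):

    // Activate a mixable category so AVAssetWriter and AVAudioPlayer can coexist.
    NSError *sessionError = nil;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient
                                           error:&sessionError];
    if (sessionError != nil)
        NSLog(@"Failed to set audio session category: %@", sessionError);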

However, I needed to use AVAudioSessionCategoryPlayAndRecord.

You can make any category mixable by implementing this:

    OSStatus propertySetError = 0;
    UInt32 allowMixing = true;
    propertySetError = AudioSessionSetProperty(
        kAudioSessionProperty_OverrideCategoryMixWithOthers,  // 1
        sizeof(allowMixing),                                  // 2
        &allowMixing                                          // 3
    );

Well, first you need to pass some bufferAttributes when creating the adaptor object:

    NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
        nil];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:_videoWriterInput
                                   sourcePixelBufferAttributes:bufferAttributes];

Then remove the call to CVPixelBufferPoolCreate. A pixel buffer pool is already created inside the adaptor object, so just pull from it instead:

    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &pixelBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // ...fill the pixel buffer here

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    CMTime frameTime = CMTimeMake(frameCount, (int32_t)30);
    BOOL res = [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
    CVPixelBufferRelease(pixelBuffer);

I think that should do it. I once had a similar error, and I solved it by creating the adaptor and the pixel buffers this way.