iOS - memory allocated by CVPixelBufferCreate is not released correctly when turning images into a video

I am turning an array of images into a video, but the app always crashes with a memory warning: too much memory is allocated by CVPixelBufferCreate, and I don't know how to handle it correctly. I have seen many similar topics, but none of them solved my problem.


Here is my code:

- (void)writeImagesArray:(NSArray *)array asMovie:(NSString *)path
{
    NSError *error = nil;
    UIImage *first = [array objectAtIndex:0];
    CGSize frameSize = first.size;

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithDouble:frameSize.width], AVVideoWidthKey,
                                   [NSNumber numberWithDouble:frameSize.height], AVVideoHeightKey,
                                   nil];
    AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                         outputSettings:videoSettings];
    self.adaptor = [AVAssetWriterInputPixelBufferAdaptor
                    assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                               sourcePixelBufferAttributes:nil];
    [videoWriter addInput:writerInput];

    // Start the session.
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    int frameCount = 0;
    CVPixelBufferRef buffer = NULL;
    for (UIImage *img in array) {
        buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
        if (self.adaptor.assetWriterInput.readyForMoreMediaData) {
            CMTime frameTime = CMTimeMake(frameCount, FPS);
            [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
        }
        // Note: if the input is not ready, this frame is silently dropped.
        if (buffer) CVPixelBufferRelease(buffer);
        frameCount++;
    }

    [writerInput markAsFinished];
    [videoWriter finishWritingWithCompletionHandler:^{
        if (videoWriter.status == AVAssetWriterStatusFailed) {
            NSLog(@"Movie save failed.");
        } else {
            NSLog(@"Movie saved.");
        }
    }];
    NSLog(@"Finished."); // the closing quote was missing in the original
}

- (CVPixelBufferRef)newPixelBufferFromCGImage:(CGImageRef)image andFrameSize:(CGSize)frameSize
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          frameSize.width,
                                          frameSize.height,
                                          kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef)options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGBitmapInfo bitmapInfo = (CGBitmapInfo)kCGImageAlphaNoneSkipFirst;
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata,
                                                 frameSize.width,
                                                 frameSize.height,
                                                 8,
                                                 4 * frameSize.width,
                                                 rgbColorSpace,
                                                 bitmapInfo);
    NSParameterAssert(context);

    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context,
                       CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)),
                       image);

    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
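As an aside on the code above: every frame gets a brand-new allocation from CVPixelBufferCreate. A pixel buffer adaptor can instead recycle buffers through its pixelBufferPool, which is usually cheaper. This is only a minimal sketch of that variant, assuming the adaptor is created with explicit sourcePixelBufferAttributes (the pool is NULL before startWriting, and may not be usable if the attributes were nil):

// Sketch only: create the adaptor with explicit attributes so it
// owns a matching pixel buffer pool.
NSDictionary *attributes = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32ARGB),
    (id)kCVPixelBufferWidthKey           : @(frameSize.width),
    (id)kCVPixelBufferHeightKey          : @(frameSize.height)
};
self.adaptor = [AVAssetWriterInputPixelBufferAdaptor
                assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                           sourcePixelBufferAttributes:attributes];

// Per frame, after [videoWriter startWriting] has been called:
CVPixelBufferRef buffer = NULL;
CVReturn result = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     self.adaptor.pixelBufferPool,
                                                     &buffer);
if (result == kCVReturnSuccess && buffer != NULL) {
    // ... lock the buffer and draw the CGImage into it, as in
    // newPixelBufferFromCGImage:andFrameSize: above ...
    CMTime frameTime = CMTimeMake(frameCount, FPS);
    [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
    CVPixelBufferRelease(buffer); // hands the buffer back to the pool
}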

Update:

I split my video into smaller parts and added [NSThread sleepForTimeInterval:0.00005]; inside the loop. The memory is just magically released.

However, this line makes my UI freeze for several seconds. Is there a better solution?

 for (UIImage *img in array) {
     buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];
     //CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);
     if (adaptor.assetWriterInput.readyForMoreMediaData) {
         CMTime frameTime = CMTimeMake(frameCount, FPS);
         [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
     }
     if (buffer) CVPixelBufferRelease(buffer);
     frameCount++;
     [NSThread sleepForTimeInterval:0.00005];
 }
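A sleep in the loop only paces the writer; it is not a real back-pressure mechanism. AVFoundation's intended pull model is requestMediaDataWhenReadyOnQueue:usingBlock:, where the input asks for frames on a background queue whenever it can take more, keeping the main thread free. A minimal sketch, assuming the same videoWriter, writerInput, adaptor, array, frameSize, and FPS as in the original method (the queue label is a placeholder):

// Sketch only: replace the manual for-loop + sleep with the pull model.
dispatch_queue_t mediaQueue = dispatch_queue_create("com.example.imagewriter", DISPATCH_QUEUE_SERIAL);
__block NSUInteger frameCount = 0;
[writerInput requestMediaDataWhenReadyOnQueue:mediaQueue usingBlock:^{
    // The block is re-invoked whenever the input can accept more data,
    // so feed frames until it says stop or we run out of images.
    while (writerInput.readyForMoreMediaData && frameCount < array.count) {
        @autoreleasepool {
            UIImage *img = array[frameCount];
            CVPixelBufferRef buffer = [self newPixelBufferFromCGImage:[img CGImage]
                                                         andFrameSize:frameSize];
            CMTime frameTime = CMTimeMake((int64_t)frameCount, FPS);
            [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
            CVPixelBufferRelease(buffer);
            frameCount++;
        }
    }
    if (frameCount >= array.count) {
        [writerInput markAsFinished];
        [videoWriter finishWritingWithCompletionHandler:^{
            NSLog(@"Movie saved.");
        }];
    }
}];

With this shape, the writer applies back-pressure through readyForMoreMediaData and all encoding work happens on mediaQueue, so no sleep is needed and the UI stays responsive.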

This is the memory usage now:

[Screenshot: memory graph in Instruments]

From a quick review of the code I can't see anything wrong with the management of the CVPixelBuffer itself. I think the source of your problem is more likely the array of UIImages.

A UIImage does not decode its underlying image data into memory until you request its CGImage property or draw it, so the memory footprint of not-yet-used images is low. Your enumeration asks every image for its CGImage property and never gets rid of the decoded data, which would explain the steadily growing allocations.
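One low-risk way to test this theory is to drain the temporaries created for each frame with an explicit @autoreleasepool, so decoded image data can be freed per iteration instead of piling up until the loop ends. A sketch of the question's loop with only that change (same variables as in the question):

// Sketch: drain autoreleased objects once per frame rather than
// once per run-loop pass.
for (UIImage *img in array) {
    @autoreleasepool {
        CVPixelBufferRef buffer = [self newPixelBufferFromCGImage:[img CGImage]
                                                     andFrameSize:frameSize];
        if (self.adaptor.assetWriterInput.readyForMoreMediaData) {
            CMTime frameTime = CMTimeMake(frameCount, FPS);
            [self.adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
        }
        if (buffer) CVPixelBufferRelease(buffer);
        frameCount++;
    }
}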

If you don't need the images afterwards, you can do this:

  [images enumerateObjectsUsingBlock:^(UIImage * _Nonnull img, NSUInteger idx, BOOL * _Nonnull stop) {
      CVPixelBufferRef pixelBuffer = [self pixelBufferFromCGImage:img.CGImage
                                                        frameSize:[VDVideoEncodeConfig globalConfig].size];
      CMTime frameTime = CMTimeMake(frameCount, (int32_t)[VDVideoEncodeConfig globalConfig].frameRate);
      frameCount++;
      [_assetRW appendNewSampleBuffer:pixelBuffer pst:frameTime];
      CVPixelBufferRelease(pixelBuffer);

      // This releases the memory: the image's CGImageRef is what causes
      // the growth you see in Instruments.
      images[idx] = [NSNull null];
  }];
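Note that images[idx] = [NSNull null] only compiles if images is an NSMutableArray; replacing each element drops the array's strong reference to the UIImage, so the decoded bitmap behind its CGImage can actually be deallocated.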