Memory leak in CMSampleBufferGetImageBuffer

I get a UIImage from a CMSampleBufferRef video buffer every N video frames, like this:

    - (void)imageFromVideoBuffer:(void(^)(UIImage* image))completion {
        CMSampleBufferRef sampleBuffer = _myLastSampleBuffer;
        if (sampleBuffer != nil) {
            CFRetain(sampleBuffer);
            CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
            _lastAppendedVideoBuffer.sampleBuffer = nil;
            if (_context == nil) {
                _context = [CIContext contextWithOptions:nil];
            }
            CVPixelBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            CGImageRef cgImage = [_context createCGImage:ciImage
                                                fromRect:CGRectMake(0, 0, CVPixelBufferGetWidth(buffer), CVPixelBufferGetHeight(buffer))];
            __block UIImage *image = [UIImage imageWithCGImage:cgImage];

            CGImageRelease(cgImage);
            CFRelease(sampleBuffer);

            if (completion) completion(image);
            return;
        }
        if (completion) completion(nil);
    }
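A minimal call-site sketch, assuming the completion simply hands the image to the UI (the previewImageView property here is illustrative, not from the original code):

    // Hypothetical call site: grab a UIImage every N frames and hand it to the UI.
    [self imageFromVideoBuffer:^(UIImage *image) {
        if (image == nil) { return; }             // no buffer was available
        dispatch_async(dispatch_get_main_queue(), ^{
            self.previewImageView.image = image;  // previewImageView is an assumed property
        });
    }];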

Xcode and Instruments detect a memory leak, but I can't get rid of it. I release the CGImageRef and the CMSampleBufferRef as usual:

    CGImageRelease(cgImage);
    CFRelease(sampleBuffer);
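For reference, these releases follow the standard Core Foundation ownership conventions: Get-style accessors return references you do not own, while create calls (and your own CFRetain) must be balanced. A quick sketch of the pairing, assuming the same objects as above:

    // Get Rule: CMSampleBufferGetImageBuffer returns a reference we do NOT own -> no release needed.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    (void)pixelBuffer;

    // Create Rule: createCGImage:fromRect: returns a reference we DO own -> must CGImageRelease it.
    CGImageRef cgImage = [_context createCGImage:ciImage fromRect:[ciImage extent]];
    CGImageRelease(cgImage);

    // A CFRetain we added ourselves must be balanced by exactly one CFRelease.
    CFRetain(sampleBuffer);
    /* ... use the buffer ... */
    CFRelease(sampleBuffer);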

[UPDATE] I get the sampleBuffer from the AVCapture output callback:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        if (captureOutput == _videoOutput) {
            _lastVideoBuffer.sampleBuffer = sampleBuffer;
            id<CIImageRenderer> imageRenderer = _CIImageRenderer;

            dispatch_async(dispatch_get_main_queue(), ^{
                @autoreleasepool {
                    CIImage *ciImage = nil;
                    ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
                    if (_context == nil) {
                        _context = [CIContext contextWithOptions:nil];
                    }
                    CGImageRef processedCGImage = [_context createCGImage:ciImage
                                                                 fromRect:[ciImage extent]];
                    //UIImage *image = [UIImage imageWithCGImage:processedCGImage];
                    CGImageRelease(processedCGImage);
                    NSLog(@"Captured image %@", ciImage);
                }
            });
        }
    }
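One side note, independent of the Core Image leak: AVFoundation only guarantees the sample buffer for the duration of this delegate callback, so using it inside the dispatched block is fragile. A hedged sketch of retaining it across the async hop, keeping the rest of the pipeline unchanged:

    // Keep the buffer alive across the async dispatch, then balance the retain inside the block.
    CFRetain(sampleBuffer);
    dispatch_async(dispatch_get_main_queue(), ^{
        @autoreleasepool {
            CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)];
            /* ... createCGImage / NSLog as before ... */
            NSLog(@"Captured image %@", ciImage);
            CFRelease(sampleBuffer);
        }
    });

Holding sample buffers for too long can starve the capture pipeline's buffer pool, so the retain should be kept as short-lived as possible.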

The code that leaks is the createCGImage:fromRect: call:

 CGImageRef processedCGImage = [_context createCGImage:ciImage fromRect:[ciImage extent]]; 

even with an @autoreleasepool, a CGImageRelease of the CGImage reference, and a CIContext stored as an instance property.

This seems to be the same problem addressed here: Can't save CIImage to file on iOS without memory leaks

[UPDATE] The leak seems to be due to a bug. The issue is well described in Memory leak on CIContext createCGImage at iOS 9?

A sample project shows how to reproduce this leak: http://www.osamu.co.jp/DataArea/VideoCameraTest.zip

The last comments there assure that

It looks like they fixed this in 9.1b3. If anyone needs a workaround that works on iOS 9.0.x, I was able to get it working with this:

In test code (Objective-C in this case):

    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                        completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (error) return;

        __block NSString *filePath = [NSTemporaryDirectory() stringByAppendingPathComponent:
                                      [NSString stringWithFormat:@"ipdf_pic_%i.jpeg", (int)[NSDate date].timeIntervalSince1970]];

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

        dispatch_async(dispatch_get_main_queue(), ^{
            @autoreleasepool {
                CIImage *enhancedImage = [CIImage imageWithData:imageData];
                if (!enhancedImage) return;

                static CIContext *ctx = nil;
                if (!ctx) ctx = [CIContext contextWithOptions:nil];

                CGImageRef imageRef = [ctx createCGImage:enhancedImage
                                                fromRect:enhancedImage.extent
                                                  format:kCIFormatBGRA8
                                              colorSpace:nil];

                UIImage *image = [UIImage imageWithCGImage:imageRef scale:1.0 orientation:UIImageOrientationRight];

                [[NSFileManager defaultManager] createFileAtPath:filePath
                                                        contents:UIImageJPEGRepresentation(image, 0.8)
                                                      attributes:nil];
                CGImageRelease(imageRef);
            }
        });
    }];

and the workaround for iOS 9.0 should be:

    extension CIContext {
        func createCGImage_(image: CIImage, fromRect: CGRect) -> CGImage {
            let width = Int(fromRect.width)
            let height = Int(fromRect.height)

            let rawData = UnsafeMutablePointer<UInt8>.alloc(width * height * 4)
            render(image, toBitmap: rawData, rowBytes: width * 4, bounds: fromRect,
                   format: kCIFormatRGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())
            let dataProvider = CGDataProviderCreateWithData(nil, rawData, height * width * 4) { info, data, size in
                UnsafeMutablePointer<UInt8>(data).dealloc(size)
            }
            return CGImageCreate(width, height, 8, 32, width * 4, CGColorSpaceCreateDeviceRGB(),
                                 CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue),
                                 dataProvider, nil, false, .RenderingIntentDefault)!
        }
    }
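For an Objective-C pipeline like the one in the question, a rough equivalent of the same render-to-bitmap workaround might look like the following; this is only a sketch, and the function names are illustrative:

    static void ReleaseBitmapData(void *info, const void *data, size_t size) {
        free((void *)data);
    }

    // Renders the CIImage into a malloc'ed RGBA buffer and wraps it in a CGImage,
    // bypassing createCGImage:fromRect:. The caller owns the returned image (Create Rule).
    static CGImageRef CreateCGImageWithoutLeak(CIContext *context, CIImage *image, CGRect rect) {
        size_t width  = (size_t)CGRectGetWidth(rect);
        size_t height = (size_t)CGRectGetHeight(rect);
        size_t bytesPerRow = width * 4;

        void *rawData = malloc(height * bytesPerRow);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

        [context render:image
               toBitmap:rawData
               rowBytes:bytesPerRow
                 bounds:rect
                 format:kCIFormatRGBA8
             colorSpace:colorSpace];

        CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, rawData, height * bytesPerRow, ReleaseBitmapData);
        CGImageRef cgImage = CGImageCreate(width, height, 8, 32, bytesPerRow, colorSpace,
                                           (CGBitmapInfo)kCGImageAlphaPremultipliedLast,
                                           provider, NULL, false, kCGRenderingIntentDefault);
        CGDataProviderRelease(provider);
        CGColorSpaceRelease(colorSpace);
        return cgImage;
    }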

We were experiencing a similar issue in an app we created, where we process each frame for feature keypoints with OpenCV and send a frame off every couple of seconds. After running for a while we would end up with quite a few memory pressure messages.

We managed to correct this by running our processing code in its own autorelease pool, like so (jpegDataFromSampleBufferAndCrop does something similar to what you are doing, with added cropping):

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        @autoreleasepool {
            if ([self.lastFrameSentAt timeIntervalSinceNow] < -kContinuousRateInSeconds) {
                NSData *imageData = [self jpegDataFromSampleBufferAndCrop:sampleBuffer];
                if (imageData) {
                    [self processImageData:imageData];
                }
                self.lastFrameSentAt = [NSDate date];
                imageData = nil;
            }
        }
    }
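jpegDataFromSampleBufferAndCrop: itself is not shown here; purely as a hypothetical reconstruction of its shape (the cropRect and ciContext properties are assumed), such a helper could look something like:

    // Hypothetical sketch: convert the sample buffer to cropped JPEG data via Core Image.
    - (NSData *)jpegDataFromSampleBufferAndCrop:(CMSampleBufferRef)sampleBuffer {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (pixelBuffer == NULL) return nil;

        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        CIImage *cropped = [ciImage imageByCroppingToRect:self.cropRect];   // cropRect is assumed

        if (self.ciContext == nil) {
            self.ciContext = [CIContext contextWithOptions:nil];
        }
        CGImageRef cgImage = [self.ciContext createCGImage:cropped fromRect:cropped.extent];
        if (cgImage == NULL) return nil;

        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        return UIImageJPEGRepresentation(image, 0.8);
    }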

I can confirm that this memory leak still exists on iOS 9.2. (I have also posted about it on the Apple Developer Forums.)

I get the same memory leak on iOS 9.2. I have tested dropping the EAGLContext by using MetalKit and MTLDevice, and I have tested the different CIContext methods like drawImage, createCGImage and render, but nothing seems to work.
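For reference, the Metal-backed context mentioned above can be created like this on iOS 9; this just shows the API that was tested, not a fix for the leak:

    #import <Metal/Metal.h>
    #import <CoreImage/CoreImage.h>

    // CIContext backed by the default Metal device instead of an EAGLContext.
    id<MTLDevice> device = MTLCreateSystemDefaultDevice();
    if (device != nil) {
        _context = [CIContext contextWithMTLDevice:device];
    }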

It is clearly a bug in iOS 9. Try it out yourself by downloading the sample app from Apple (see below), running the project on a device with iOS 8.4 and then on a device with iOS 9.2, and watching the memory gauge in Xcode.

Download https://developer.apple.com/library/ios/samplecode/AVBasicVideoOutput/Introduction/Intro.html#//apple_ref/doc/uid/DTS40013109

Add this to APLEAGLView.h:20:

 @property (strong, nonatomic) CIContext* ciContext; 

Replace APLEAGLView.m:118 with this:

    [EAGLContext setCurrentContext:_context];
    _ciContext = [CIContext contextWithEAGLContext:_context];

And finally replace APLEAGLView.m:341-343 with this:

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

    @autoreleasepool {
        CIImage* sourceImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        CIFilter* filter = [CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, sourceImage, nil];
        CIImage* filteredImage = filter.outputImage;

        [_ciContext render:filteredImage toCVPixelBuffer:pixelBuffer];
    }

    glBindRenderbuffer(GL_RENDERBUFFER, _colorBufferHandle);