Using AVCaptureStillImageOutput, is there any way to shorten the time between shots?

I'm currently taking a series of images with the following code:

```objectivec
- (void)shootSeries:(int)photos {
    if (photos == 0) {
        [self mergeImages];
    } else {
        [output captureStillImageAsynchronouslyFromConnection:connection
                                            completionHandler:
            ^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                NSLog(@"Shot picture %d.", 7 - photos);
                [self shootSeries:(photos - 1)];

                CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);
                CVPixelBufferLockBaseAddress(pixelBuffer, 0);
                int dataSize = CVPixelBufferGetDataSize(pixelBuffer);
                CFDataRef data = CFDataCreate(NULL,
                                              (const UInt8 *)CVPixelBufferGetBaseAddress(pixelBuffer),
                                              dataSize);
                CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

                CGDataProviderRef dataProvider = CGDataProviderCreateWithCFData(data);
                CFRelease(data);
                CGImageRef image = CGImageCreate(CVPixelBufferGetWidth(pixelBuffer),
                                                 CVPixelBufferGetHeight(pixelBuffer),
                                                 8, 32,
                                                 CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 colorspace,
                                                 kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                                                 dataProvider, NULL, true,
                                                 kCGRenderingIntentDefault);
                CFRelease(dataProvider);

                CFArrayAppendValue(shotPictures, image);
                CFRelease(image);
            }];
    }
}
```

While this works quite well, it is slow. How do apps like ClearCam manage to shoot pictures faster than this, and how can I do the same?

After capturing each image, store the sample buffer in a CFArray; once all of the photos have been taken, convert them to images (or, in your case, CGImageRefs). Deferring the conversion keeps the per-shot work minimal.
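The approach above can be sketched as follows. This is a minimal sketch, not a drop-in replacement: the method name `shootFast:`, the `sampleBuffers` array, and the `convertBuffers` helper are all illustrative names, not from the original post.

```objectivec
// Sketch: retain each sample buffer during the burst and defer the
// expensive CGImage conversion until all shots are done.
// `output`, `connection`, `sampleBuffers`, and `convertBuffers` are
// assumed to exist elsewhere in the class (hypothetical names).
- (void)shootFast:(int)photosRemaining {
    if (photosRemaining == 0) {
        // Hypothetical helper: walk sampleBuffers and build CGImageRefs here,
        // after the burst, using the same CGImageCreate code as in the question.
        [self convertBuffers];
        return;
    }
    [output captureStillImageAsynchronouslyFromConnection:connection
                                        completionHandler:
        ^(CMSampleBufferRef sampleBuffer, NSError *error) {
            // Retain the buffer so it outlives this completion handler.
            // Only this cheap bookkeeping happens between shots.
            CFRetain(sampleBuffer);
            CFArrayAppendValue(sampleBuffers, sampleBuffer);
            [self shootFast:(photosRemaining - 1)];
        }];
}
```

Remember to CFRelease each retained buffer after converting it, and note that holding many uncompressed sample buffers at once can use a lot of memory on older devices.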