Confused about which object actually contains the captured image when using AVFoundation

I have a photo app that uses AVFoundation. So far everything works fine.

However, one thing that confuses me is: which object actually contains the captured image?

I have been NSLogging all of the objects and some of their properties, and I still can't figure out where the captured image lives.

Here is my code for setting up the capture session:

    self.session = [[AVCaptureSession alloc] init];
    [self.session setSessionPreset:AVCaptureSessionPresetPhoto];

    self.inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.inputDevice error:&error];
    if ([self.session canAddInput:self.deviceInput]) {
        [self.session addInput:self.deviceInput];
    }

    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    self.rootLayer = [[self view] layer];
    [self.rootLayer setMasksToBounds:YES];
    [self.previewLayer setFrame:CGRectMake(0, 0, self.rootLayer.bounds.size.width, self.rootLayer.bounds.size.height)];
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [self.rootLayer insertSublayer:self.previewLayer atIndex:0];

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [self.session addOutput:self.stillImageOutput];

    [self.session startRunning];

And here is my code for capturing a still image when the user presses the capture button:

    - (IBAction)stillImageCapture {
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
            for (AVCaptureInputPort *port in [connection inputPorts]) {
                if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) { break; }
        }
        // Set the orientation only after the connection has been found;
        // assigning it while videoConnection is still nil silently does nothing.
        videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;

        [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
            completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                [self.session stopRunning];
        }];
    }

When the user presses the capture button and the code above runs, the captured image is successfully displayed on the iPhone screen, but I can't tell which object is actually holding the captured image.

Thanks for the help.

The CMSampleBuffer is what actually contains the captured image.

In your captureStillImageAsynchronouslyFromConnection completion handler, you'll want something like this:

    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *capturedImage = [[UIImage alloc] initWithData:imageData];
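It's also worth guarding against a failed capture before decoding; a minimal sketch of the same handler body with a guard (the nil/error checks are my addition, not part of the original answer):

    if (error || imageSampleBuffer == NULL) {
        // The capture failed or was cancelled; there is no image to decode.
        NSLog(@"Still image capture failed: %@", error);
        return;
    }
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
    UIImage *capturedImage = [[UIImage alloc] initWithData:imageData];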

My working implementation:

    - (void)captureStillImage {
        @try {
            AVCaptureConnection *videoConnection = nil;
            for (AVCaptureConnection *connection in _stillImageOutput.connections) {
                for (AVCaptureInputPort *port in [connection inputPorts]) {
                    if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                        videoConnection = connection;
                        break;
                    }
                }
                if (videoConnection) { break; }
            }

            NSLog(@"About to request a capture from: %@", [self stillImageOutput]);
            [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
                    // This is here for when we need to implement Exif stuff.
                    //CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);

                    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

                    // Create a UIImage from the sample buffer data
                    _capturedImage = [[UIImage alloc] initWithData:imageData];

                    BOOL autoSave = YES;
                    if (autoSave) {
                        UIImageWriteToSavedPhotosAlbum(_capturedImage, self,
                            @selector(image:didFinishSavingWithError:contextInfo:), nil);
                    }
            }];
        }
        @catch (NSException *exception) {
            NSLog(@"ERROR: Unable to capture still image from AVFoundation camera: %@", exception);
        }
    }
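If you ever need the raw pixels rather than JPEG bytes, the sample buffer can also hand you its underlying pixel buffer. A minimal sketch (this is my own addition for illustration, assuming it runs inside the same completion handler, where imageSampleBuffer is in scope):

    // The CMSampleBuffer wraps the image data; CMSampleBufferGetImageBuffer
    // exposes it as a CVImageBufferRef (a CVPixelBuffer for camera frames).
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
    if (pixelBuffer != NULL) {
        size_t width  = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);
        NSLog(@"Captured pixel buffer: %zu x %zu", width, height);
    }

This is the same object the JPEG representation is derived from, which is why the answer says the CMSampleBuffer is where the captured image actually lives.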