How to take a photo with AVCaptureStillImageOutput

I have a preview layer that is pulling from the camera and working as it should. I want to be able to take a picture when I press a button. I have initialized an AVCaptureStillImageOutput like this:

```objective-c
AVCaptureStillImageOutput *avCaptureImg = [[AVCaptureStillImageOutput alloc] init];
```

Then I am trying to take a picture using this object:

```objective-c
[avCaptureImg captureStillImageAsynchronouslyFromConnection:(AVCaptureConnection *)
                                          completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
}];
```

I need help on how to take the photo and save it in a variable. Thanks!

You need to make sure you define an AVCaptureVideoPreviewLayer and add it to the view's layer:

```objective-c
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[self.view.layer addSublayer:captureVideoPreviewLayer];
```

This will hook into your AVCaptureDeviceInput.
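As a minimal sketch of that wiring (variable names here are illustrative, not from the original), the preview layer and the camera input both attach to the same AVCaptureSession:

```objective-c
// Minimal sketch: create a session, attach the default camera as input,
// and connect a preview layer to that same session.
AVCaptureSession *session = [[AVCaptureSession alloc] init];

NSError *error = nil;
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (input) {
    [session addInput:input];
}

AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
previewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:previewLayer];

[session startRunning];
```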

Here is the complete solution:

```objective-c
/////////////////////////////////////////////////
////
//// Utility to find front camera
////
/////////////////////////////////////////////////
- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;

    for (AVCaptureDevice *device in videoDevices) {
        if (device.position == AVCaptureDevicePositionFront) {
            captureDevice = device;
            break;
        }
    }

    // Couldn't find one on the front, so just get the default video device.
    if (!captureDevice) {
        captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    return captureDevice;
}

/////////////////////////////////////////////////
////
//// Setup Session, attach Video Preview Layer
//// and Capture Device, start running session
////
/////////////////////////////////////////////////
- (void)setupCaptureSession
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [self.view.layer addSublayer:captureVideoPreviewLayer];

    NSError *error = nil;
    AVCaptureDevice *device = [self frontFacingCameraIfAvailable];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }
    [session addInput:input];

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                    AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:self.stillImageOutput];

    [session startRunning];
}

/////////////////////////////////////////////////
////
//// Method to capture Still Image from
//// Video Preview Layer
////
/////////////////////////////////////////////////
- (void)captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", self.stillImageOutput);
    __weak typeof(self) weakSelf = self;
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                       completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [weakSelf displayImage:image];
    }];
}
```

For the Swift version:

```swift
@IBAction func capture(sender: AnyObject) {
    if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (buffer: CMSampleBuffer!, error: NSError!) -> Void in
            if let exifAttachments = CMGetAttachment(buffer, kCGImagePropertyExifDictionary, nil) {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
                self.previewImage.image = UIImage(data: imageData)
                UIImageWriteToSavedPhotosAlbum(self.previewImage.image, nil, nil, nil)
            }
        })
    }
}
```
Here is a variant that returns the captured image through success/error callbacks. Note that the error must be checked inside the completion handler, since the capture runs asynchronously:

```objective-c
- (void)captureImage:(NSString *)string
     successCallback:(void (^)(id))successCallback
       errorCallback:(void (^)(NSString *))errorCallback
{
    __block UIImage *image;
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    [videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (error) {
            errorCallback(@"error");
            return;
        }

        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            // Do something with the attachments.
        }

        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        image = [[UIImage alloc] initWithData:imageData];
        successCallback(image);
        //UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    }];
}
```

Not sure why I didn't see this one sooner:

iPhone SDK 4 AVFoundation – How to use captureStillImageAsynchronouslyFromConnection correctly?

Adam's answer there works great!