How to capture an image without displaying a preview in iOS

I want to capture an image at specific moments, for example when a button is pressed, but I do not want to show any video preview screen. I guess captureStillImageAsynchronouslyFromConnection is what I need for this. Currently, I can capture an image if I display the video preview. However, if I remove the code that shows the preview, the app crashes with the following output:

    *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason:
    '*** -[AVCaptureStillImageOutput captureStillImageAsynchronouslyFromConnection:completionHandler:] - inactive/invalid connection passed.'
    *** First throw call stack:
    (0x336ee8bf 0x301e21e5 0x3697c35d 0x34187 0x33648435 0x310949eb 0x310949a7 0x31094985
     0x310946f5 0x3109502d 0x3109350f 0x31092f01 0x310794ed 0x31078d2d 0x37db7df3 0x336c2553
     0x336c24f5 0x336c1343 0x336444dd 0x336443a5 0x37db6fcd 0x310a7743 0x33887 0x3382c)
    terminate called throwing an exception (lldb)

So here is my implementation:

BIDViewController.h:

    #import <UIKit/UIKit.h>
    #import <AVFoundation/AVFoundation.h>

    @interface BIDViewController : UIViewController
    {
        AVCaptureStillImageOutput *stillImageOutput;
    }

    @property (strong, nonatomic) IBOutlet UIView *videoPreview;

    - (IBAction)doCap:(id)sender;

    @end

The relevant parts of BIDViewController.m:

 #import "BIDViewController.h" @interface BIDViewController () @end @implementation BIDViewController @synthesize capturedIm; @synthesize videoPreview; - (void)viewDidLoad { [super viewDidLoad]; [self setupAVCapture]; } - (BOOL)setupAVCapture { NSError *error = nil; AVCaptureSession *session = [AVCaptureSession new]; [session setSessionPreset:AVCaptureSessionPresetHigh]; /* AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session]; captureVideoPreviewLayer.frame = self.videoPreview.bounds; [self.videoPreview.layer addSublayer:captureVideoPreviewLayer]; */ // Select a video device, make an input AVCaptureDevice *backCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:backCamera error:&error]; if (error) return NO; if ([session canAddInput:input]) [session addInput:input]; // Make a still image output stillImageOutput = [AVCaptureStillImageOutput new]; NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil]; [stillImageOutput setOutputSettings:outputSettings]; if ([session canAddOutput:stillImageOutput]) [session addOutput:stillImageOutput]; [session startRunning]; return YES; } - (IBAction)doCap:(id)sender { AVCaptureConnection *videoConnection = nil; for (AVCaptureConnection *connection in stillImageOutput.connections) { for (AVCaptureInputPort *port in [connection inputPorts]) { if ([[port mediaType] isEqual:AVMediaTypeVideo] ) { videoConnection = connection; break; } } if (videoConnection) { break; } } [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *__strong error) { // Do something with the captured image }]; } 

With the code above, the app crashes as soon as doCap is called. On the other hand, if I uncomment the following lines in setupAVCapture:

    /*
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer =
        [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = self.videoPreview.bounds;
    [self.videoPreview.layer addSublayer:captureVideoPreviewLayer];
    */

then it works without any problem.

In short, my question is: how can I capture images at controlled moments without displaying a preview?

I use the following code to capture from the front-facing camera (if one is available) or, failing that, from the back camera. It works on my iPhone 4S.

    - (void)viewDidLoad
    {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetMedium;

        AVCaptureDevice *device = [self frontFacingCameraIfAvailable];

        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (!input) {
            // Handle the error appropriately.
            NSLog(@"ERROR: trying to open camera: %@", error);
        }
        [session addInput:input];

        // stillImageOutput is a global variable in the .h file:
        // "AVCaptureStillImageOutput *stillImageOutput;"
        stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                        AVVideoCodecJPEG, AVVideoCodecKey, nil];
        [stillImageOutput setOutputSettings:outputSettings];
        [session addOutput:stillImageOutput];

        [session startRunning];
    }

    - (AVCaptureDevice *)frontFacingCameraIfAvailable
    {
        NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        AVCaptureDevice *captureDevice = nil;

        for (AVCaptureDevice *device in videoDevices) {
            if (device.position == AVCaptureDevicePositionFront) {
                captureDevice = device;
                break;
            }
        }

        // Couldn't find one on the front, so just get the default video device.
        if (!captureDevice) {
            captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        }

        return captureDevice;
    }

    - (IBAction)captureNow
    {
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in stillImageOutput.connections) {
            for (AVCaptureInputPort *port in [connection inputPorts]) {
                if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) {
                break;
            }
        }

        NSLog(@"about to request a capture from: %@", stillImageOutput);
        [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
            completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
                CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
                if (exifAttachments) {
                    // Do something with the attachments if you want to.
                    NSLog(@"attachements: %@", exifAttachments);
                } else {
                    NSLog(@"no attachments");
                }

                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                self.vImage.image = image;
            }];
    }
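If you never want the captured image to appear on screen at all, the completion handler does not have to assign to an image view. Here is a minimal sketch (not part of the original answer) that writes the JPEG data to the app's Documents directory instead; the file name capture.jpg is just an arbitrary example.

    // Inside the captureStillImageAsynchronouslyFromConnection: completion handler,
    // instead of assigning to self.vImage.image:
    NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

    // Pick a destination path in the app's Documents directory (example file name).
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *filePath = [[paths objectAtIndex:0] stringByAppendingPathComponent:@"capture.jpg"];

    // Write the JPEG bytes straight to disk; nothing is ever rendered on screen.
    BOOL saved = [imageData writeToFile:filePath atomically:YES];
    NSLog(@"saved capture to %@: %@", filePath, saved ? @"YES" : @"NO");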

Well, I ran into a similar problem where captureStillImageAsynchronouslyFromConnection:stillImageConnection raised an exception saying that the connection passed was invalid. Later on, I found that the problem was fixed when I kept the session and stillImageOutput retained in properties.
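As a rough illustration of that fix (my sketch, not the answerer's code, assuming ARC): declare the session and the still-image output as strong properties so they are not deallocated once the setup method returns, which is what leaves the output with an inactive/invalid connection.

    // Keep strong references so the session and output outlive setupAVCapture.
    @interface BIDViewController ()
    @property (strong, nonatomic) AVCaptureSession *session;
    @property (strong, nonatomic) AVCaptureStillImageOutput *stillImageOutput;
    @end

    // In setupAVCapture, assign to the properties instead of to local variables:
    //     self.session = [AVCaptureSession new];
    //     ...
    //     self.stillImageOutput = [AVCaptureStillImageOutput new];
    //     ...
    //     [self.session startRunning];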
