Getting the camera preview with AVCaptureVideoPreviewLayer

I am trying to get the camera input to display in a preview layer view.

self.cameraPreviewView is bound to a UIView in IB.
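For reference, the properties this setup assumes might be declared as in the following sketch; the exact names and attributes are not shown in the original post, so treat them as assumptions:

    // Assumed declarations (not in the original post): the view wired up in
    // Interface Builder and a strong reference to the preview layer.
    @property (nonatomic, weak) IBOutlet UIView *cameraPreviewView;
    @property (nonatomic, strong) AVCaptureVideoPreviewLayer *cameraPreviewLayer;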

Here is my latest code, pieced together from the AV Foundation Programming Guide, but the preview never shows up:

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetHigh;

    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        NSLog(@"Couldn't create video capture device");
    }
    [session addInput:input];

    // Create video preview layer and add it to the UI
    AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    UIView *view = self.cameraPreviewView;
    CALayer *viewLayer = [view layer];
    newCaptureVideoPreviewLayer.frame = view.bounds;
    [viewLayer addSublayer:newCaptureVideoPreviewLayer];
    self.cameraPreviewLayer = newCaptureVideoPreviewLayer;
    [session startRunning];

So, after some trial and error and a look at Apple's AVCam sample code, I wrapped the preview layer code and the session's startRunning call in a Grand Central Dispatch block like this, and it started working:

    dispatch_async(dispatch_get_main_queue(), ^{
        AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
        UIView *view = self.cameraPreviewView;
        CALayer *viewLayer = [view layer];
        newCaptureVideoPreviewLayer.frame = view.bounds;
        [viewLayer addSublayer:newCaptureVideoPreviewLayer];
        self.cameraPreviewLayer = newCaptureVideoPreviewLayer;
        [session startRunning];
    });
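A possible refinement (my adaptation, not part of the original answer): -[AVCaptureSession startRunning] is a blocking call, and Apple's documentation suggests starting the session off the main queue while keeping the layer and view work on it. A sketch under that assumption:

    // Variant sketch (assumption, not from the original answer): do the layer
    // work on the main queue, but start the session on a background queue,
    // since -startRunning blocks until the session is running.
    dispatch_async(dispatch_get_main_queue(), ^{
        AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
        previewLayer.frame = self.cameraPreviewView.bounds;
        [self.cameraPreviewView.layer addSublayer:previewLayer];
        self.cameraPreviewLayer = previewLayer;
    });
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
        [session startRunning];
    });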

Here is my code; it works perfectly for me, so you can use it as a reference:

    - (void)initCapture {
        AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:nil];
        if (!captureInput) {
            return;
        }

        AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
        /* frames are delivered to the captureOutput:didOutputSampleBuffer:fromConnection: delegate method */
        [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

        NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
        NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
        [captureOutput setVideoSettings:videoSettings];

        self.captureSession = [[AVCaptureSession alloc] init];

        NSString *preset = nil;
        if (!preset) {
            preset = AVCaptureSessionPresetMedium;
        }
        self.captureSession.sessionPreset = preset;

        if ([self.captureSession canAddInput:captureInput]) {
            [self.captureSession addInput:captureInput];
        }
        if ([self.captureSession canAddOutput:captureOutput]) {
            [self.captureSession addOutput:captureOutput];
        }

        // Handle the preview layer
        if (!self.captureVideoPreviewLayer) {
            self.captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        }

        // If you want to adjust the preview layer frame, do it here
        self.captureVideoPreviewLayer.frame = self.view.bounds;
        self.captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [self.view.layer addSublayer:self.captureVideoPreviewLayer];

        [self.captureSession startRunning];
    }
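The comment in initCapture refers to the sample buffer delegate callback, which the answer does not show. A minimal sketch of that AVCaptureVideoDataOutputSampleBufferDelegate method could look like the following; the body is a placeholder assumption, not the answerer's actual processing code:

    // Sketch of the delegate callback referenced in initCapture. The class is
    // assumed to conform to <AVCaptureVideoDataOutputSampleBufferDelegate>.
    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        // Each frame arrives as a 32BGRA pixel buffer (per the videoSettings above).
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (!pixelBuffer) {
            return;
        }
        // Placeholder: replace with real per-frame processing.
        NSLog(@"Received a %zux%zu frame",
              CVPixelBufferGetWidth(pixelBuffer),
              CVPixelBufferGetHeight(pixelBuffer));
    }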