Capture an image from an AVCaptureSession in Swift

I've created an AVCaptureSession to capture video output and display it to the user through a UIView. Now I want to be able to tap a button (the takePhoto method) and show the session's image in a UIImageView. I tried iterating over each device connection and saving the output, but that didn't work. My code is below.

let captureSession = AVCaptureSession()
var stillImageOutput: AVCaptureStillImageOutput!
@IBOutlet var imageView: UIImageView!
@IBOutlet var cameraView: UIView!

// If we find a device we'll store it here for later use
var captureDevice: AVCaptureDevice?

override func viewDidLoad() {
    // Do any additional setup after loading the view, typically from a nib.
    super.viewDidLoad()
    println("I AM AT THE CAMERA")
    captureSession.sessionPreset = AVCaptureSessionPresetLow
    self.captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    if captureDevice != nil {
        beginSession()
    }
}

func beginSession() {
    self.stillImageOutput = AVCaptureStillImageOutput()
    self.captureSession.addOutput(self.stillImageOutput)

    var err: NSError? = nil
    self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))
    if err != nil {
        println("error: \(err?.localizedDescription)")
    }

    var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    self.cameraView.layer.addSublayer(previewLayer)
    previewLayer?.frame = self.cameraView.layer.frame
    captureSession.startRunning()
}

@IBAction func takePhoto(sender: UIButton) {
    self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)) {
        (buffer: CMSampleBuffer!, error: NSError!) -> Void in
        var image = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
        var data_image = UIImage(data: image)
        self.imageView.image = data_image
    }
}
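As a side note on the snippet itself: it adds the input and output unconditionally and relies on the still-image output's default format. One low-risk thing to try is requesting JPEG explicitly (which is what jpegStillImageNSDataRepresentation expects) and guarding the add calls with canAddInput/canAddOutput so a failure shows up in the log. A minimal sketch in the same Swift 1.x style; the helper name configureCapturePipeline is made up here, and the property names simply mirror the code above:

// Hypothetical helper, not part of the original question: configure the output
// explicitly for JPEG and guard the add calls so failures are visible.
func configureCapturePipeline() {
    self.stillImageOutput = AVCaptureStillImageOutput()
    // Ask for JPEG explicitly so jpegStillImageNSDataRepresentation gets a JPEG buffer.
    self.stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

    if self.captureSession.canAddOutput(self.stillImageOutput) {
        self.captureSession.addOutput(self.stillImageOutput)
    } else {
        println("could not add still image output")
    }

    var err: NSError? = nil
    let input = AVCaptureDeviceInput(device: self.captureDevice, error: &err)
    if err == nil && self.captureSession.canAddInput(input) {
        self.captureSession.addInput(input)
    } else {
        println("error: \(err?.localizedDescription)")
    }
}

If either add call fails, you at least see it in the console instead of ending up with an empty image later.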

Before adding the input and output to the session and starting it, you should try moving that work onto a separate thread. Apple's documentation states:

Important: The startRunning method is a blocking call which can take some time, therefore you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive). See AVCam for iOS for the canonical implementation example.

Try using dispatch in your session-creation method, like below:

dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), {
    // 1 — configure the session off the main queue
    var err: NSError? = nil
    self.captureSession.addOutput(self.stillImageOutput)
    self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))
    self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    if err != nil {
        println("error: \(err?.localizedDescription)")
    }

    var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    previewLayer?.frame = self.cameraView.layer.bounds
    previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill

    dispatch_async(dispatch_get_main_queue(), {
        // 2 — back on the main queue for UI work
        // 3 — attach the preview layer and start the session
        self.cameraView.layer.addSublayer(previewLayer)
        self.captureSession.startRunning()
    })
})
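Note that dispatch_get_global_queue returns a concurrent queue, whereas the documentation quoted above (and Apple's AVCam sample) talks about a serial queue, and the snippet still calls startRunning on the main queue. If you want to follow that wording more literally, a rough variant looks like the sketch below; it reuses the question's property names, the queue label is purely illustrative, and passing nil as the queue attribute creates a serial queue:

// Rough sketch of the serial-queue variant described in the documentation quote.
// Not the original answer's code; the property names mirror the question.
let sessionQueue = dispatch_queue_create("session queue", nil) // nil attribute => serial queue

dispatch_async(sessionQueue, {
    var err: NSError? = nil
    self.stillImageOutput = AVCaptureStillImageOutput()
    self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto
    self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))
    self.captureSession.addOutput(self.stillImageOutput)
    if err != nil {
        println("error: \(err?.localizedDescription)")
    }

    // Blocking call, now kept off the main queue.
    self.captureSession.startRunning()

    dispatch_async(dispatch_get_main_queue(), {
        // Layer and view work goes back to the main queue.
        let previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
        previewLayer.frame = self.cameraView.layer.bounds
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        self.cameraView.layer.addSublayer(previewLayer)
    })
})

The important part is that session configuration and startRunning stay on one serial queue, while addSublayer and any other UIKit work stay on the main queue.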