Take photo with custom camera in Swift 3

In Swift 2.3, I used this code to take a photo with a custom camera:

    func didPressTakePhoto() {
        if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
            stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
                if sampleBuffer != nil {
                    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                    let dataProvider = CGDataProviderCreateWithCFData(imageData)
                    let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
                    let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
                    self.captureImageView.image = image
                }
            })
        }
    }

But this line:

    stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in

gives this error:

    Value of type 'AVCapturePhotoOutput' has no member 'captureStillImageAsynchronouslyFromConnection'

I tried to fix the problem myself, but I just kept getting more and more errors, which is why I am posting my original code.

Does anyone know how to get my code working again?

Thank you.

You can use AVCapturePhotoOutput like this in Swift 3.

You need an AVCapturePhotoCaptureDelegate, which returns the CMSampleBuffer.

You can also get a preview image if you set the previewPhotoFormat on your AVCapturePhotoSettings.

    import AVFoundation
    import UIKit

    class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

        let cameraOutput = AVCapturePhotoOutput()

        func capturePhoto() {
            let settings = AVCapturePhotoSettings()
            // Ask for a small preview image alongside the full-size photo.
            let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
            let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                 kCVPixelBufferWidthKey as String: 160,
                                 kCVPixelBufferHeightKey as String: 160]
            settings.previewPhotoFormat = previewFormat
            self.cameraOutput.capturePhoto(with: settings, delegate: self)
        }

        func capture(_ captureOutput: AVCapturePhotoOutput,
                     didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                     previewPhotoSampleBuffer: CMSampleBuffer?,
                     resolvedSettings: AVCaptureResolvedPhotoSettings,
                     bracketSettings: AVCaptureBracketedStillImageSettings?,
                     error: Error?) {
            if let error = error {
                print(error.localizedDescription)
            }

            if let sampleBuffer = photoSampleBuffer,
                let previewBuffer = previewPhotoSampleBuffer,
                let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
                print(UIImage(data: dataImage)?.size ?? CGSize.zero)
            }
        }
    }
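The class above assumes that cameraOutput has already been added to a configured, running AVCaptureSession; the answer does not show that wiring, so here is a minimal sketch of one way it might look (Swift 3 era API; the setupSession name and captureHandler property are illustrative, not from the original):

    import AVFoundation

    let session = AVCaptureSession()
    let captureHandler = CameraCaptureOutput() // retained so we can trigger capturePhoto() later

    func setupSession() {
        session.sessionPreset = AVCaptureSessionPresetPhoto

        // Attach the back camera as input.
        if let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
            let input = try? AVCaptureDeviceInput(device: camera),
            session.canAddInput(input) {
            session.addInput(input)
        }

        // Attach the photo output so capturePhoto(with:delegate:) has a session to run in.
        if session.canAddOutput(captureHandler.cameraOutput) {
            session.addOutput(captureHandler.cameraOutput)
        }

        session.startRunning()
    }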

Thanks to Sharpkits, I found my solution (this code works for me):

    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        if let error = error {
            print(error.localizedDescription)
        }

        if let sampleBuffer = photoSampleBuffer,
            let previewBuffer = previewPhotoSampleBuffer,
            let _ = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {

            // Re-encode the JPEG without the embedded preview, then decode it into a CGImage.
            let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: nil)
            let dataProvider = CGDataProvider(data: imageData! as CFData)
            let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .absoluteColorimetric)
            let image = UIImage(cgImage: cgImageRef!, scale: 1.0, orientation: .right)

            // Crop to a square, scale down, and show the result.
            let croppedImage = self.cropToSquare(image: image)
            let newImage = self.scaleImageWith(croppedImage, and: CGSize(width: 600, height: 600))
            print(UIScreen.main.bounds.width)
            self.tempImageView.image = newImage
            self.tempImageView.isHidden = false
        }
    }
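cropToSquare(image:) and scaleImageWith(_:and:) are the asker's own helpers and were never posted; a plausible sketch of what they might do (assumed implementations, not from the original answer):

    func cropToSquare(image: UIImage) -> UIImage {
        // Assumed: crop around the center to the shorter edge.
        let side = min(image.size.width, image.size.height)
        let x = (image.size.width - side) / 2.0
        let y = (image.size.height - side) / 2.0
        let cropRect = CGRect(x: x * image.scale, y: y * image.scale,
                              width: side * image.scale, height: side * image.scale)
        guard let cgImage = image.cgImage?.cropping(to: cropRect) else { return image }
        return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
    }

    func scaleImageWith(_ image: UIImage, and size: CGSize) -> UIImage {
        // Assumed: redraw the image into the target size.
        UIGraphicsBeginImageContextWithOptions(size, false, 0.0)
        image.draw(in: CGRect(origin: .zero, size: size))
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage ?? image
    }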

Great code. Thanks a lot for the help and the examples.

To clarify for slower folks like me: the capture(_: ...) delegate method is invoked behind the scenes by the call to self.cameraOutput.capturePhoto(with: settings, delegate: self) in your takePhoto (or whatever you call it) method. You never call the capture method directly; it happens automatically.
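In code, that flow looks roughly like this (a sketch; the takePhoto action and cameraOutput names are assumptions following the answer above):

    // You only ever call capturePhoto(with:delegate:) yourself.
    @IBAction func takePhoto(_ sender: UIButton) {
        let settings = AVCapturePhotoSettings()
        cameraOutput.capturePhoto(with: settings, delegate: self)
        // AVFoundation then calls capture(_:didFinishProcessingPhotoSampleBuffer:...)
        // on the delegate automatically once the photo has been processed.
    }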