Unable to capture a photo with AVCapturePhotoOutput, Swift + Xcode

I am working on a custom camera app, and the tutorial I am following uses AVCaptureStillImageOutput, which is deprecated in iOS 10. I have set up the camera, and I am now stuck on how to take the photo.

Here is the full view controller where I have the camera:

    import UIKit
    import AVFoundation

    var cameraPos = "back"

    class View3: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {

        @IBOutlet weak var clickButton: UIButton!
        @IBOutlet var cameraView: UIView!
        var session: AVCaptureSession?
        var stillImageOutput: AVCapturePhotoOutput?
        var videoPreviewLayer: AVCaptureVideoPreviewLayer?

        override func viewDidLoad() {
            super.viewDidLoad()
        }

        override func didReceiveMemoryWarning() {
            super.didReceiveMemoryWarning()
        }

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            clickButton.center.x = cameraView.bounds.width/2
            loadCamera()
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
        }

        @IBAction func clickCapture(_ sender: UIButton) {
            if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
                // This is where I need help
            }
        }

        @IBAction func changeDevice(_ sender: UIButton) {
            if cameraPos == "back" { cameraPos = "front" } else { cameraPos = "back" }
            loadCamera()
        }

        func loadCamera() {
            session?.stopRunning()
            videoPreviewLayer?.removeFromSuperlayer()

            session = AVCaptureSession()
            session!.sessionPreset = AVCaptureSessionPresetPhoto

            var backCamera = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera, mediaType: AVMediaTypeVideo, position: .front)
            if cameraPos == "back" {
                backCamera = AVCaptureDevice.defaultDevice(withDeviceType: .builtInWideAngleCamera, mediaType: AVMediaTypeVideo, position: .back)
            }

            var error: NSError?
            var input: AVCaptureDeviceInput!
            do {
                input = try AVCaptureDeviceInput(device: backCamera)
            } catch let error1 as NSError {
                error = error1
                input = nil
                print(error!.localizedDescription)
            }

            if error == nil && session!.canAddInput(input) {
                session!.addInput(input)

                stillImageOutput = AVCapturePhotoOutput()

                if session!.canAddOutput(stillImageOutput) {
                    session!.addOutput(stillImageOutput)

                    videoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
                    videoPreviewLayer?.frame = cameraView.bounds
                    videoPreviewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
                    videoPreviewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait
                    cameraView.layer.addSublayer(videoPreviewLayer!)

                    session!.startRunning()
                }
            }
        }
    }

This is the part where I need help:

    @IBAction func clickCapture(_ sender: UIButton) {
        if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
            // This is where I need help
        }
    }

I have gone through the answer here on how to use AVCapturePhotoOutput, but I don't understand how to incorporate that code into mine, since it involves declaring a new class.

You are almost there.

For output as AVCapturePhotoOutput

Check out the AVCapturePhotoOutput documentation for more help.

These are the steps to capture a photo.

  1. Create an AVCapturePhotoOutput object. Use its properties to determine supported capture settings and to enable certain features (for example, whether to capture Live Photos); see the sketch after this list.
  2. Create and configure an AVCapturePhotoSettings object to choose features and settings for a specific capture (for example, whether to enable image stabilization or flash).
  3. Capture an image by passing your photo settings object to the capturePhoto(with:delegate:) method along with a delegate object implementing the AVCapturePhotoCaptureDelegate protocol. The photo capture output then calls your delegate to notify you of significant events during the capture process.
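
For step 1, here is a minimal sketch of what creating the output and checking its capabilities might look like, reusing the session and stillImageOutput names from the question's code and the Swift 3 / iOS 10 API; enabling Live Photo capture is only an illustration of an optional feature, not something the question requires:

    stillImageOutput = AVCapturePhotoOutput()

    if session!.canAddOutput(stillImageOutput) {
        session!.addOutput(stillImageOutput)

        // Query the output's capabilities before enabling optional features;
        // Live Photo capture is just one example of such a feature
        if stillImageOutput!.isLivePhotoCaptureSupported {
            stillImageOutput!.isLivePhotoCaptureEnabled = true
        }
    }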

Put the following code in your clickCapture method, and don't forget to conform to and implement the delegate in your class; a sketch of that delegate method follows the code below.

    let settings = AVCapturePhotoSettings()
    let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
    let previewFormat = [
        kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
        kCVPixelBufferWidthKey as String: 160,
        kCVPixelBufferHeightKey as String: 160
    ]
    settings.previewPhotoFormat = previewFormat
    self.stillImageOutput?.capturePhoto(with: settings, delegate: self)
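
The call above only starts the capture; the photo itself is delivered through the delegate. As a rough sketch, assuming the View3 class and stillImageOutput property from the question and the Swift 3 / iOS 10 delegate signature, the conformance could look like this (how you use the resulting image is up to you):

    extension View3: AVCapturePhotoCaptureDelegate {

        func capture(_ captureOutput: AVCapturePhotoOutput,
                     didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                     previewPhotoSampleBuffer: CMSampleBuffer?,
                     resolvedSettings: AVCaptureResolvedPhotoSettings,
                     bracketSettings: AVCaptureBracketedStillImageSettings?,
                     error: Error?) {

            if let error = error {
                print(error.localizedDescription)
                return
            }

            // Convert the delivered sample buffer into JPEG data, then into a UIImage
            if let sampleBuffer = photoSampleBuffer,
               let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer,
                                                                                previewPhotoSampleBuffer: previewPhotoSampleBuffer),
               let image = UIImage(data: imageData) {
                // Use the captured image here, e.g. display it or save it
                print("Captured image with size \(image.size)")
            }
        }
    }

On iOS 11 and later this callback was superseded by photoOutput(_:didFinishProcessingPhoto:error:), which hands you an AVCapturePhoto instead of a sample buffer.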

For output as AVCaptureStillImageOutput

If you intend to capture a photo from the video connection, you can follow the steps below.

Step 1: Get the connection

    if let videoConnection = stillImageOutput!.connectionWithMediaType(AVMediaTypeVideo) {
        // ...
        // Code for photo capture goes here...
    }

Step 2: Capture the photo

  • Call the captureStillImageAsynchronouslyFromConnection function on stillImageOutput.
  • sampleBuffer represents the data that was captured.

    stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
        // ...
        // Process the image data (sampleBuffer) here to get an image file we can put in our captureImageView
    })

Step 3: Process the image data

  • We will need to take a few steps to process the image data found in sampleBuffer in order to end up with a UIImage that we can insert into our captureImageView and easily use elsewhere in the app.

    if sampleBuffer != nil {
        let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
        let dataProvider = CGDataProviderCreateWithCFData(imageData)
        let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
        let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
        // ...
        // Add the image to captureImageView here...
    }

Step 4: Save the image

Save the image to the photo library or display it in an image view, depending on your needs.
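
As a small sketch of step 4, assuming image is the UIImage produced in step 3 and captureImageView is the image view mentioned above (saving to the photo library also requires the photo-library usage description key in Info.plist):

    // Show the captured image in the image view
    captureImageView.image = image

    // ...or write it straight to the photo library
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)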


For more details, check out the Creating a Custom Camera View guide, under "Taking a Photo".