AVCaptureStillImageOutput vs AVCapturePhotoOutput in Swift 3

I simply want to put a live camera view in my view controller.

I have AVFoundation imported at the top, and my class conforms to UIImagePickerControllerDelegate and UINavigationControllerDelegate.

However, whenever I try to use AVCaptureStillImageOutput, Xcode tells me it was deprecated in iOS 10 and that I should use AVCapturePhotoOutput instead. That would be perfectly fine, except that as soon as I try to access stillImageOutput.outputSettings, outputSettings itself is unavailable. So I have to use AVCaptureStillImageOutput for it to work at all, but I get multiple warnings because this class was deprecated in iOS 10.

I have searched and searched but couldn't really find a solution. I would sincerely appreciate your help. I'm learning, so any explanation would be great! The code is below.

    import UIKit
    import AVFoundation

    class CameraView: UIViewController, UIImagePickerControllerDelegate, UINavigationControllerDelegate {

        var captureSession: AVCaptureSession?
        var stillImageOutput: AVCaptureStillImageOutput?
        var previewLayer: AVCaptureVideoPreviewLayer?

        @IBOutlet var cameraView: UIView!

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)

            captureSession = AVCaptureSession()
            captureSession?.sessionPreset = AVCaptureSessionPreset1920x1080

            let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

            do {
                let input = try AVCaptureDeviceInput(device: backCamera)
                if captureSession?.canAddInput(input) == true {
                    captureSession?.addInput(input)
                    stillImageOutput = AVCaptureStillImageOutput()
                    stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
                    if captureSession?.canAddOutput(stillImageOutput) == true {
                        captureSession?.addOutput(stillImageOutput)
                        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                        previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
                        previewLayer?.connection.videoOrientation = .portrait
                        cameraView.layer.addSublayer(previewLayer!)
                        captureSession?.startRunning()
                    }
                }
            } catch {
                print(error)
            }
        }
    }

AVCaptureStillImageOutput being deprecated means you can keep using it in iOS 10, but:

  • Apple makes no promises about how long past iOS 10 it will remain available.
  • As new hardware and software features are added in iOS 10 and beyond, you won't have access to all of them. For example, you can set up AVCaptureStillImageOutput for wide-color capture, but wide color is much easier with AVCapturePhotoOutput. And for RAW capture or Live Photos, AVCapturePhotoOutput is the only game in town.
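To make that last point concrete, here is a minimal sketch of requesting a RAW capture, which only AVCapturePhotoOutput supports. This uses the iOS 10 / Swift 3-era API, and assumes a `photoOutput` already attached to a running session inside a view controller that conforms to AVCapturePhotoCaptureDelegate:

```swift
import AVFoundation

// Sketch only: RAW capture is exclusive to AVCapturePhotoOutput;
// AVCaptureStillImageOutput has no equivalent.
func captureRAW(from photoOutput: AVCapturePhotoOutput,
                delegate: AVCapturePhotoCaptureDelegate) {
    // In the Swift 3 SDK this property is a [NSNumber] of OSType values.
    if let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first {
        let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat.uint32Value)
        photoOutput.capturePhoto(with: settings, delegate: delegate)
    }
}
```

The delegate then receives the DNG data in the RAW-specific callback rather than the JPEG one.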

If you're happy to keep using it despite the deprecation, then your problem isn't that outputSettings was removed; it is still there.

Something to be aware of in beta 6 and later (though it turns out not to be an issue here): APIs that use NSDictionary without explicit key and value types come into Swift 3 as [AnyHashable: Any], and the Foundation or CoreFoundation types you might use in such a dictionary are no longer implicitly bridged to Swift types. (Some of the other questions about beta 6 dictionary conversions might point you in the right direction there.)
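When that bridging change does bite, the usual workaround is to give the dictionary an explicit Swift type rather than leaning on implicit conversion. A small sketch, assuming `stillImageOutput` is declared as in the question:

```swift
import AVFoundation

// Sketch: an explicitly typed [String: Any] sidesteps the
// [AnyHashable: Any] bridging changes introduced in Swift 3 beta 6.
var stillImageOutput: AVCaptureStillImageOutput? = AVCaptureStillImageOutput()
let jpegSettings: [String: Any] = [AVVideoCodecKey: AVVideoCodecJPEG]
stillImageOutput?.outputSettings = jpegSettings
```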

However, I don't get any compile errors for setting outputSettings, either in your full code or when reducing it to the essential parts for that line:

    var stillImageOutput: AVCaptureStillImageOutput?
    stillImageOutput = AVCaptureStillImageOutput()
    stillImageOutput?.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

The only warnings I see are about the deprecation itself.

Here is my complete implementation:

    import UIKit
    import AVFoundation

    class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

        var captureSesssion: AVCaptureSession!
        var cameraOutput: AVCapturePhotoOutput!
        var previewLayer: AVCaptureVideoPreviewLayer!

        @IBOutlet weak var capturedImage: UIImageView!
        @IBOutlet weak var previewView: UIView!

        override func viewDidLoad() {
            super.viewDidLoad()

            captureSesssion = AVCaptureSession()
            captureSesssion.sessionPreset = AVCaptureSessionPresetPhoto
            cameraOutput = AVCapturePhotoOutput()

            let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
            if let input = try? AVCaptureDeviceInput(device: device) {
                if captureSesssion.canAddInput(input) {
                    captureSesssion.addInput(input)
                    if captureSesssion.canAddOutput(cameraOutput) {
                        captureSesssion.addOutput(cameraOutput)
                        previewLayer = AVCaptureVideoPreviewLayer(session: captureSesssion)
                        previewLayer.frame = previewView.bounds
                        previewView.layer.addSublayer(previewLayer)
                        captureSesssion.startRunning()
                    }
                } else {
                    print("issue here : captureSesssion.canAddInput")
                }
            } else {
                print("some problem here")
            }
        }

        // Take picture button
        @IBAction func didPressTakePhoto(_ sender: UIButton) {
            let settings = AVCapturePhotoSettings()
            let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
            let previewFormat = [
                kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                kCVPixelBufferWidthKey as String: 160,
                kCVPixelBufferHeightKey as String: 160
            ]
            settings.previewPhotoFormat = previewFormat
            cameraOutput.capturePhoto(with: settings, delegate: self)
        }

        // Callback from taking a picture
        func capture(_ captureOutput: AVCapturePhotoOutput,
                     didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                     previewPhotoSampleBuffer: CMSampleBuffer?,
                     resolvedSettings: AVCaptureResolvedPhotoSettings,
                     bracketSettings: AVCaptureBracketedStillImageSettings?,
                     error: Error?) {
            if let error = error {
                print("error occurred: \(error.localizedDescription)")
            }
            if let sampleBuffer = photoSampleBuffer,
               let previewBuffer = previewPhotoSampleBuffer,
               let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
                print(UIImage(data: dataImage)?.size as Any)

                let dataProvider = CGDataProvider(data: dataImage as CFData)
                let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
                let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right)

                self.capturedImage.image = image
            } else {
                print("some error here")
            }
        }

        // Use this method wherever you need to know the camera permission state
        func askPermission() {
            print("here")
            let cameraPermissionStatus = AVCaptureDevice.authorizationStatus(forMediaType: AVMediaTypeVideo)

            switch cameraPermissionStatus {
            case .authorized:
                print("Already Authorized")
            case .denied:
                print("denied")
                let alert = UIAlertController(title: "Sorry :(",
                                              message: "But could you please grant permission for camera within device settings",
                                              preferredStyle: .alert)
                let action = UIAlertAction(title: "Ok", style: .cancel, handler: nil)
                alert.addAction(action)
                present(alert, animated: true, completion: nil)
            case .restricted:
                print("restricted")
            default:
                AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo, completionHandler: { [weak self] (granted: Bool) -> Void in
                    if granted {
                        // User granted
                        print("User granted")
                        DispatchQueue.main.async {
                            // Do something that you need on the main thread
                        }
                    } else {
                        // User rejected
                        print("User Rejected")
                        DispatchQueue.main.async {
                            let alert = UIAlertController(title: "WHY?",
                                                          message: "Camera it is the main feature of our application",
                                                          preferredStyle: .alert)
                            let action = UIAlertAction(title: "Ok", style: .cancel, handler: nil)
                            alert.addAction(action)
                            self?.present(alert, animated: true, completion: nil)
                        }
                    }
                })
            }
        }
    }