Most efficient/real-time way to get pixel values from the iOS camera in Swift

There are some discussions of similar questions, like this one, but they seem outdated, so I thought I'd ask here.

I'd like to get near-real-time RGB pixel values, or even better a full-image RGB histogram, from the camera in Swift 2.0. I'd like this to be as fast and as up-to-date as possible (ideally ~30 fps or better).

Can I get this directly from an AVCaptureVideoPreviewLayer, or do I need to capture each frame (asynchronously, I assume, if the processing takes significant time) and then extract the pixel values from a JPEG/PNG rendering?

Some example code, taken from jquave but modified for Swift 2.0:

    import UIKit
    import AVFoundation

    class ViewController: UIViewController {

        let captureSession = AVCaptureSession()
        var previewLayer: AVCaptureVideoPreviewLayer?
        var captureDevice: AVCaptureDevice?

        override func viewDidLoad() {
            super.viewDidLoad()
            // Do any additional setup after loading the view, typically from a nib.
            captureSession.sessionPreset = AVCaptureSessionPresetHigh

            let devices = AVCaptureDevice.devices()
            // Loop through all the capture devices on this phone
            for device in devices {
                // Make sure this particular device supports video
                if device.hasMediaType(AVMediaTypeVideo) {
                    // Finally check the position and confirm we've got the back camera
                    if device.position == AVCaptureDevicePosition.Back {
                        captureDevice = device as? AVCaptureDevice
                        if captureDevice != nil {
                            print("Capture device found")
                            beginSession()
                        }
                    }
                }
            }
        }

        func focusTo(value: Float) {
            if let device = captureDevice {
                do {
                    try device.lockForConfiguration()
                    device.setFocusModeLockedWithLensPosition(value, completionHandler: { (time) -> Void in
                    })
                    device.unlockForConfiguration()
                } catch {
                    // error message
                    print("Can't change focus of capture device")
                }
            }
        }

        func configureDevice() {
            if let device = captureDevice {
                do {
                    try device.lockForConfiguration()
                    device.focusMode = .Locked
                    device.unlockForConfiguration()
                } catch {
                    // error message etc.
                    print("Capture device not configurable")
                }
            }
        }

        func beginSession() {
            configureDevice()
            do {
                //try captureSession.addInput(input: captureDevice)
                try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice))
                updateDeviceSettings(0.0, isoValue: 0.0)
            } catch {
                // error message etc.
                print("Capture device not initialisable")
            }
            previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
            self.view.layer.addSublayer(previewLayer!)
            previewLayer?.frame = self.view.layer.frame
            captureSession.startRunning()
        }

        func updateDeviceSettings(focusValue: Float, isoValue: Float) {
            if let device = captureDevice {
                do {
                    try device.lockForConfiguration()
                    device.setFocusModeLockedWithLensPosition(focusValue, completionHandler: { (time) -> Void in
                    })
                    // Clamp the ISO between minISO and maxISO based on the active format
                    let minISO = device.activeFormat.minISO
                    let maxISO = device.activeFormat.maxISO
                    let clampedISO = isoValue * (maxISO - minISO) + minISO
                    device.setExposureModeCustomWithDuration(AVCaptureExposureDurationCurrent, ISO: clampedISO, completionHandler: { (time) -> Void in
                    })
                    device.unlockForConfiguration()
                } catch {
                    // error message etc.
                    print("Can't update device settings")
                }
            }
        }
    }

You don't want an AVCaptureVideoPreviewLayer – that's what you'd want if you wanted to display the video. Instead, you want a different output: AVCaptureVideoDataOutput:

https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVCaptureVideoDataOutput_Class/index.html#//apple_ref/occ/cl/AVCaptureVideoDataOutput

This gives you direct access to the sample buffers, from which you can then get into pixel space.
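A sketch of that wiring, in the Swift 2 style of the question's code (the delegate method and the CVPixelBuffer calls are the real AVFoundation/CoreVideo API; the extension on the question's ViewController, the queue label, and the BGRA choice are assumptions):

    import AVFoundation

    extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {

        func addVideoOutput() {
            let output = AVCaptureVideoDataOutput()
            // Ask for BGRA so each pixel is 4 easily indexed bytes.
            output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as NSString:
                                    Int(kCVPixelFormatType_32BGRA)]
            // Drop frames rather than queue them if processing falls behind.
            output.alwaysDiscardsLateVideoFrames = true
            output.setSampleBufferDelegate(self,
                queue: dispatch_queue_create("sample buffer", DISPATCH_QUEUE_SERIAL))
            if captureSession.canAddOutput(output) {
                captureSession.addOutput(output)
            }
        }

        // Called on the serial queue for every captured frame (~30 fps).
        func captureOutput(captureOutput: AVCaptureOutput!,
                           didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                           fromConnection connection: AVCaptureConnection!) {
            guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            CVPixelBufferLockBaseAddress(pixelBuffer, 0)
            let bytes = UnsafePointer<UInt8>(CVPixelBufferGetBaseAddress(pixelBuffer))
            let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
            // BGRA: pixel (x, y) starts at byte offset y * bytesPerRow + x * 4.
            let blue = bytes[0], green = bytes[1], red = bytes[2]
            // ... accumulate your histogram over the whole buffer here ...
            CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)
        }
    }

You'd call addVideoOutput() from beginSession() after adding the input. Note that bytesPerRow can be larger than width * 4 because of row padding, so index rows via bytesPerRow rather than assuming a packed buffer.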

One caveat: I don't know what the throughput is on current devices, but I couldn't get a live stream at the highest quality out of an iPhone 4S because the GPU/CPU pipeline was too slow.
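Once you have the BGRA bytes out of the pixel buffer, the full-image histogram the question asks for is a single pass over them. A minimal, platform-independent sketch in current Swift syntax (the function name is hypothetical; the BGRA byte order matches the kCVPixelFormatType_32BGRA setting above and assumes a packed buffer):

```swift
// Hypothetical helper: builds a 256-bin histogram per channel
// from a packed BGRA byte array.
func rgbHistogram(bgra: [UInt8]) -> (r: [Int], g: [Int], b: [Int]) {
    var r = [Int](repeating: 0, count: 256)
    var g = [Int](repeating: 0, count: 256)
    var b = [Int](repeating: 0, count: 256)
    var i = 0
    while i + 3 < bgra.count {
        b[Int(bgra[i])]     += 1   // blue
        g[Int(bgra[i + 1])] += 1   // green
        r[Int(bgra[i + 2])] += 1   // red
        i += 4                     // skip alpha
    }
    return (r, g, b)
}

// Two pixels in BGRA order: pure red, then pure green.
let pixels: [UInt8] = [0, 0, 255, 255,  0, 255, 0, 255]
let h = rgbHistogram(bgra: pixels)
print(h.r[255], h.g[255], h.b[0])  // 1 1 2
```

At ~30 fps on a full-resolution frame this loop is the hot path, so in a real app you'd likely run it with unsafe buffer pointers, or move it to Accelerate/the GPU, rather than copying into an array first.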