Converting a CMSampleBuffer to a UIImage

Here is a function that converts a CMSampleBuffer to a UIImage (the code comes from Apple's documentation):

    func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        var imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0)

        // Get the base address of the pixel buffer
        var baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
        // Get the number of bytes per row for the pixel buffer
        var bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
        // Get the pixel buffer width and height
        var width = CVPixelBufferGetWidth(imageBuffer)
        var height = CVPixelBufferGetHeight(imageBuffer)

        // Create a device-dependent RGB color space
        var colorSpace = CGColorSpaceCreateDeviceRGB()

        // Create a bitmap graphics context with the sample buffer data
        let bitmapInfo = CGBitmapInfo(CGImageAlphaInfo.NoneSkipLast.rawValue)
        var context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, bitmapInfo)
        // Create a Quartz image from the pixel data in the bitmap graphics context
        var quartzImage = CGBitmapContextCreateImage(context)
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0)

        // Create an image object from the Quartz image
        var image = UIImage(CGImage: quartzImage)!

        return image
    }

When I try to display the resulting UIImage in a UIImageView, I get nothing at all.
Any ideas?

Here is a Swift 3.0 solution that extends CMSampleBuffer with a computed property giving you an optional UIImage:

    import AVFoundation
    import UIKit

    extension CMSampleBuffer {
        /// The sample buffer rendered as a UIImage, or nil if conversion fails.
        var uiImage: UIImage? {
            // Get the Core Video pixel buffer that backs this sample buffer
            guard let imageBuffer = CMSampleBufferGetImageBuffer(self) else { return nil }

            CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
            defer { CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0)) }

            let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
            let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
            let width = CVPixelBufferGetWidth(imageBuffer)
            let height = CVPixelBufferGetHeight(imageBuffer)

            let colorSpace = CGColorSpaceCreateDeviceRGB()
            // The camera delivers 32BGRA frames: skip alpha, little-endian 32-bit byte order
            let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue |
                                                    CGBitmapInfo.byteOrder32Little.rawValue)

            // Create a bitmap context that points directly at the pixel buffer's memory
            guard let context = CGContext(data: baseAddress,
                                          width: width,
                                          height: height,
                                          bitsPerComponent: 8,
                                          bytesPerRow: bytesPerRow,
                                          space: colorSpace,
                                          bitmapInfo: bitmapInfo.rawValue) else { return nil }
            guard let cgImage = context.makeImage() else { return nil }

            return UIImage(cgImage: cgImage)
        }
    }
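With the extension in place, converting a frame is a one-liner. A minimal usage sketch, assuming `sampleBuffer` is a CMSampleBuffer you already receive (for example from a capture or asset-reader callback):

    // Usage sketch: `sampleBuffer` is assumed to come from your own callback.
    if let image = sampleBuffer.uiImage {
        // `image` now holds the frame as a UIImage
        print("Converted frame of size \(image.size)")
    }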

I just implemented this exact feature in my current project; this is how I got it working (after a lot of Googling and some trial and error):

 let bitmapInfo = CGBitmapInfo(CGImageAlphaInfo.NoneSkipFirst.rawValue | CGBitmapInfo.ByteOrder32Little.rawValue) 

Also, make sure you update the UIImageView on the main thread (you are most likely receiving the CMSampleBuffer on the camera session's queue), because UIKit may only be used from the main thread. Otherwise you may wait a very long time before the image appears.
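As a sketch, assuming the standard AVCaptureVideoDataOutputSampleBufferDelegate callback and a hypothetical `imageView` property on your view controller, the hand-off to the main thread could look like this:

    // Sketch only: the Swift 4+ delegate signature is shown here
    // (in Swift 3 the method was captureOutput(_:didOutputSampleBuffer:from:)).
    // `imageView` is a hypothetical UIImageView owned by the view controller.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let image = sampleBuffer.uiImage else { return }
        DispatchQueue.main.async {
            // UIKit must only be touched on the main thread
            self.imageView.image = image
        }
    }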

@Zigglzworth You need to set kCVPixelFormatType_32BGRA on the capture video data output:

    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as AnyHashable: kCVPixelFormatType_32BGRA]
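For completeness, here is a hedged sketch of wiring that output into an existing capture session with a dedicated delegate queue; `captureSession` and the delegate `self` are placeholders for your own objects:

    // Sketch: attach the BGRA-configured output to an existing AVCaptureSession.
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    // Sample buffers are delivered on this background queue, not the main thread.
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
    if captureSession.canAddOutput(videoOutput) {
        captureSession.addOutput(videoOutput)
    }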