Capture a Metal MTKView as a movie in realtime?

What's the most efficient way to capture frames from an MTKView? If possible, I'd like to save the frames to a .mov file in real time. Is it possible to render into an AVPlayer frame or something like that?

I'm currently drawing with this code (based on @warrenm's PerformanceShaders project):

    func draw(in view: MTKView) {
        _ = inflightSemaphore.wait(timeout: DispatchTime.distantFuture)

        updateBuffers()

        let commandBuffer = commandQueue.makeCommandBuffer()

        commandBuffer.addCompletedHandler { [weak self] commandBuffer in
            if let strongSelf = self {
                strongSelf.inflightSemaphore.signal()
            }
        }

        // Dispatch the current kernel to perform the selected image filter
        selectedKernel.encode(commandBuffer: commandBuffer,
                              sourceTexture: kernelSourceTexture!,
                              destinationTexture: kernelDestTexture!)

        if let renderPassDescriptor = view.currentRenderPassDescriptor, let currentDrawable = view.currentDrawable {
            let clearColor = MTLClearColor(red: 0, green: 0, blue: 0, alpha: 1)
            renderPassDescriptor.colorAttachments[0].clearColor = clearColor

            let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor)
            renderEncoder.label = "Main pass"

            renderEncoder.pushDebugGroup("Draw textured square")
            renderEncoder.setFrontFacing(.counterClockwise)
            renderEncoder.setCullMode(.back)

            renderEncoder.setRenderPipelineState(pipelineState)

            renderEncoder.setVertexBuffer(vertexBuffer, offset: MBEVertexDataSize * bufferIndex, at: 0)
            renderEncoder.setVertexBuffer(uniformBuffer, offset: MBEUniformDataSize * bufferIndex, at: 1)

            renderEncoder.setFragmentTexture(kernelDestTexture, at: 0)
            renderEncoder.setFragmentSamplerState(sampler, at: 0)

            renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)

            renderEncoder.popDebugGroup()
            renderEncoder.endEncoding()

            commandBuffer.present(currentDrawable)
        }

        bufferIndex = (bufferIndex + 1) % MBEMaxInflightBuffers
        commandBuffer.commit()
    }

Here's a small class that performs the essential functions of writing out a movie file that captures the contents of a Metal view:

    class MetalVideoRecorder {
        var isRecording = false
        var recordingStartTime = TimeInterval(0)

        private var assetWriter: AVAssetWriter
        private var assetWriterVideoInput: AVAssetWriterInput
        private var assetWriterPixelBufferInput: AVAssetWriterInputPixelBufferAdaptor

        init?(outputURL url: URL, size: CGSize) {
            do {
                assetWriter = try AVAssetWriter(outputURL: url, fileType: AVFileTypeAppleM4V)
            } catch {
                return nil
            }

            let outputSettings: [String: Any] = [ AVVideoCodecKey : AVVideoCodecH264,
                                                  AVVideoWidthKey : size.width,
                                                  AVVideoHeightKey : size.height ]

            assetWriterVideoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)
            assetWriterVideoInput.expectsMediaDataInRealTime = true

            let sourcePixelBufferAttributes: [String: Any] = [
                kCVPixelBufferPixelFormatTypeKey as String : kCVPixelFormatType_32BGRA,
                kCVPixelBufferWidthKey as String : size.width,
                kCVPixelBufferHeightKey as String : size.height ]

            assetWriterPixelBufferInput = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: assetWriterVideoInput,
                                                                               sourcePixelBufferAttributes: sourcePixelBufferAttributes)

            assetWriter.add(assetWriterVideoInput)
        }

        func startRecording() {
            assetWriter.startWriting()
            assetWriter.startSession(atSourceTime: kCMTimeZero)

            recordingStartTime = CACurrentMediaTime()
            isRecording = true
        }

        func endRecording(_ completionHandler: @escaping () -> ()) {
            isRecording = false

            assetWriterVideoInput.markAsFinished()
            assetWriter.finishWriting(completionHandler: completionHandler)
        }

        func writeFrame(forTexture texture: MTLTexture) {
            if !isRecording {
                return
            }

            while !assetWriterVideoInput.isReadyForMoreMediaData {}

            guard let pixelBufferPool = assetWriterPixelBufferInput.pixelBufferPool else {
                print("Pixel buffer asset writer input did not have a pixel buffer pool available; cannot retrieve frame")
                return
            }

            var maybePixelBuffer: CVPixelBuffer? = nil
            let status = CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &maybePixelBuffer)
            if status != kCVReturnSuccess {
                print("Could not get pixel buffer from asset writer input; dropping frame...")
                return
            }

            guard let pixelBuffer = maybePixelBuffer else { return }

            CVPixelBufferLockBaseAddress(pixelBuffer, [])
            let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer)!

            // Use the bytes per row value from the pixel buffer since its stride may be rounded up to be 16-byte aligned
            let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
            let region = MTLRegionMake2D(0, 0, texture.width, texture.height)

            texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

            let frameTime = CACurrentMediaTime() - recordingStartTime
            let presentationTime = CMTimeMakeWithSeconds(frameTime, 240)
            assetWriterPixelBufferInput.append(pixelBuffer, withPresentationTime: presentationTime)

            CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
        }
    }

After initializing one of these and calling startRecording(), you can add a scheduled handler to the command buffer containing your rendering commands and call writeFrame (after you end encoding, but before presenting the drawable or committing the buffer):

    let texture = currentDrawable.texture
    commandBuffer.addScheduledHandler { commandBuffer in
        self.recorder.writeFrame(forTexture: texture)
    }

When you're done recording, just call endRecording, and the video file will be finalized and closed.
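Putting it together, the recorder's lifecycle looks something like the sketch below. The output path and frame size are placeholders; pick whatever fits your app:

```swift
import AVFoundation

// Illustrative lifecycle for the recorder class above.
let outputURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("capture.m4v")
// AVAssetWriter refuses to overwrite an existing file, so remove any old one
try? FileManager.default.removeItem(at: outputURL)

let recorder = MetalVideoRecorder(outputURL: outputURL,
                                  size: CGSize(width: 1280, height: 720))!
recorder.startRecording()

// ... render frames, calling recorder.writeFrame(forTexture:)
//     from the command buffer's scheduled handler each frame ...

recorder.endRecording {
    print("Movie finished writing to \(outputURL.path)")
}
```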

Caveats

This class assumes the source texture is in the default format, .bgra8Unorm. If it isn't, you'll get crashes or corruption. If necessary, convert the texture with a compute or fragment shader, or use Accelerate.
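One convenient way to do that conversion on the GPU is MPSImageConversion from Metal Performance Shaders. The helper below is a hypothetical sketch (the function name and alpha/color-conversion parameters are assumptions for a simple opaque swizzle), not part of the recorder class:

```swift
import Metal
import MetalPerformanceShaders

// Hypothetical helper: produces a .bgra8Unorm copy of a texture in some other
// color format, so it matches what the recorder's pixel buffers expect.
func makeBGRACopy(of source: MTLTexture,
                  device: MTLDevice,
                  commandBuffer: MTLCommandBuffer) -> MTLTexture? {
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .bgra8Unorm,
        width: source.width,
        height: source.height,
        mipmapped: false)
    descriptor.usage = [.shaderRead, .shaderWrite]
    guard let destination = device.makeTexture(descriptor: descriptor) else { return nil }

    // MPSImageConversion performs the per-pixel format conversion on the GPU
    let conversion = MPSImageConversion(device: device,
                                        srcAlpha: .alphaIsOne,
                                        destAlpha: .alphaIsOne,
                                        backgroundColor: nil,
                                        conversionInfo: nil)
    conversion.encode(commandBuffer: commandBuffer,
                      sourceTexture: source,
                      destinationTexture: destination)
    return destination
}
```

Encode this into the same command buffer before the scheduled handler runs writeFrame, and pass the converted texture to the recorder instead of the drawable's texture.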

This class also assumes that the texture is the same size as the video frame. If that isn't the case (if the drawable size changes, or the screen auto-rotates), the output will be corrupted and you may see crashes. Mitigate this by scaling or cropping the source texture as your application requires.
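For the scaling case, MPSImageLanczosScale is one option. This is a sketch under the assumption that rescaling to the recorder's fixed size is acceptable for your app; the function name is illustrative:

```swift
import Metal
import MetalPerformanceShaders

// Hypothetical sketch: rescales the drawable's texture to the fixed size the
// recorder was configured with, so drawable-size changes don't corrupt output.
func makeScaledCopy(of source: MTLTexture,
                    width: Int, height: Int,
                    device: MTLDevice,
                    commandBuffer: MTLCommandBuffer) -> MTLTexture? {
    let descriptor = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: source.pixelFormat,
        width: width,
        height: height,
        mipmapped: false)
    descriptor.usage = [.shaderRead, .shaderWrite]
    guard let destination = device.makeTexture(descriptor: descriptor) else { return nil }

    // Lanczos resampling preserves detail better than bilinear when downscaling
    let scaler = MPSImageLanczosScale(device: device)
    scaler.encode(commandBuffer: commandBuffer,
                  sourceTexture: source,
                  destinationTexture: destination)
    return destination
}
```

Cropping instead of scaling could be done with an MTLBlitCommandEncoder copy of a subregion, if preserving the pixel aspect matters more than capturing the whole view.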