How to record and play back the captured video simultaneously with AVFoundation, delayed by a few seconds?

I'm trying to make my Swift iOS app record video and play it back on the same screen with a 30-second delay.

I've been using the official example to record the video. I then added a button that triggers playback of self.movieFileOutput?.outputFileURL with an AVPlayer in a separate view on the screen. It's close to what I want, but it obviously stops playing when it reaches the end of what has been written to disk so far, and doesn't resume when the next buffered chunk is written.
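For reference, the playback I wired up looks roughly like this (a minimal sketch; playbackView and the button action are placeholder names for my actual outlets):

    // Minimal sketch of my playback button. `playbackView` and
    // `movieFileOutput` are placeholders for my actual properties.
    @IBAction func playDelayed(_ sender: UIButton) {
        guard let url = movieFileOutput?.outputFileURL else { return }
        let player = AVPlayer(url: url)
        let layer = AVPlayerLayer(player: player)
        layer.frame = playbackView.bounds
        playbackView.layer.addSublayer(layer)
        player.play() // stops at the end of whatever has been written to disk so far
    }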

I could stop the recording every 30 seconds and save each file's URL so I can play it back, but that would interrupt both the capture and the playback.

How can I make the recording never stop and keep the playback always on screen, with whatever delay I choose?

I've seen similar questions, and all the answers point to the AVFoundation docs. I couldn't find out how to make AVFoundation write predictable chunks of video from memory to disk while recording.

You can achieve what you want by recording 30-second video chunks and then enqueuing them in an AVQueuePlayer for seamless playback. Recording video chunks would be very easy with AVCaptureFileOutput on macOS, but sadly, on iOS you can't create a new chunk without dropping frames, so you have to use the lower-level AVAssetWriter API instead:

    import UIKit
    import AVFoundation

    // TODO: delete old videos
    // TODO: audio

    class ViewController: UIViewController {
        // capture
        let captureSession = AVCaptureSession()

        // playback
        let player = AVQueuePlayer()
        var playerLayer: AVPlayerLayer! = nil

        // output. sadly not AVCaptureMovieFileOutput
        var assetWriter: AVAssetWriter! = nil
        var assetWriterInput: AVAssetWriterInput! = nil
        var chunkNumber = 0
        var chunkStartTime: CMTime! = nil
        var chunkOutputURL: URL! = nil

        override func viewDidLoad() {
            super.viewDidLoad()
            playerLayer = AVPlayerLayer(player: player)
            view.layer.addSublayer(playerLayer)

            // inputs
            let videoCaptureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
            let videoInput = try! AVCaptureDeviceInput(device: videoCaptureDevice)
            captureSession.addInput(videoInput)

            // outputs
            // iOS AVCaptureFileOutput/AVCaptureMovieFileOutput still don't support dynamically
            // switching files (?) so we have to re-implement with AVAssetWriter
            let videoOutput = AVCaptureVideoDataOutput()
            // TODO: probably something else
            videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
            captureSession.addOutput(videoOutput)

            captureSession.startRunning()
        }

        override func viewDidLayoutSubviews() {
            super.viewDidLayoutSubviews()
            playerLayer.frame = view.layer.bounds
        }

        func createWriterInput(for presentationTimeStamp: CMTime) {
            let fileManager = FileManager.default
            chunkOutputURL = fileManager.urls(for: .documentDirectory, in: .userDomainMask)[0]
                .appendingPathComponent("chunk\(chunkNumber).mov")
            try? fileManager.removeItem(at: chunkOutputURL)

            assetWriter = try! AVAssetWriter(outputURL: chunkOutputURL, fileType: AVFileTypeQuickTimeMovie)
            // TODO: get dimensions from image CMSampleBufferGetImageBuffer(sampleBuffer)
            let outputSettings: [String: Any] = [AVVideoCodecKey: AVVideoCodecH264,
                                                 AVVideoWidthKey: 1920,
                                                 AVVideoHeightKey: 1080]
            assetWriterInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: outputSettings)
            assetWriterInput.expectsMediaDataInRealTime = true
            assetWriter.add(assetWriterInput)

            chunkNumber += 1
            chunkStartTime = presentationTimeStamp

            assetWriter.startWriting()
            assetWriter.startSession(atSourceTime: chunkStartTime)
        }
    }

    extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {
        func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
            let presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
            if assetWriter == nil {
                createWriterInput(for: presentationTimeStamp)
            } else {
                let chunkDuration = CMTimeGetSeconds(CMTimeSubtract(presentationTimeStamp, chunkStartTime))
                if chunkDuration > 30 {
                    assetWriter.endSession(atSourceTime: presentationTimeStamp)

                    // make a copy, as finishWriting is asynchronous
                    let newChunkURL = chunkOutputURL!
                    let chunkAssetWriter = assetWriter!

                    chunkAssetWriter.finishWriting {
                        print("finishWriting says: \(chunkAssetWriter.status.rawValue) \(String(describing: chunkAssetWriter.error))")
                        print("queuing \(newChunkURL)")
                        self.player.insert(AVPlayerItem(url: newChunkURL), after: nil)
                        self.player.play()
                    }
                    createWriterInput(for: presentationTimeStamp)
                }
            }

            if !assetWriterInput.append(sampleBuffer) {
                print("append says NO: \(assetWriter.status.rawValue) \(String(describing: assetWriter.error))")
            }
        }
    }
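For the // TODO: delete old videos above, one option (a sketch of my own, not part of the tested code above) is to delete each chunk's file once its player item has finished playing, by observing AVPlayerItemDidPlayToEndTime, e.g. from viewDidLoad:

    // Sketch: remove each chunk file once it has finished playing.
    // Assumes each chunk is only ever needed once.
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemDidPlayToEndTime,
        object: nil,
        queue: .main
    ) { notification in
        guard let item = notification.object as? AVPlayerItem,
              let urlAsset = item.asset as? AVURLAsset else { return }
        try? FileManager.default.removeItem(at: urlAsset.url)
    }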

p.s. It would be fun to see what you were doing 30 seconds ago. What exactly are you building?