Custom AVVideoCompositing class not working as expected

I am trying to apply a CIFilter to an AVAsset and then save the video with the filter applied. The way I am doing this is by setting the videoComposition of an AVAssetExportSession to an AVMutableVideoComposition object with a custom AVVideoCompositing class.

I am also setting the instructions of the AVMutableVideoComposition object to a custom composition instruction class (subclassing AVMutableVideoCompositionInstruction). This class is passed a track ID, along with a few other unimportant variables.

Unfortunately, I have run into a problem: the startVideoCompositionRequest: function in my custom video compositor class (conforming to AVVideoCompositing) is not being called correctly.

When I set the passthroughTrackID variable of my custom instruction class to the track ID, the startVideoCompositionRequest(request) function is not called.

However, when I do not set the passthroughTrackID variable of my custom instruction class, startVideoCompositionRequest(request) is called, but not correctly: printing request.sourceTrackIDs yields an empty array, and request.sourceFrameByTrackID(trackID) returns nil.

Interestingly, I have found that the cancelAllPendingVideoCompositionRequests: function is always called twice when attempting to export the video with the filter. It is either called once before startVideoCompositionRequest: and once after, or twice in a row when startVideoCompositionRequest: is not called at all.

I have created three classes for exporting the video with the filter. Here is the utility class, which basically just contains an export function and calls all the required code:

```swift
class VideoFilterExport{
    let asset: AVAsset
    init(asset: AVAsset){
        self.asset = asset
    }

    func export(toURL url: NSURL, callback: (url: NSURL?) -> Void){
        guard let track: AVAssetTrack = self.asset.tracksWithMediaType(AVMediaTypeVideo).first else{callback(url: nil); return}

        let composition = AVMutableComposition()
        let compositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        do{
            try compositionTrack.insertTimeRange(track.timeRange, ofTrack: track, atTime: kCMTimeZero)
        } catch _{callback(url: nil); return}

        let videoComposition = AVMutableVideoComposition(propertiesOfAsset: composition)
        videoComposition.customVideoCompositorClass = VideoFilterCompositor.self
        videoComposition.frameDuration = CMTimeMake(1, 30)
        videoComposition.renderSize = compositionTrack.naturalSize

        let instruction = VideoFilterCompositionInstruction(trackID: compositionTrack.trackID)
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.asset.duration)
        videoComposition.instructions = [instruction]

        let session: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetMediumQuality)!
        session.videoComposition = videoComposition
        session.outputURL = url
        session.outputFileType = AVFileTypeMPEG4

        session.exportAsynchronouslyWithCompletionHandler(){
            callback(url: url)
        }
    }
}
```

Here are the other two classes. I have put them into one code block to shorten this post:

```swift
// Video Filter Composition Instruction Class - from what I gather,
// AVVideoCompositionInstruction is used only to pass values to
// the AVVideoCompositing class
class VideoFilterCompositionInstruction : AVMutableVideoCompositionInstruction{
    let trackID: CMPersistentTrackID
    let filters: ImageFilterGroup
    let context: CIContext

    // When I leave this line as-is, startVideoCompositionRequest: isn't called.
    // When commented out, startVideoCompositionRequest(request) is called, but there
    // are no valid CVPixelBuffers provided by request.sourceFrameByTrackID(below value)
    override var passthroughTrackID: CMPersistentTrackID{get{return self.trackID}}

    override var requiredSourceTrackIDs: [NSValue]{get{return []}}
    override var containsTweening: Bool{get{return false}}

    init(trackID: CMPersistentTrackID, filters: ImageFilterGroup, context: CIContext){
        self.trackID = trackID
        self.filters = filters
        self.context = context

        super.init()

        //self.timeRange = timeRange
        self.enablePostProcessing = true
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}

// My custom AVVideoCompositing class. This is where the problem lies -
// although I don't know if this is the root of the problem
class VideoFilterCompositor : NSObject, AVVideoCompositing{
    var requiredPixelBufferAttributesForRenderContext: [String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32BGRA), // The video is in 32 BGRA
        kCVPixelBufferOpenGLESCompatibilityKey as String : NSNumber(bool: true),
        kCVPixelBufferOpenGLCompatibilityKey as String : NSNumber(bool: true)
    ]
    var sourcePixelBufferAttributes: [String : AnyObject]? = [
        kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32BGRA),
        kCVPixelBufferOpenGLESCompatibilityKey as String : NSNumber(bool: true),
        kCVPixelBufferOpenGLCompatibilityKey as String : NSNumber(bool: true)
    ]

    let renderQueue = dispatch_queue_create("co.getblix.videofiltercompositor.renderingqueue", DISPATCH_QUEUE_SERIAL)

    override init(){
        super.init()
    }

    func startVideoCompositionRequest(request: AVAsynchronousVideoCompositionRequest){
        // This code block is never executed when the
        // passthroughTrackID variable is in the above class
        autoreleasepool(){
            dispatch_async(self.renderQueue){
                guard let instruction = request.videoCompositionInstruction as? VideoFilterCompositionInstruction else{
                    request.finishWithError(NSError(domain: "getblix.co", code: 760, userInfo: nil))
                    return
                }
                guard let pixels = request.sourceFrameByTrackID(instruction.passthroughTrackID) else{
                    // This code block is executed when I comment out the
                    // passthroughTrackID variable in the above class
                    request.finishWithError(NSError(domain: "getblix.co", code: 761, userInfo: nil))
                    return
                }
                // I have not been able to get the code to reach this point
                // This function is either not called, or the guard
                // statement above executes
                let image = CIImage(CVPixelBuffer: pixels)
                let filtered: CIImage = image //apply the filter here

                let width = CVPixelBufferGetWidth(pixels)
                let height = CVPixelBufferGetHeight(pixels)
                let format = CVPixelBufferGetPixelFormatType(pixels)

                var newBuffer: CVPixelBuffer?
                CVPixelBufferCreate(kCFAllocatorDefault, width, height, format, nil, &newBuffer)

                if let buffer = newBuffer{
                    instruction.context.render(filtered, toCVPixelBuffer: buffer)
                    request.finishWithComposedVideoFrame(buffer)
                } else{
                    request.finishWithComposedVideoFrame(pixels)
                }
            }
        }
    }

    func renderContextChanged(newRenderContext: AVVideoCompositionRenderContext){
        // I don't have any code in this block
    }

    // This is interesting - this is called twice,
    // Once before startVideoCompositionRequest is called,
    // And once after. In the case when startVideoCompositionRequest
    // Is not called, this is simply called twice in a row
    func cancelAllPendingVideoCompositionRequests(){
        dispatch_barrier_async(self.renderQueue){
            print("Cancelled")
        }
    }
}
```
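For reference, the elided "apply the filter here" step might be filled in along these lines. This is only a minimal sketch in the same Swift 2-era syntax as the question, using `CISepiaTone` as a hypothetical stand-in for whatever the asker's `ImageFilterGroup` applies:

```swift
// Hypothetical filtering step (Swift 2 syntax, matching the question).
// CISepiaTone is a stand-in; any CIFilter chain would follow the same shape.
let filter = CIFilter(name: "CISepiaTone")!
filter.setValue(image, forKey: kCIInputImageKey)
filter.setValue(0.8, forKey: kCIInputIntensityKey)

// Fall back to the unfiltered frame if the filter produces no output
let filtered: CIImage = filter.outputImage ?? image
```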

I have been looking at Apple's AVCustomEdit sample project for guidance on this, but I cannot seem to find why this is happening.

How can I get the request.sourceFrameByTrackID: function to be called correctly, and provide a valid CVPixelBuffer for each frame?

It turns out that the requiredSourceTrackIDs variable in the custom AVVideoCompositionInstruction class (VideoFilterCompositionInstruction) has to be set to an array containing the track ID:

```swift
override var requiredSourceTrackIDs: [NSValue]{
    get{
        return [
            NSNumber(value: Int(self.trackID))
        ]
    }
}
```

So the final custom composition instruction class is:

```swift
class VideoFilterCompositionInstruction : AVMutableVideoCompositionInstruction{
    let trackID: CMPersistentTrackID
    let filters: [CIFilter]
    let context: CIContext

    override var passthroughTrackID: CMPersistentTrackID{get{return self.trackID}}
    override var requiredSourceTrackIDs: [NSValue]{get{return [NSNumber(value: Int(self.trackID))]}}
    override var containsTweening: Bool{get{return false}}

    init(trackID: CMPersistentTrackID, filters: [CIFilter], context: CIContext){
        self.trackID = trackID
        self.filters = filters
        self.context = context

        super.init()

        self.enablePostProcessing = true
    }

    required init?(coder aDecoder: NSCoder){
        fatalError("init(coder:) has not been implemented")
    }
}
```

All of the code for this utility is also on GitHub.

As you have found, having passthroughTrackID return the track you want to filter is not the right approach. You need to return the track to be filtered from requiredSourceTrackIDs instead. (And it looks like once you do that, it does not matter whether passthroughTrackID also returns it.) To answer the remaining question of why it works this way…

The docs for passthroughTrackID and requiredSourceTrackIDs certainly are not Apple's clearest writing ever. (File a bug about it and they might improve.) But if you look closely at the former's description, there is a hint (emphasis added)…

> If, for the duration of the instruction, the video composition result is one of the source frames, this property returns the corresponding track ID. *The compositor will not be run for the duration of the instruction*, and the appropriate source frame will be used instead.

So, you use passthroughTrackID only when making an instruction class that passes a single track through without processing.
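For contrast, an instruction intended to genuinely pass one track through untouched might look something like this. This is a minimal sketch in the question's Swift 2 style, with a hypothetical class name; because a passthrough track ID is returned, AVFoundation skips the custom compositor for the instruction's time range:

```swift
// Hypothetical instruction that passes a single track straight through.
// The custom AVVideoCompositing class is never invoked for this time range;
// AVFoundation uses the source frames directly.
class PassthroughInstruction: AVMutableVideoCompositionInstruction {
    let trackID: CMPersistentTrackID

    // Returning a valid track ID here signals "no processing needed".
    override var passthroughTrackID: CMPersistentTrackID { get { return self.trackID } }
    // No source frames need to be delivered to a compositor, so this stays empty.
    override var requiredSourceTrackIDs: [NSValue] { get { return [] } }
    override var containsTweening: Bool { get { return false } }

    init(trackID: CMPersistentTrackID) {
        self.trackID = trackID
        super.init()
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
```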

If you intend to perform any image processing, even if it is just on a single track with no compositing, specify that track in requiredSourceTrackIDs instead.