Overlaying two videos with AVFoundation

I'm trying to overlay two videos, with the foreground video being somewhat transparent. I've been following the Apple docs as well as this tutorial.

Whenever I feed my code two copies of the same video it doesn't crash; however, when I try feeding it two different videos, I get this error:

    VideoMaskingUtils.exportVideo Error: Optional(Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo={NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.})
    VideoMaskingUtils.exportVideo Description: <AVAssetExportSession: 0x1556be30, asset = <AVMutableComposition: 0x15567f10 tracks = ( "", "" )>, presetName = AVAssetExportPresetHighestQuality, outputFileType = public.mpeg-4
    Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo={NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}

I know you can't save a video with an alpha channel on iOS – I want to flatten the two videos into one opaque video.

It crashes when I try to overlap the two videos and apply a PiP style with CATransforms; simply overlapping them (with no alpha or other effects applied) works. Any help is appreciated.

Here's my code (with both approaches included):

    class func overlay(video firstAsset: AVURLAsset, withSecondVideo secondAsset: AVURLAsset, andAlpha alpha: Float) {
        let mixComposition = AVMutableComposition()
        let firstTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        let secondTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)

        guard let firstMediaTrack = firstAsset.tracksWithMediaType(AVMediaTypeVideo).first else { return }
        guard let secondMediaTrack = secondAsset.tracksWithMediaType(AVMediaTypeVideo).first else { return }

        do {
            try firstTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, firstAsset.duration), ofTrack: firstMediaTrack, atTime: kCMTimeZero)
            try secondTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, secondAsset.duration), ofTrack: secondMediaTrack, atTime: kCMTimeZero)
        } catch (let error) {
            print(error)
        }

        let width = max(firstMediaTrack.naturalSize.width, secondMediaTrack.naturalSize.width)
        let height = max(firstMediaTrack.naturalSize.height, secondMediaTrack.naturalSize.height)

        let videoComposition = AVMutableVideoComposition()
        videoComposition.renderSize = CGSizeMake(width, height)
        videoComposition.frameDuration = firstMediaTrack.minFrameDuration

        let firstApproach = false
        if firstApproach {
            let mainInstruction = AVMutableVideoCompositionInstruction()
            mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
            mainInstruction.backgroundColor = UIColor.redColor().CGColor

            let firstlayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: firstTrack)
            firstlayerInstruction.setTransform(firstAsset.preferredTransform, atTime: kCMTimeZero)

            let secondInstruction = AVMutableVideoCompositionInstruction()
            secondInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
            let backgroundColor = UIColor(colorLiteralRed: 1.0, green: 1.0, blue: 1.0, alpha: alpha)
            secondInstruction.backgroundColor = backgroundColor.CGColor

            let secondlayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: secondTrack)
            secondlayerInstruction.setTransform(secondAsset.preferredTransform, atTime: kCMTimeZero)

            secondInstruction.layerInstructions = [secondlayerInstruction]
            mainInstruction.layerInstructions = [firstlayerInstruction]//, secondlayerInstruction]

            videoComposition.instructions = [mainInstruction, secondInstruction]
        } else {
            let firstLayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: firstMediaTrack)
            firstLayerInstruction.setTransform(firstMediaTrack.preferredTransform, atTime: kCMTimeZero)
            firstLayerInstruction.setOpacity(1.0, atTime: kCMTimeZero)

            let secondlayerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: secondMediaTrack)
            secondlayerInstruction.setTransform(secondMediaTrack.preferredTransform, atTime: kCMTimeZero)
            secondlayerInstruction.setOpacity(alpha, atTime: kCMTimeZero)

            let instruction = AVMutableVideoCompositionInstruction()
            instruction.timeRange = CMTimeRangeMake(kCMTimeZero, min(firstAsset.duration, secondAsset.duration))
            instruction.layerInstructions = [firstLayerInstruction, secondlayerInstruction]

            videoComposition.instructions = [instruction]
        }

        let outputUrl = VideoMaskingUtils.getPathForTempFileNamed("output.mov")
        VideoMaskingUtils.exportCompositedVideo(mixComposition, toURL: outputUrl, withVideoComposition: videoComposition)
        VideoMaskingUtils.removeTempFileAtPath(outputUrl.absoluteString)
    }

Here's my exportCompositedVideo function.

    private class func exportCompositedVideo(compiledVideo: AVMutableComposition, toURL outputUrl: NSURL, withVideoComposition videoComposition: AVMutableVideoComposition) {
        guard let exporter = AVAssetExportSession(asset: compiledVideo, presetName: AVAssetExportPresetHighestQuality) else { return }
        exporter.outputURL = outputUrl
        exporter.videoComposition = videoComposition
        exporter.outputFileType = AVFileTypeQuickTimeMovie
        exporter.shouldOptimizeForNetworkUse = true
        exporter.exportAsynchronouslyWithCompletionHandler({
            switch exporter.status {
            case .Completed:
                // we can be confident that there is a URL because
                // we got this far. Otherwise it would've failed.
                UISaveVideoAtPathToSavedPhotosAlbum(exporter.outputURL!.path!, nil, nil, nil)
                print("VideoMaskingUtils.exportVideo SUCCESS!")
                if exporter.error != nil {
                    print("VideoMaskingUtils.exportVideo Error: \(exporter.error)")
                    print("VideoMaskingUtils.exportVideo Description: \(exporter.description)")
                }
                NSNotificationCenter.defaultCenter().postNotificationName("videoExportDone", object: exporter.error)
                break
            case .Exporting:
                let progress = exporter.progress
                print("VideoMaskingUtils.exportVideo \(progress)")
                NSNotificationCenter.defaultCenter().postNotificationName("videoExportProgress", object: progress)
                break
            case .Failed:
                print("VideoMaskingUtils.exportVideo Error: \(exporter.error)")
                print("VideoMaskingUtils.exportVideo Description: \(exporter.description)")
                NSNotificationCenter.defaultCenter().postNotificationName("videoExportDone", object: exporter.error)
                break
            default:
                break
            }
        })
    }

Your min should be max… With min, whenever the two assets have different durations, the tail of the longer track has no instruction covering it. AVFoundation requires the video composition's instructions to cover the composition's entire timeline with no gaps, and otherwise fails with error -11841 ("The video could not be composed").
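To see the gap concretely, here is a minimal sketch with made-up durations (5 s and 8 s, purely illustrative, using plain doubles instead of CMTime):

    // Illustrative durations only: first video 5 s, second video 8 s.
    let firstDuration = 5.0
    let secondDuration = 8.0

    // The composition's tracks run to the longer duration...
    let compositionEnd = max(firstDuration, secondDuration)   // 8.0

    // ...but an instruction built with `min` stops at the shorter one.
    let instructionEnd = min(firstDuration, secondDuration)   // 5.0

    // The interval [5 s, 8 s) is covered by no instruction, which is
    // what triggers -11841 ("The video could not be composed").
    let uncoveredSeconds = compositionEnd - instructionEnd
    print(uncoveredSeconds)   // 3.0

With two identical videos the durations match, min and max agree, and the timeline is fully covered, which is why that case never failed.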

Replace this line

 instruction.timeRange = CMTimeRangeMake(kCMTimeZero, min(firstAsset.duration, secondAsset.duration)) 

with this line and it will work:

 instruction.timeRange = CMTimeRangeMake(kCMTimeZero, max(firstAsset.duration, secondAsset.duration))