Video rotates after applying an AVVideoComposition

After applying an AVVideoComposition to my AVPlayerItem, the filter I apply works, but the video is rotated in the AVPlayerLayer.

I know the problem is not with the filtered frame, because if I display the frame in a UIImageView, it renders 100% correctly.

The video displays correctly until I apply a videoComposition. Setting videoGravity on the AVPlayerLayer doesn't help.

The video is rotated 90º clockwise and stretched in the layer.

Essentially, the video displays perfectly in the AVPlayerLayer until the AVMutableVideoComposition is applied. Once that happens, the video is rotated -90º and then scaled to the same dimensions it had before filtering. This suggests to me that it doesn't realize its transform is already correct, and so it re-applies the transform to itself.

Why does this happen, and how can I fix it?

Here is some code:

    private func filterVideo(with filter: Filter?) {
        if let player = player, let playerItem = player.currentItem {
            let composition = AVMutableComposition()
            let videoAssetTrack = playerItem.asset.tracks(withMediaType: .video).first
            let videoCompositionTrack = composition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)

            try? videoCompositionTrack?.insertTimeRange(CMTimeRange(start: kCMTimeZero, duration: playerItem.asset.duration), of: videoAssetTrack!, at: kCMTimeZero)
            videoCompositionTrack?.preferredTransform = videoAssetTrack!.preferredTransform

            let videoComposition = AVMutableVideoComposition(asset: composition, applyingCIFiltersWithHandler: { (request) in
                let filteredImage = <...>
                request.finish(with: filteredImage, context: nil)
            })

            playerItem.videoComposition = videoComposition
        }
    }

The problem is in the rendering of your AVVideoComposition; you should apply a transform (i.e. a rotate and translate transform) to the AVMutableVideoCompositionInstruction.

I did this in Objective-C, so I'm posting my code; you can convert the syntax to Swift.

Objective-C:

    //------------------------------------
    // FIXING ORIENTATION
    //------------------------------------
    AVMutableVideoCompositionInstruction *MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeAdd(firstAsset.duration, secondAsset.duration));

    AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack]; // second
    AVAssetTrack *FirstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation FirstAssetOrientation_ = UIImageOrientationUp;
    BOOL isFirstAssetPortrait_ = NO;
    CGAffineTransform firstTransform = FirstAssetTrack.preferredTransform;
    if (firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0) {
        FirstAssetOrientation_ = UIImageOrientationRight;
        isFirstAssetPortrait_ = YES;
    }
    if (firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0) {
        FirstAssetOrientation_ = UIImageOrientationLeft;
        isFirstAssetPortrait_ = YES;
    }
    if (firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0) {
        FirstAssetOrientation_ = UIImageOrientationUp;
    }
    if (firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0) {
        FirstAssetOrientation_ = UIImageOrientationDown;
    }
    CGFloat FirstAssetScaleToFitRatio = 320.0 / FirstAssetTrack.naturalSize.width;
    if (isFirstAssetPortrait_) {
        FirstAssetScaleToFitRatio = 320.0 / FirstAssetTrack.naturalSize.height;
        CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio, FirstAssetScaleToFitRatio);
        [FirstlayerInstruction setTransform:CGAffineTransformConcat(FirstAssetTrack.preferredTransform, FirstAssetScaleFactor) atTime:kCMTimeZero];
    } else {
        CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio, FirstAssetScaleToFitRatio);
        [FirstlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(FirstAssetTrack.preferredTransform, FirstAssetScaleFactor), CGAffineTransformMakeTranslation(0, 160)) atTime:kCMTimeZero];
    }
    [FirstlayerInstruction setOpacity:0.0 atTime:firstAsset.duration];

    AVMutableVideoCompositionLayerInstruction *SecondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
    AVAssetTrack *SecondAssetTrack = [[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation SecondAssetOrientation_ = UIImageOrientationUp;
    BOOL isSecondAssetPortrait_ = NO;
    CGAffineTransform secondTransform = SecondAssetTrack.preferredTransform;
    if (secondTransform.a == 0 && secondTransform.b == 1.0 && secondTransform.c == -1.0 && secondTransform.d == 0) {
        SecondAssetOrientation_ = UIImageOrientationRight;
        isSecondAssetPortrait_ = YES;
    }
    if (secondTransform.a == 0 && secondTransform.b == -1.0 && secondTransform.c == 1.0 && secondTransform.d == 0) {
        SecondAssetOrientation_ = UIImageOrientationLeft;
        isSecondAssetPortrait_ = YES;
    }
    if (secondTransform.a == 1.0 && secondTransform.b == 0 && secondTransform.c == 0 && secondTransform.d == 1.0) {
        SecondAssetOrientation_ = UIImageOrientationUp;
    }
    if (secondTransform.a == -1.0 && secondTransform.b == 0 && secondTransform.c == 0 && secondTransform.d == -1.0) {
        SecondAssetOrientation_ = UIImageOrientationDown;
    }
    CGFloat SecondAssetScaleToFitRatio = 320.0 / SecondAssetTrack.naturalSize.width;
    if (isSecondAssetPortrait_) {
        SecondAssetScaleToFitRatio = 320.0 / SecondAssetTrack.naturalSize.height;
        CGAffineTransform SecondAssetScaleFactor = CGAffineTransformMakeScale(SecondAssetScaleToFitRatio, SecondAssetScaleToFitRatio);
        [SecondlayerInstruction setTransform:CGAffineTransformConcat(SecondAssetTrack.preferredTransform, SecondAssetScaleFactor) atTime:firstAsset.duration];
    } else {
        CGAffineTransform SecondAssetScaleFactor = CGAffineTransformMakeScale(SecondAssetScaleToFitRatio, SecondAssetScaleToFitRatio);
        [SecondlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(SecondAssetTrack.preferredTransform, SecondAssetScaleFactor), CGAffineTransformMakeTranslation(0, 160)) atTime:secondAsset.duration];
    }
    MainInstruction.layerInstructions = [NSArray arrayWithObjects:SecondlayerInstruction, nil];

    AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
    MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
    MainCompositionInst.frameDuration = CMTimeMake(1, 30);
    MainCompositionInst.renderSize = CGSizeMake(320.0, 480.0);
    // Now you have an orientation-fixed instruction layer.
    // Add this composition to your video. 😀

    // If you want to export the video, you can do it like below:
    NSString *documentsDirectory = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"final_merged_video-%d.mp4", arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];

    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset640x480];
    exporter.outputURL = url;
    exporter.videoComposition = MainCompositionInst;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [[AppDelegate Getdelegate] hideIndicator];
            [self exportDidFinish:exporter];
        });
    }];

For Swift, see this answer: click here

Additionally, you can try rotating the video layer itself by applying a rotation transform to it.

    #define degreeToRadian(x) (M_PI * x / 180.0)

    [_playerLayer setAffineTransform:CGAffineTransformMakeRotation(degreeToRadian(degree))];

Instead of assuming that the image will be filtered, first check whether filteredImage is nil. If it is not, call request.finish(with: filteredImage, context: nil).

However, if it is nil, you must call request.finish(with: someError).

This is according to the documentation.
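A minimal sketch of that handler, following the success/failure contract described above (here `applyFilter(to:)` is a hypothetical helper that may return nil, and `FilterError` is an assumed error type; neither is part of AVFoundation):

    let videoComposition = AVMutableVideoComposition(asset: composition, applyingCIFiltersWithHandler: { request in
        // applyFilter(to:) is a hypothetical helper that may fail and return nil.
        if let filteredImage = applyFilter(to: request.sourceImage) {
            // Non-nil result: hand the filtered frame back to the composition.
            request.finish(with: filteredImage, context: nil)
        } else {
            // Nil result: per the documentation, finish the request with an error instead.
            request.finish(with: FilterError.filteringFailed)
        }
    })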

If you are trying to play an AVMutableComposition, you should set the AVAssetTrack's preferredTransform on the AVMutableCompositionTrack's preferredTransform.

    let asset = AVAsset(url: url!)
    let composition = AVMutableComposition()
    let compositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let videoTrack = asset.tracks(withMediaType: AVMediaTypeVideo).first

    try? compositionTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, asset.duration), of: videoTrack!, at: kCMTimeZero)
    compositionTrack.preferredTransform = (videoTrack?.preferredTransform)!

    let playerItem = AVPlayerItem(asset: composition)
    let filter = CIFilter(name: "CIColorInvert")
    playerItem.videoComposition = AVVideoComposition(asset: composition, applyingCIFiltersWithHandler: { (request: AVAsynchronousCIImageFilteringRequest) in
        filter?.setValue(request.sourceImage, forKey: kCIInputImageKey)
        request.finish(with: (filter?.outputImage)!, context: nil)
    })

    // ... the rest of the code

Try the code below; this worked for me:

    // Grab the source track from AVURLAsset, for example.
    let assetV = YourAVASSET.tracks(withMediaType: AVMediaTypeVideo).last

    // Grab the composition video track from the AVMutableComposition you already made.
    let compositionV = YourCompostion.tracks(withMediaType: AVMediaTypeVideo).last

    // Apply the original transform.
    if (assetV != nil) && (compositionV != nil) {
        compositionV?.preferredTransform = (assetV?.preferredTransform)!
    }

Then continue exporting your video….
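That export step might look like this sketch (assuming `YourCompostion` is the composition from above and that a temporary output URL is acceptable; adjust the preset and file type for your app):

    // A sketch, assuming `YourCompostion` is the AVMutableComposition from above.
    let outputURL = FileManager.default.temporaryDirectory.appendingPathComponent("fixed_video.mov")

    if let exporter = AVAssetExportSession(asset: YourCompostion, presetName: AVAssetExportPresetHighestQuality) {
        exporter.outputURL = outputURL
        exporter.outputFileType = AVFileTypeQuickTimeMovie
        exporter.shouldOptimizeForNetworkUse = true
        exporter.exportAsynchronously {
            // Check exporter.status and exporter.error on completion.
        }
    }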