iOS AVFoundation: Setting the orientation of video

I have been struggling with the problem of video orientation during and after capture on an iOS device. Thanks to previous answers and Apple's documentation, I was able to figure it out. However, now that I want to push some video to a website, I am running into particular problems. I outlined that problem in this question, and the proposed solution requires setting orientation options during video encoding.

That may be, but I have no idea how to go about doing it. The documentation around setting orientation covers setting it correctly for display on the device, and I have implemented the advice found here. However, that advice does not address setting the orientation correctly for non-Apple software, such as VLC or the Chrome browser.

Can anyone provide insight into how to set the orientation correctly on the device so that it displays correctly in all viewing software?

Finally, based on the answers of @Aaron Vegh and @Prince, I came up with my solution:

//Convert the video

    + (void)convertMOVToMp4:(NSString *)movFilePath completion:(void (^)(NSString *mp4FilePath))block {
        AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:movFilePath] options:nil];
        AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableCompositionTrack *compositionAudioTrack =
            [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                     preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                       ofTrack:sourceAudioTrack
                                        atTime:kCMTimeZero
                                         error:nil];

        AVAssetExportSession *assetExport =
            [[AVAssetExportSession alloc] initWithAsset:composition
                                             presetName:AVAssetExportPresetMediumQuality];
        NSString *exportPath = [movFilePath stringByReplacingOccurrencesOfString:@".MOV" withString:@".mp4"];
        NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
        assetExport.outputFileType = AVFileTypeMPEG4;
        assetExport.outputURL = exportUrl;
        assetExport.shouldOptimizeForNetworkUse = YES;
        assetExport.videoComposition = [self getVideoComposition:videoAsset composition:composition];

        [assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
            switch (assetExport.status) {
                case AVAssetExportSessionStatusCompleted:
                    // export complete
                    if (block) {
                        block(exportPath);
                    }
                    break;
                case AVAssetExportSessionStatusFailed:
                    block(nil);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    block(nil);
                    break;
            }
        }];
    }
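For what it's worth, invoking it looks something like this (a minimal usage sketch; `VideoConverter` is a hypothetical name for whatever class hosts these class methods, and the input path is just a placeholder):

    // VideoConverter is a stand-in for the class that owns convertMOVToMp4:
    NSString *movPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"capture.MOV"];
    [VideoConverter convertMOVToMp4:movPath completion:^(NSString *mp4FilePath) {
        if (mp4FilePath) {
            NSLog(@"Exported mp4 to %@", mp4FilePath);
        } else {
            NSLog(@"Export failed or was cancelled");
        }
    }];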

//Get the current orientation

    + (AVMutableVideoComposition *)getVideoComposition:(AVAsset *)asset composition:(AVMutableComposition *)composition {
        BOOL isPortrait_ = [self isVideoPortrait:asset];

        AVMutableCompositionTrack *compositionVideoTrack =
            [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                     preferredTrackID:kCMPersistentTrackID_Invalid];
        AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                       ofTrack:videoTrack
                                        atTime:kCMTimeZero
                                         error:nil];

        AVMutableVideoCompositionLayerInstruction *layerInst =
            [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
        CGAffineTransform transform = videoTrack.preferredTransform;
        [layerInst setTransform:transform atTime:kCMTimeZero];

        AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
        inst.layerInstructions = [NSArray arrayWithObject:layerInst];

        AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
        videoComposition.instructions = [NSArray arrayWithObject:inst];

        CGSize videoSize = videoTrack.naturalSize;
        if (isPortrait_) {
            NSLog(@"video is portrait ");
            // Swap width and height so the render size matches the rotated frames
            videoSize = CGSizeMake(videoSize.height, videoSize.width);
        }
        videoComposition.renderSize = videoSize;
        videoComposition.frameDuration = CMTimeMake(1, 30);
        videoComposition.renderScale = 1.0;
        return videoComposition;
    }

//Determine the video orientation

    + (BOOL)isVideoPortrait:(AVAsset *)asset {
        BOOL isPortrait = FALSE;
        NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        if ([tracks count] > 0) {
            AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
            CGAffineTransform t = videoTrack.preferredTransform;
            // Portrait
            if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
                isPortrait = YES;
            }
            // PortraitUpsideDown
            if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
                isPortrait = YES;
            }
            // LandscapeRight
            if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
                isPortrait = FALSE;
            }
            // LandscapeLeft
            if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
                isPortrait = FALSE;
            }
        }
        return isPortrait;
    }

In Apple's documentation, it states:

Clients can now receive physically rotated CVPixelBuffers in their AVCaptureVideoDataOutput -captureOutput:didOutputSampleBuffer:fromConnection: delegate callback. In previous iOS versions, the front-facing camera would always deliver buffers in AVCaptureVideoOrientationLandscapeLeft and the back-facing camera would always deliver buffers in AVCaptureVideoOrientationLandscapeRight. All 4 AVCaptureVideoOrientations are supported, and rotation is hardware accelerated. To request buffer rotation, a client calls -setVideoOrientation: on the AVCaptureVideoDataOutput's video AVCaptureConnection. Note that physically rotating buffers does come with a performance cost, so only request rotation if necessary. If, for instance, you want rotated video written to a QuickTime movie file using AVAssetWriter, it is preferable to set the -transform property on the AVAssetWriterInput rather than physically rotate the buffers in AVCaptureVideoDataOutput.
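For context, the metadata approach the documentation recommends looks roughly like this (a sketch on my part, assuming a 90° rotation and illustrative 640x480 output settings, not code from the original post):

    // Illustrative output settings; adjust to your capture preset
    NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                     AVVideoWidthKey  : @640,
                                     AVVideoHeightKey : @480 };
    AVAssetWriterInput *writerInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;
    // Tag the track with a 90° rotation instead of rotating each buffer
    writerInput.transform = CGAffineTransformMakeRotation(M_PI_2);

Keep in mind, as discussed below, that this rotates only via metadata, which some non-Apple players ignore.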

So, the solution posted by Aaron Vegh using an AVAssetExportSession works, but is not needed. As the Apple docs say, if you'd like the orientation set correctly so it plays in non-Apple QuickTime players such as VLC, or on the web using Chrome, you must set the video orientation on the AVCaptureConnection for the AVCaptureVideoDataOutput. If you try to set it on the AVAssetWriterInput, you will get an incorrect orientation in players like VLC and Chrome.

Here is my code, where I set it while setting up the capture session:

    // DECLARED AS PROPERTIES ABOVE
    @property (strong, nonatomic) AVCaptureDeviceInput *audioIn;
    @property (strong, nonatomic) AVCaptureAudioDataOutput *audioOut;
    @property (strong, nonatomic) AVCaptureDeviceInput *videoIn;
    @property (strong, nonatomic) AVCaptureVideoDataOutput *videoOut;
    @property (strong, nonatomic) AVCaptureConnection *audioConnection;
    @property (strong, nonatomic) AVCaptureConnection *videoConnection;
    // ------------------------------------------------------------------
    // ------------------------------------------------------------------

    - (void)setupCaptureSession {
        // Setup Session
        self.session = [[AVCaptureSession alloc] init];
        [self.session setSessionPreset:AVCaptureSessionPreset640x480];

        // Create Audio connection ----------------------------------------
        self.audioIn = [[AVCaptureDeviceInput alloc] initWithDevice:[self getAudioDevice] error:nil];
        if ([self.session canAddInput:self.audioIn]) {
            [self.session addInput:self.audioIn];
        }
        self.audioOut = [[AVCaptureAudioDataOutput alloc] init];
        dispatch_queue_t audioCaptureQueue = dispatch_queue_create("Audio Capture Queue", DISPATCH_QUEUE_SERIAL);
        [self.audioOut setSampleBufferDelegate:self queue:audioCaptureQueue];
        if ([self.session canAddOutput:self.audioOut]) {
            [self.session addOutput:self.audioOut];
        }
        self.audioConnection = [self.audioOut connectionWithMediaType:AVMediaTypeAudio];

        // Create Video connection ----------------------------------------
        self.videoIn = [[AVCaptureDeviceInput alloc] initWithDevice:[self videoDeviceWithPosition:AVCaptureDevicePositionBack] error:nil];
        if ([self.session canAddInput:self.videoIn]) {
            [self.session addInput:self.videoIn];
        }
        self.videoOut = [[AVCaptureVideoDataOutput alloc] init];
        [self.videoOut setAlwaysDiscardsLateVideoFrames:NO];
        [self.videoOut setVideoSettings:nil];
        dispatch_queue_t videoCaptureQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
        [self.videoOut setSampleBufferDelegate:self queue:videoCaptureQueue];
        if ([self.session canAddOutput:self.videoOut]) {
            [self.session addOutput:self.videoOut];
        }
        self.videoConnection = [self.videoOut connectionWithMediaType:AVMediaTypeVideo];

        // SET THE ORIENTATION HERE -------------------------------------------------
        [self.videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
        // --------------------------------------------------------------------------

        // Create Preview Layer -------------------------------------------
        AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
        CGRect bounds = self.videoView.bounds;
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        previewLayer.bounds = bounds;
        previewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
        [self.videoView.layer addSublayer:previewLayer];

        // Start session
        [self.session startRunning];
    }
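The `getAudioDevice` and `videoDeviceWithPosition:` helpers referenced above aren't shown in the original; here is a plausible sketch of what they might look like (my assumption, not the original author's code):

    // Hypothetical implementations of the device-lookup helpers used above
    - (AVCaptureDevice *)getAudioDevice {
        return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    }

    - (AVCaptureDevice *)videoDeviceWithPosition:(AVCaptureDevicePosition)position {
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == position) {
                return device;
            }
        }
        return nil;
    }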

In case anyone else is looking for this answer as well, this is the method that I cooked up (modified a bit for simplicity):

    - (void)encodeVideoOrientation:(NSURL *)anOutputFileURL {
        CGAffineTransform rotationTransform;
        CGAffineTransform rotateTranslate;
        CGSize renderSize;

        switch (self.recordingOrientation) {
            // set these 3 values based on orientation
        }

        AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:anOutputFileURL options:nil];
        AVAssetTrack *sourceVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableCompositionTrack *compositionVideoTrack =
            [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                     preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                       ofTrack:sourceVideoTrack
                                        atTime:kCMTimeZero
                                         error:nil];
        [compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform];
        AVMutableCompositionTrack *compositionAudioTrack =
            [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                     preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                       ofTrack:sourceAudioTrack
                                        atTime:kCMTimeZero
                                         error:nil];

        AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        AVMutableVideoCompositionLayerInstruction *layerInstruction =
            [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];
        [layerInstruction setTransform:rotateTranslate atTime:kCMTimeZero];

        AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
        videoComposition.frameDuration = CMTimeMake(1, 30);
        videoComposition.renderScale = 1.0;
        videoComposition.renderSize = renderSize;
        instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
        videoComposition.instructions = [NSArray arrayWithObject:instruction];

        AVAssetExportSession *assetExport =
            [[AVAssetExportSession alloc] initWithAsset:composition
                                             presetName:AVAssetExportPresetMediumQuality];
        NSString *videoName = @"export.mov";
        NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
        NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
        if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
            [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
        }
        assetExport.outputFileType = AVFileTypeMPEG4;
        assetExport.outputURL = exportUrl;
        assetExport.shouldOptimizeForNetworkUse = YES;
        assetExport.videoComposition = videoComposition;

        [assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
            switch (assetExport.status) {
                case AVAssetExportSessionStatusCompleted:
                    // export complete
                    NSLog(@"Export Complete");
                    break;
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Export Failed");
                    NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
                    // export error (see exportSession.error)
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export Cancelled");
                    NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
                    // export cancelled
                    break;
            }
        }];
    }
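To give an idea of what the elided switch might contain, here is one possible case body, assuming `recordingOrientation` is an `AVCaptureVideoOrientation` and the source frames are 640x480 (these values are illustrative, not from the original):

    // Hypothetical case body: rotate landscape 640x480 frames into portrait
    case AVCaptureVideoOrientationPortrait:
        rotationTransform = CGAffineTransformMakeRotation(M_PI_2);
        // Rotation maps x into [-480, 0], so shift the frame back into view
        rotateTranslate = CGAffineTransformConcat(rotationTransform,
                                                  CGAffineTransformMakeTranslation(480, 0));
        renderSize = CGSizeMake(480, 640); // swap width and height
        break;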

Unfortunately this stuff is badly documented, but by stringing together examples from other SO questions and reading the header files, I was able to get this working. Hope this helps anyone else!

Use the following method to build an AVMutableVideoComposition with the correct orientation based on the video asset's orientation:

    - (AVMutableVideoComposition *)getVideoComposition:(AVAsset *)asset {
        AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
        CGSize videoSize = videoTrack.naturalSize;
        BOOL isPortrait_ = [self isVideoPortrait:asset];
        if (isPortrait_) {
            NSLog(@"video is portrait ");
            // Swap width and height so the render size matches the rotated frames
            videoSize = CGSizeMake(videoSize.height, videoSize.width);
        }
        composition.naturalSize = videoSize;
        videoComposition.renderSize = videoSize;
        // videoComposition.renderSize = videoTrack.naturalSize;
        videoComposition.frameDuration = CMTimeMakeWithSeconds(1 / videoTrack.nominalFrameRate, 600);

        AVMutableCompositionTrack *compositionVideoTrack;
        compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                          preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                       ofTrack:videoTrack
                                        atTime:kCMTimeZero
                                         error:nil];
        AVMutableVideoCompositionLayerInstruction *layerInst;
        layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
        [layerInst setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];
        AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
        inst.layerInstructions = [NSArray arrayWithObject:layerInst];
        videoComposition.instructions = [NSArray arrayWithObject:inst];
        return videoComposition;
    }

    - (BOOL)isVideoPortrait:(AVAsset *)asset {
        BOOL isPortrait = FALSE;
        NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        if ([tracks count] > 0) {
            AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
            CGAffineTransform t = videoTrack.preferredTransform;
            // Portrait
            if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
                isPortrait = YES;
            }
            // PortraitUpsideDown
            if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
                isPortrait = YES;
            }
            // LandscapeRight
            if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
                isPortrait = FALSE;
            }
            // LandscapeLeft
            if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
                isPortrait = FALSE;
            }
        }
        return isPortrait;
    }
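Applying it is then just a matter of handing the result to an export session, roughly like this (a usage sketch; `sourceURL` and `exportURL` are placeholders for your own file locations):

    // sourceURL / exportURL are placeholders for your input and output files
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:sourceURL options:nil];
    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetMediumQuality];
    exportSession.videoComposition = [self getVideoComposition:asset];
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.outputURL = exportURL;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        NSLog(@"Export finished with status %ld", (long)exportSession.status);
    }];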

Since iOS 5 you can request rotated CVPixelBuffers using AVCaptureVideoDataOutput, as documented here. This gives you the correct orientation without having to re-process the video with an AVAssetExportSession.
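In practice that boils down to setting the orientation on the data output's video connection before recording, e.g. (a minimal sketch; `videoDataOutput` is assumed to be an AVCaptureVideoDataOutput already added to your session):

    AVCaptureConnection *connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.isVideoOrientationSupported) {
        // Ask the capture pipeline for physically rotated buffers (iOS 5+)
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }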