How do I add overlay text to a video and then re-encode it?

I want to edit a video from my iOS app. I want to overlay subtitle text onto a source video and then save the video with the text burned in. The text is not just for display purposes: when I open the edited video later, it should show the updated video.

Is this possible in an iOS app? If so, how?

```objc
- (void)addAnimation
{
    NSString *filePath = [[NSBundle mainBundle] pathForResource:videoName ofType:ext];
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:filePath] options:nil];
    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    AVMutableCompositionTrack *compositionVideoTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:clipVideoTrack
                                    atTime:kCMTimeZero
                                     error:nil];
    [compositionVideoTrack setPreferredTransform:clipVideoTrack.preferredTransform];

    CGSize videoSize = [clipVideoTrack naturalSize];

    // Watermark image layer
    UIImage *myImage = [UIImage imageNamed:@"29.png"];
    CALayer *aLayer = [CALayer layer];
    aLayer.contents = (id)myImage.CGImage;
    aLayer.frame = CGRectMake(videoSize.width - 65, videoSize.height - 75, 57, 57);
    aLayer.opacity = 0.65;

    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
    videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:aLayer];

    // Text overlay
    CATextLayer *titleLayer = [CATextLayer layer];
    titleLayer.string = @"Text goes here";
    titleLayer.font = CFBridgingRetain(@"Helvetica");
    titleLayer.fontSize = videoSize.height / 6; //??
    titleLayer.shadowOpacity = 0.5;
    titleLayer.alignmentMode = kCAAlignmentCenter;
    titleLayer.bounds = CGRectMake(0, 0, videoSize.width, videoSize.height / 6); // You may need to adjust this for proper display
    [parentLayer addSublayer:titleLayer]; // ONLY IF WE ADDED TEXT

    AVMutableVideoComposition *videoComp = [AVMutableVideoComposition videoComposition];
    videoComp.renderSize = videoSize;
    videoComp.frameDuration = CMTimeMake(1, 30);
    videoComp.animationTool =
        [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                                                                     inLayer:parentLayer];

    AVMutableVideoCompositionInstruction *instruction =
        [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
    AVAssetTrack *videoTrack = [[mixComposition tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVMutableVideoCompositionLayerInstruction *layerInstruction =
        [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
    videoComp.instructions = [NSArray arrayWithObject:instruction];

    AVAssetExportSession *assetExport =
        [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                         presetName:AVAssetExportPresetHighestQuality]; // or AVAssetExportPresetPassthrough
    assetExport.videoComposition = videoComp;

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *videoPath = [NSString stringWithFormat:@"%@/mynewwatermarkedvideo.mp4", documentsDirectory];
    NSURL *exportUrl = [NSURL fileURLWithPath:videoPath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:videoPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:videoPath error:nil];
    }

    assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    assetExport.outputURL = exportUrl;
    assetExport.shouldOptimizeForNetworkUse = YES;

    [assetExport exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self exportDidFinish:assetExport];
        });
    }];
}

- (void)exportDidFinish:(AVAssetExportSession *)session
{
    NSURL *exportUrl = session.outputURL;
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:exportUrl]) {
        [library writeVideoAtPathToSavedPhotosAlbum:exportUrl
                                    completionBlock:^(NSURL *assetURL, NSError *error) {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (error) {
                    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                                    message:@"Video Saving Failed"
                                                                   delegate:nil
                                                          cancelButtonTitle:@"OK"
                                                          otherButtonTitles:nil];
                    [alert show];
                } else {
                    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Video Saved"
                                                                    message:@"Saved To Photo Album"
                                                                   delegate:self
                                                          cancelButtonTitle:@"OK"
                                                          otherButtonTitles:nil];
                    [alert show];
                }
            });
        }];
    }
    NSLog(@"Completed");
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"AlertView"
                                                    message:@"Video is edited successfully."
                                                   delegate:self
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    [alert show];
}
```

One approach is to create your text overlay as a Core Animation CATextLayer, attach it to the videoComposition of an AVAssetExportSession, and then export your video. The resulting video will have the overlay burned into it.

This brings a few benefits:

  1. You are not limited to CATextLayer – you can build a CALayer tree containing CAGradientLayer, CAShapeLayer, and so on.
  2. Since these are Core Animation layers, many of their properties are animatable, so you get smooth iOS-style animations in your video for free.
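As a minimal sketch of point 1 and 2 above (the gradient colors, sizes, and animation values here are illustrative, not from the original post), you could extend the `parentLayer` tree from the question's code with a gradient backdrop and animate the title layer's opacity. Note that animations destined for a video export should start at `AVCoreAnimationBeginTimeAtZero` rather than 0:

```objc
// Assumes videoSize, parentLayer, and titleLayer exist as in the code above.
// A gradient strip behind the title (illustrative values).
CAGradientLayer *gradient = [CAGradientLayer layer];
gradient.frame = CGRectMake(0, 0, videoSize.width, videoSize.height / 6);
gradient.colors = @[ (id)[UIColor blackColor].CGColor,
                     (id)[UIColor clearColor].CGColor ];
[parentLayer insertSublayer:gradient below:titleLayer];

// Fade the title in over the first second of the video.
CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
fade.fromValue = @0.0;
fade.toValue = @1.0;
fade.duration = 1.0;
// beginTime 0 would be remapped to CACurrentMediaTime(); use this constant for exports.
fade.beginTime = AVCoreAnimationBeginTimeAtZero;
fade.removedOnCompletion = NO;
fade.fillMode = kCAFillModeForwards;
[titleLayer addAnimation:fade forKey:@"fadeIn"];
```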

Sounds good, right? There is one side effect: depending on the export preset you use, your video will inevitably be re-encoded at a fixed frame rate – for me it was 30fps. To keep file sizes small I had deliberately lowered the frame rate by dropping redundant frames, so paying that cost for a static banner was a deal-breaker for me.
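One way to at least keep the output frame rate close to the source (a hedged sketch; it does not avoid the re-encode itself) is to derive `frameDuration` from the source track's `nominalFrameRate` instead of hard-coding 30fps as the code above does:

```objc
// Assumes clipVideoTrack and videoComp exist as in the code above.
// nominalFrameRate can be 0 for some assets, so guard before using it.
float fps = clipVideoTrack.nominalFrameRate;
if (fps > 0) {
    videoComp.frameDuration = CMTimeMake(1, (int32_t)roundf(fps));
}
```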

There is some Apple sample code called AVEditDemo that demonstrates this functionality, among other things. There are instructions for finding it here.

With Chaitali Jain's code, the new video is saved without audio. Does anyone have an idea about this issue? Thanks!
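The audio is lost because the composition in the code above only inserts the video track. A sketch of a likely fix (assuming the source asset actually contains an audio track) is to copy the audio into `mixComposition` as well, right after the video track is inserted in `addAnimation`:

```objc
// Assumes videoAsset and mixComposition exist as in the code above.
// Copy the first audio track of the source into the composition so the export keeps sound.
NSArray *audioTracks = [videoAsset tracksWithMediaType:AVMediaTypeAudio];
if (audioTracks.count > 0) {
    AVMutableCompositionTrack *compositionAudioTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[audioTracks objectAtIndex:0]
                                    atTime:kCMTimeZero
                                     error:nil];
}
```

The `videoComposition` only describes how video frames are rendered; audio tracks present in the composition pass through to the export untouched.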