How to add text to a video on iPhone

I want to add a text string over the video that an MPMoviePlayerViewController is playing, so that the text becomes part of the video itself. That way, when I post the video to Facebook or Twitter, the text must appear on top of the video.

To do this, I tried grabbing every frame of the video, drawing the text onto each frame, and then building a new video out of all those frames. With this approach, however, I run into memory problems and the app crashes on the device.

    - (NSArray*)getVideoFramesFromMovieController:(MPMoviePlayerViewController*)mpMoviePlayerVC
    {
        NSLog(@"Getting frames from a video asset.");
        // videoFrames = [NSMutableArray array];
        NSMutableArray *videoFrames = [NSMutableArray array];
        for (float i = 0; i <= mpMoviePlayerVC.moviePlayer.duration; )
        {
            UIImage *singleFrameImage = [mpMoviePlayerVC.moviePlayer thumbnailImageAtTime:i
                                                                               timeOption:MPMovieTimeOptionExact];
            [videoFrames addObject:singleFrameImage];
            NSLog(@"Got frame number : %d", [videoFrames count]);
            i = i + (1 / self.frameRate); // frame capturing interval, i.e. 15 fps (self.frameRate)
        }
        NSLog(@"Total frames: %d", [videoFrames count]);
        return [NSArray arrayWithArray:videoFrames];
    }

The method above gives me all the frames; I then draw a text string, say "Hello", onto each of them, and finally build a video from all of those frames.
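The drawing step itself isn't shown in the question. A minimal sketch of stamping a string onto a single frame might look like this (my own illustration, not the asker's code; the font, colour and position are arbitrary assumptions):

    // Draws `text` over `frame` and returns a new (autoreleased) UIImage.
    - (UIImage *)imageByDrawingText:(NSString *)text onFrame:(UIImage *)frame
    {
        UIGraphicsBeginImageContextWithOptions(frame.size, YES, frame.scale);
        [frame drawInRect:CGRectMake(0, 0, frame.size.width, frame.size.height)];

        // Arbitrary styling; adjust font, colour and position as needed.
        [[UIColor whiteColor] set];
        UIFont *font = [UIFont boldSystemFontOfSize:24];
        [text drawInRect:CGRectMake(10, 10, frame.size.width - 20, 40) withFont:font];

        UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return result;
    }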

    -(void)writeImageAsMovie:(NSArray *)array toPath:(NSString*)path size:(CGSize)size
    {
        NSLog(@"Inside writeImageAsMovie method.");
        NSError *error = nil;
        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                               fileType:AVFileTypeMPEG4
                                                                  error:&error];
        NSParameterAssert(videoWriter);

        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                       [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                       nil];
        AVAssetWriterInput *writerInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                              outputSettings:videoSettings] retain];
        AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                         assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                         sourcePixelBufferAttributes:nil];
        NSParameterAssert(writerInput);
        NSParameterAssert([videoWriter canAddInput:writerInput]);
        [videoWriter addInput:writerInput];

        // Start a session:
        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:kCMTimeZero];

        dispatch_queue_t dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);
        int __block frame = 0;

        [writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
            while ([writerInput isReadyForMoreMediaData])
            {
                NSLog(@"Total frames to be written: %d", [array count]);
                if (++frame >= [array count]) // total frames
                {
                    [writerInput markAsFinished];
                    [videoWriter finishWriting];
                    [videoWriter release];
                    break;
                }

                CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[[array objectAtIndex:frame] CGImage] andSize:size];
                if (buffer)
                {
                    if (![adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, self.frameRate)])
                        NSLog(@"FAIL");
                    else
                        NSLog(@"Success:%d", frame);
                    CFRelease(buffer);
                }
            }
        }];
        NSLog(@"outside for loop");
        [self performSelector:@selector(waitTillVideoFinishes) withObject:nil afterDelay:20.0];
    }
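The writer above calls a pixelBufferFromCGImage:andSize: helper that isn't shown in the question. A common implementation of such a helper looks roughly like this (a sketch in the same MRC style, not the asker's actual code; the caller is expected to CFRelease the returned buffer, as the block above does):

    - (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image andSize:(CGSize)size
    {
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], (id)kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], (id)kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 nil];
        CVPixelBufferRef pxbuffer = NULL;
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                                              kCVPixelFormatType_32ARGB,
                                              (CFDictionaryRef)options, &pxbuffer);
        if (status != kCVReturnSuccess || pxbuffer == NULL)
            return NULL;

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                     CVPixelBufferGetBytesPerRow(pxbuffer),
                                                     rgbColorSpace, kCGImageAlphaNoneSkipFirst);
        // Draw the CGImage into the pixel buffer's backing memory.
        CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image);
        CGContextRelease(context);
        CGColorSpaceRelease(rgbColorSpace);

        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
        return pxbuffer; // owned by the caller (CFRelease when done)
    }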

It works fine on the Mac, but crashes on the device because of memory problems.
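One likely reason for the crash is that every frame is kept in memory as a UIImage before anything is written. A more memory-friendly variant (my suggestion, not code from the question) would generate, stamp and append one frame at a time inside an autorelease pool instead of building the array first; a rough sketch, assuming the player, adaptor, writerInput and self.frameRate from the code above and the drawing helper sketched earlier:

    // Sketch: append frames one by one instead of collecting them all in an NSArray.
    int frame = 0;
    for (float t = 0; t <= moviePlayer.duration; t += 1.0f / self.frameRate)
    {
        @autoreleasepool {
            UIImage *frameImage = [moviePlayer thumbnailImageAtTime:t timeOption:MPMovieTimeOptionExact];
            UIImage *stamped = [self imageByDrawingText:@"Hello" onFrame:frameImage];

            CVPixelBufferRef buffer = [self pixelBufferFromCGImage:stamped.CGImage andSize:stamped.size];
            if (buffer)
            {
                while (![writerInput isReadyForMoreMediaData])
                    [NSThread sleepForTimeInterval:0.05]; // crude back-pressure; fine for a sketch
                [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(frame, self.frameRate)];
                CFRelease(buffer);
            }
            frame++;
        }
    }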

I have also tried various other ways of watermarking text onto the video, but couldn't get any of them to work. Thanks in advance.

I know this is an old question, but I just had the same requirement and couldn't find any really quick and simple solution (so that I could quickly test whether this is even possible), until I found this GPUImage issue: #110.

Brad Larson suggests there that, using the GPUImage framework, you can record video with existing filters and/or UIKit elements blended over it.
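For context, the idea behind GPUImage's UI-element demo is roughly this wiring: a GPUImageUIElement renders a UIKit view (e.g. a UILabel) as a texture, and a GPUImageAlphaBlendFilter composites it over the camera feed. A minimal sketch, paraphrased from the GPUImage examples (the exact variable names in FilterShowcase differ):

    // Camera input plus a UILabel blended over it via GPUImage.
    GPUImageVideoCamera *videoCamera =
        [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                             cameraPosition:AVCaptureDevicePositionBack];
    videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

    UILabel *textLabel = [[UILabel alloc] initWithFrame:CGRectMake(0, 0, 480, 640)];
    textLabel.text = @"Hello";
    textLabel.textColor = [UIColor whiteColor];
    textLabel.backgroundColor = [UIColor clearColor];

    GPUImageUIElement *uiElement = [[GPUImageUIElement alloc] initWithView:textLabel];

    GPUImageFilter *passthroughFilter = [[GPUImageFilter alloc] init];
    GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
    blendFilter.mix = 1.0;

    [videoCamera addTarget:passthroughFilter];
    [passthroughFilter addTarget:blendFilter];   // first input: camera frames
    [uiElement addTarget:blendFilter];           // second input: the rendered UIKit view

    // Re-render the UIKit element once per camera frame so changes show up.
    [passthroughFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
        [uiElement update];
    }];

    [videoCamera startCameraCapture];
    // blendFilter's output then goes to a GPUImageView and/or the GPUImageMovieWriter shown below.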

So, based on the information in #110, I combined GPUImage's FilterShowcase project with its SimpleVideoFilter project to quickly test whether this works.

And of course, it works.

So I am writing this answer to give a quick example of how to do it:

Open the FilterShowcase project,

open the ShowcaseFilterViewController.m file and find else if (filterType == GPUIMAGE_UIELEMENT)

and after the code you find there, add this code:

    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
    unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
    NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

    GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
    [blendFilter addTarget:movieWriter];

    [videoCamera startCameraCapture];

    double delayToStartRecording = 0.5;
    dispatch_time_t startTime2 = dispatch_time(DISPATCH_TIME_NOW, delayToStartRecording * NSEC_PER_SEC);
    dispatch_after(startTime2, dispatch_get_main_queue(), ^(void){
        NSLog(@"Start recording");

        videoCamera.audioEncodingTarget = movieWriter;
        [movieWriter startRecording];

        double delayInSeconds = 5.0;
        dispatch_time_t stopTime = dispatch_time(DISPATCH_TIME_NOW, delayInSeconds * NSEC_PER_SEC);
        dispatch_after(stopTime, dispatch_get_main_queue(), ^(void){
            [blendFilter removeTarget:movieWriter];
            videoCamera.audioEncodingTarget = nil;
            [movieWriter finishRecording];
            NSLog(@"Movie completed");

            UISaveVideoAtPathToSavedPhotosAlbum(pathToMovie, nil, NULL, NULL);
        });
    });

    [videoCamera startCameraCapture];
    return;

After that, build the FilterShowcase project on an iPhone and select UI element from the list. Then, after about 7-10 seconds, a video will be saved to your photo library.

This code adds a text string onto the video; once the video is saved, you can play it in any player. The big advantage of this code is that the resulting video keeps its sound.

    #import <AVFoundation/AVFoundation.h>

    -(void)MixVideoWithText
    {
        AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:url options:nil];
        AVMutableComposition *mixComposition = [AVMutableComposition composition];

        AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

        AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        AVAssetTrack *clipAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

        // If you need audio as well, add the asset track for audio here
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:clipVideoTrack atTime:kCMTimeZero error:nil];
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];

        [compositionVideoTrack setPreferredTransform:[[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform]];

        CGSize sizeOfVideo = [videoAsset naturalSize];
        //NSLog(@"sizeOfVideo.width is %f", sizeOfVideo.width);
        //NSLog(@"sizeOfVideo.height is %f", sizeOfVideo.height);

        // The text layer defines the text you want to add to the video
        CATextLayer *textOfvideo = [[CATextLayer alloc] init];
        textOfvideo.string = [NSString stringWithFormat:@"%@", text]; // text is the string you want to add to the video
        [textOfvideo setFont:(__bridge CFTypeRef)([UIFont fontWithName:[NSString stringWithFormat:@"%@", fontUsed] size:13])]; // fontUsed is the name of the font
        [textOfvideo setFrame:CGRectMake(0, 0, sizeOfVideo.width, sizeOfVideo.height/6)];
        [textOfvideo setAlignmentMode:kCAAlignmentCenter];
        [textOfvideo setForegroundColor:[selectedColour CGColor]];

        CALayer *optionalLayer = [CALayer layer];
        [optionalLayer addSublayer:textOfvideo];
        optionalLayer.frame = CGRectMake(0, 0, sizeOfVideo.width, sizeOfVideo.height);
        [optionalLayer setMasksToBounds:YES];

        CALayer *parentLayer = [CALayer layer];
        CALayer *videoLayer = [CALayer layer];
        parentLayer.frame = CGRectMake(0, 0, sizeOfVideo.width, sizeOfVideo.height);
        videoLayer.frame = CGRectMake(0, 0, sizeOfVideo.width, sizeOfVideo.height);
        [parentLayer addSublayer:videoLayer];
        [parentLayer addSublayer:optionalLayer];

        AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
        videoComposition.frameDuration = CMTimeMake(1, 10);
        videoComposition.renderSize = sizeOfVideo;
        videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

        AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
        AVAssetTrack *videoTrack = [[mixComposition tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
        instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
        videoComposition.instructions = [NSArray arrayWithObject:instruction];

        NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
        [dateFormatter setDateFormat:@"yyyy-MM-dd_HH-mm-ss"];
        NSString *destinationPath = [documentsDirectory stringByAppendingFormat:@"/output_%@.mov", [dateFormatter stringFromDate:[NSDate date]]];

        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
        exportSession.videoComposition = videoComposition;
        exportSession.outputURL = [NSURL fileURLWithPath:destinationPath];
        exportSession.outputFileType = AVFileTypeQuickTimeMovie;

        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            switch (exportSession.status)
            {
                case AVAssetExportSessionStatusCompleted:
                    NSLog(@"Export OK");
                    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(destinationPath)) {
                        UISaveVideoAtPathToSavedPhotosAlbum(destinationPath, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
                    }
                    break;
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"AVAssetExportSessionStatusFailed: %@", exportSession.error);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export Cancelled");
                    break;
                default:
                    break;
            }
        }];
    }
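Note that MixVideoWithText relies on a few instance variables set elsewhere in the answerer's class: url (the source video), text (the string to draw), fontUsed (a font name) and selectedColour (a UIColor). A hypothetical setup before calling it could look like this (illustrative values only; adapt to your own project):

    // Hypothetical values for the ivars the method expects.
    url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"sample" ofType:@"mov"]];
    text = @"Hello";
    fontUsed = @"Helvetica";
    selectedColour = [UIColor whiteColor];
    [self MixVideoWithText];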

The callback below reports any error that occurs while saving the video.

    -(void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
    {
        if (error)
            NSLog(@"Finished saving video with error: %@", error);
    }