Saving a video with a GIF image overlay

I am making a video-recording application. After recording finishes, I place a GIF image on top of the video using a library.

This is my code for playing the video with the GIF image as an overlay:

    self.avPlayer = [AVPlayer playerWithURL:self.urlstring];
    self.avPlayer.actionAtItemEnd = AVPlayerActionAtItemEndNone;

    AVPlayerLayer *videoLayer = [AVPlayerLayer playerLayerWithPlayer:self.avPlayer];
    videoLayer.frame = self.preview_view.bounds;
    videoLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.preview_view.layer addSublayer:videoLayer];

    NSURL *url = [[NSBundle mainBundle] URLForResource:@"02" withExtension:@"gif"];
    self.img_gif.image = [UIImage animatedImageWithAnimatedGIFData:[NSData dataWithContentsOfURL:url]];

But now I want to merge the GIF overlay into the video and save the result as a single file. I googled but could not find what I am looking for.

Thanks for your help.

[GIF image]

This is the best approach I found for merging a video with a GIF image:

    - (void)mixVideoAsset:(AVAsset *)videoAsset {
        NSDate *begin = [NSDate date];

        // 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
        AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

        // 3 - Video track
        AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                            preferredTrackID:kCMPersistentTrackID_Invalid];
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:nil];

        // - Audio
        AVMutableCompositionTrack *audioCompositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
        AVAssetTrack *audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioTrack.timeRange.duration)
                                       ofTrack:audioTrack
                                        atTime:kCMTimeZero
                                         error:nil];

        // 3.1 - Create AVMutableVideoCompositionInstruction
        AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);

        // 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
        AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
        AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
        BOOL isVideoAssetPortrait_ = NO;
        CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
        if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
            videoAssetOrientation_ = UIImageOrientationRight;
            isVideoAssetPortrait_ = YES;
        }
        if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
            videoAssetOrientation_ = UIImageOrientationLeft;
            isVideoAssetPortrait_ = YES;
        }
        if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
            videoAssetOrientation_ = UIImageOrientationUp;
        }
        if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
            videoAssetOrientation_ = UIImageOrientationDown;
        }
        [videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
        [videolayerInstruction setOpacity:0.0 atTime:videoAsset.duration];

        // 3.3 - Add instructions
        mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction, nil];
        AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
        CGSize naturalSize;
        if (isVideoAssetPortrait_) {
            naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
        } else {
            naturalSize = videoAssetTrack.naturalSize;
        }
        float renderWidth, renderHeight;
        renderWidth = naturalSize.width;
        renderHeight = naturalSize.height;
        mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
        mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
        mainCompositionInst.frameDuration = CMTimeMake(1, 30);

        // Watermark layers
        [self applyVideoEffectsToComposition:mainCompositionInst size:naturalSize];

        // 4 - Get path
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                                [NSString stringWithFormat:@"FinalVideo-%d.mov", arc4random() % 1000]];
        NSURL *url = [NSURL fileURLWithPath:myPathDocs];

        // 5 - Create exporter
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                          presetName:AVAssetExportPresetHighestQuality];
        exporter.outputURL = url;
        exporter.outputFileType = AVFileTypeMPEG4;
        exporter.shouldOptimizeForNetworkUse = YES;
        exporter.videoComposition = mainCompositionInst;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                NSDate *endDate = [NSDate date];
                NSTimeInterval interval = [endDate timeIntervalSinceDate:begin];
                NSLog(@"completed in %f seconds", interval);
                ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
                if ([assetsLibrary videoAtPathIsCompatibleWithSavedPhotosAlbum:exporter.outputURL]) {
                    [assetsLibrary writeVideoAtPathToSavedPhotosAlbum:exporter.outputURL completionBlock:NULL];
                }
            });
        }];
    }

    - (void)applyVideoEffectsToComposition:(AVMutableVideoComposition *)composition size:(CGSize)size {
        // - set up the parent layer
        CALayer *parentLayer = [CALayer layer];
        CALayer *videoLayer = [CALayer layer];
        parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
        videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
        [parentLayer addSublayer:videoLayer];

        size.width = 100;
        size.height = 100;

        // - set up the overlay
        CALayer *overlayLayer = [CALayer layer];
        overlayLayer.frame = CGRectMake(0, 100, size.width, size.height);
        NSURL *fileUrl = [[NSBundle mainBundle] URLForResource:@"jiafei" withExtension:@"gif"];
        [self startGifAnimationWithURL:fileUrl inLayer:overlayLayer];
        [parentLayer addSublayer:overlayLayer];

        // - apply magic
        composition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    }

    - (void)startGifAnimationWithURL:(NSURL *)url inLayer:(CALayer *)layer {
        CAKeyframeAnimation *animation = [self animationForGifWithURL:url];
        [layer addAnimation:animation forKey:@"contents"];
    }

    - (CAKeyframeAnimation *)animationForGifWithURL:(NSURL *)url {
        CAKeyframeAnimation *animation = [CAKeyframeAnimation animationWithKeyPath:@"contents"];
        NSMutableArray *frames = [NSMutableArray new];
        NSMutableArray *delayTimes = [NSMutableArray new];
        CGFloat totalTime = 0.0;
        CGFloat gifWidth;
        CGFloat gifHeight;
        CGImageSourceRef gifSource = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);

        // get frame count
        size_t frameCount = CGImageSourceGetCount(gifSource);
        for (size_t i = 0; i < frameCount; ++i) {
            // get each frame
            CGImageRef frame = CGImageSourceCreateImageAtIndex(gifSource, i, NULL);
            [frames addObject:(__bridge id)frame];
            CGImageRelease(frame);

            // get gif info for each frame; CFBridgingRelease hands ownership to ARC,
            // so no manual CFRelease of dict is needed (the original snippet's extra
            // CFRelease was an over-release and has been removed)
            NSDictionary *dict = (NSDictionary *)CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(gifSource, i, NULL));
            NSLog(@"kCGImagePropertyGIFDictionary %@", [dict valueForKey:(NSString *)kCGImagePropertyGIFDictionary]);

            // get gif size
            gifWidth = [[dict valueForKey:(NSString *)kCGImagePropertyPixelWidth] floatValue];
            gifHeight = [[dict valueForKey:(NSString *)kCGImagePropertyPixelHeight] floatValue];

            // kCGImagePropertyGIFDelayTime and kCGImagePropertyGIFUnclampedDelayTime
            // in kCGImagePropertyGIFDictionary hold the same value
            NSDictionary *gifDict = [dict valueForKey:(NSString *)kCGImagePropertyGIFDictionary];
            [delayTimes addObject:[gifDict valueForKey:(NSString *)kCGImagePropertyGIFDelayTime]];
            totalTime = totalTime + [[gifDict valueForKey:(NSString *)kCGImagePropertyGIFDelayTime] floatValue];
        }
        if (gifSource) {
            CFRelease(gifSource);
        }

        NSMutableArray *times = [NSMutableArray arrayWithCapacity:frameCount];
        CGFloat currentTime = 0;
        NSInteger count = delayTimes.count;
        for (int i = 0; i < count; ++i) {
            [times addObject:[NSNumber numberWithFloat:(currentTime / totalTime)]];
            currentTime += [[delayTimes objectAtIndex:i] floatValue];
        }
        NSMutableArray *images = [NSMutableArray arrayWithCapacity:frameCount];
        for (int i = 0; i < count; ++i) {
            [images addObject:[frames objectAtIndex:i]];
        }
        animation.keyTimes = times;
        animation.values = images;
        animation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear];
        animation.duration = totalTime;
        animation.repeatCount = HUGE_VALF;
        animation.beginTime = AVCoreAnimationBeginTimeAtZero;
        animation.removedOnCompletion = NO;
        return animation;
    }
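For completeness, a minimal call site for `mixVideoAsset:` might look like the sketch below. It assumes `self.urlstring` from the question is the file URL of the recorded movie (both names are taken from the snippets in this post; this is untested and only illustrates the wiring):

    // Load the recorded movie as an asset and run the merge/export above.
    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:self.urlstring options:nil];
    [self mixVideoAsset:videoAsset];

The export runs asynchronously, so the saved file only appears in the photo library once the exporter's completion handler fires.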

You can also try any of the following screen-recording samples. They will merge your video and GIF.

You can download a sample from Apple at the link below: https://developer.apple.com/library/mac/samplecode/AVScreenShack/Introduction/Intro.html

https://github.com/alskipp/ASScreenRecorder

http://codethink.no-ip.org/wordpress/archives/673

Hope this helps.