Filtering video with GPUImage

I'm using GPUImage in my app and trying to filter video. Live video filtering works well. I run into trouble when I try to read a video from the file system into memory and apply a filter using the code posted on the Sunset Lake Software tutorial page and in the SimpleVideoFileFilter demo.

Edit: I realize my original post may not have asked a specific question. What I'm asking is: how do I read a video from disk into memory, apply a GPUImageFilter, and then overwrite the original file with the filtered version?

The app crashes with the following error:

-[AVAssetWriter startWriting] Cannot call method when status is 2

Status 2 is AVAssetWriterStatusCompleted. I've seen the same crash with the other three AVAssetWriterStatus values as well.
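For context, the `AVAssetWriterStatus` raw values run from 0 (`Unknown`) through 4 (`Cancelled`), and `startWriting` may only be called while the status is still `AVAssetWriterStatusUnknown`. A defensive guard (a sketch, not code from the post; `assetWriter` is illustrative) would look like:

    // AVAssetWriterStatusUnknown = 0, Writing = 1, Completed = 2,
    // Failed = 3, Cancelled = 4. startWriting is only legal from Unknown.
    if (assetWriter.status == AVAssetWriterStatusUnknown) {
        [assetWriter startWriting];
    } else {
        NSLog(@"Cannot start writer; status = %ld", (long)assetWriter.status);
    }

Hitting this assertion therefore means the writer instance has already been started (and finished) once, or a stale writer is being reused.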

I've posted the relevant code below.

    GPUImageFilter *selectedFilter = [self.allFilters objectAtIndex:indexPath.item];

    // get the file url I stored when the video was initially captured
    NSURL *url = [self.videoURLsByIndexPath objectForKey:self.indexPathForDisplayedImage];

    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:url];
    movieFile.runBenchmark = YES;
    movieFile.playAtActualSpeed = NO;
    [movieFile addTarget:selectedFilter]; // apply the user-selected filter to the file

    unlink([url.absoluteString UTF8String]); // delete the file that was at that file URL so it's writeable

    // A different movie writer than the one I was using for live video capture.
    GPUImageMovieWriter *editingMovieWriter =
        [[GPUImageMovieWriter alloc] initWithMovieURL:url size:CGSizeMake(640.0, 640.0)];
    [selectedFilter addTarget:editingMovieWriter];
    editingMovieWriter.shouldPassthroughAudio = YES;
    movieFile.audioEncodingTarget = editingMovieWriter;
    [movieFile enableSynchronizedEncodingUsingMovieWriter:editingMovieWriter];

    [editingMovieWriter startRecording];
    [movieFile startProcessing]; // Commenting out this line prevents crash

    // weak variables to prevent retain cycle
    __weak GPUImageMovieWriter *weakWriter = editingMovieWriter;
    __weak id weakSelf = self;

    [editingMovieWriter setCompletionBlock:^{
        [selectedFilter removeTarget:weakWriter];
        [weakWriter finishRecording];
        [weakSelf savePhotosToLibrary]; // use ALAssetsLibrary to write to camera roll
    }];

Perhaps my problem is the scope of editingMovieWriter. Or perhaps it's that I'm initializing a GPUImageMovie instance with the same URL I'm trying to write to. I've read several posts on the GPUImage GitHub issues page, several related questions on SO, the README, and the tutorial linked above.
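If the same-URL concern turns out to matter, one workaround (a sketch with a hypothetical temp filename, not code from the post) is to write the filtered movie to a temporary URL and swap the files in the completion block:

    // Sketch: write the filtered output to a temporary URL instead of
    // the source URL, then overwrite the original after recording ends.
    NSURL *tempURL = [NSURL fileURLWithPath:
        [NSTemporaryDirectory() stringByAppendingPathComponent:@"filtered.mov"]];
    [[NSFileManager defaultManager] removeItemAtURL:tempURL error:nil];

    GPUImageMovieWriter *editingMovieWriter =
        [[GPUImageMovieWriter alloc] initWithMovieURL:tempURL
                                                 size:CGSizeMake(640.0, 640.0)];
    __weak GPUImageMovieWriter *weakWriter = editingMovieWriter;
    [editingMovieWriter setCompletionBlock:^{
        [weakWriter finishRecording];
        // Overwrite the original with the filtered version.
        NSError *moveError = nil;
        [[NSFileManager defaultManager] removeItemAtURL:url error:nil];
        [[NSFileManager defaultManager] moveItemAtURL:tempURL
                                                toURL:url
                                                error:&moveError];
    }];

This keeps the source file readable for the entire duration of processing.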

Any insight into this problem would be greatly appreciated. Thanks.

There's at least one thing here that could be behind this. In the code above, you're not holding a strong reference to the movieFile source object.

If this is an ARC-enabled project, that object will be deallocated the instant you complete your setup method (and if it isn't, you'll leak the object). That will stop the movie playback, deallocate the movie itself, and lead to black frames being sent down the filter pipeline (among other potential instabilities).

You need to make movieFile a strongly referenced instance variable to make sure it hangs around past your setup method, since all of the movie processing is asynchronous.
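A minimal sketch of that fix, assuming an ARC project (the property placement and release point are illustrative):

    // Hold the source strongly as an instance variable so ARC doesn't
    // deallocate it while the asynchronous processing is still running.
    @property (nonatomic, strong) GPUImageMovie *movieFile;

    // In the setup method, assign to the property instead of a local variable:
    self.movieFile = [[GPUImageMovie alloc] initWithURL:url];
    [self.movieFile addTarget:selectedFilter];
    [self.movieFile startProcessing];

    // Release it once the writer finishes, via a weak self to avoid a cycle:
    __weak typeof(self) weakSelf = self;
    [editingMovieWriter setCompletionBlock:^{
        [weakWriter finishRecording];
        weakSelf.movieFile = nil;
    }];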

Here is a solution. Declare the instance variables:

    var movieFile: GPUImageMovie!
    var gpuImage: GPUImagePicture!
    var sourcePicture: GPUImagePicture!
    var sepiaFilter: GPUImageOutput!
    var sepiaFilter2: GPUImageInput!
    var movieWriter: GPUImageMovieWriter!
    var filter: GPUImageInput! // filter image

    func startWriting() {
        // Step 1 - show a progress HUD while processing
        let loadingNotification = MBProgressHUD.showHUDAddedTo(self.view, animated: true)
        loadingNotification.mode = MBProgressHUDMode.Indeterminate
        loadingNotification.labelText = "Loading"

        // Step 2 - set up the movie source and filter
        let documentsURL = NSFileManager.defaultManager()
            .URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0] as! NSURL
        let pathToMovie = documentsURL.URLByAppendingPathComponent("temp.mov")
        self.movieFile = GPUImageMovie(URL: pathToMovie)
        self.movieFile.runBenchmark = true
        self.movieFile.playAtActualSpeed = false
        self.filter = GPUImageGrayscaleFilter()
        self.movieFile.addTarget(self.filter)

        // Step 3 - set up the output URL, removing any stale file
        self.paths = documentsURL.URLByAppendingPathComponent("temp1.mov")
        NSFileManager.defaultManager().removeItemAtURL(self.paths, error: nil)

        // Step 4 - size the writer from the source track's natural size
        let anAsset = AVAsset.assetWithURL(pathToMovie) as! AVAsset
        let videoAssetTrack = anAsset.tracksWithMediaType(AVMediaTypeVideo)[0] as! AVAssetTrack
        let naturalSize = videoAssetTrack.naturalSize
        self.movieWriter = GPUImageMovieWriter(movieURL: self.paths, size: naturalSize)
        let input = self.filter as! GPUImageOutput
        input.addTarget(self.movieWriter)

        // Step 5 - pass through audio only if the asset has an audio track
        self.movieWriter.shouldPassthroughAudio = true
        if anAsset.tracksWithMediaType(AVMediaTypeAudio).count > 0 {
            self.movieFile.audioEncodingTarget = self.movieWriter
        } else {
            self.movieFile.audioEncodingTarget = nil
        }
        self.movieFile.enableSynchronizedEncodingUsingMovieWriter(self.movieWriter)

        // Step 6 - start recording and processing
        self.movieWriter.startRecording()
        self.movieFile.startProcessing()

        self.movieWriter.completionBlock = { () -> Void in
            self.movieWriter.finishRecording()
            self.obj.performWithAsset(self.paths)
        }

        let delayTime = dispatch_time(DISPATCH_TIME_NOW, Int64(15 * Double(NSEC_PER_SEC)))
        dispatch_after(delayTime, dispatch_get_main_queue()) {
            MBProgressHUD.hideAllHUDsForView(self.view, animated: true)
        }
        hasoutput = true
    }

Note that movieFile, filter, and movieWriter are all held as instance variables here, so they survive past the setup method while processing runs asynchronously.