Strange green lines around a cropped video on iOS

Hey everyone, I'm cropping a video shot with the camera on an iPhone and then playing the cropped result back. When I do, I get a strange green line along the bottom and right side of the video. I'm not sure why this is happening or how to fix it. Here is how I crop:

- (UIImageOrientation)getVideoOrientationFromAsset:(AVAsset *)asset {
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CGSize size = [videoTrack naturalSize];
    CGAffineTransform txf = [videoTrack preferredTransform];

    if (size.width == txf.tx && size.height == txf.ty)
        return UIImageOrientationLeft;  //return UIInterfaceOrientationLandscapeLeft;
    else if (txf.tx == 0 && txf.ty == 0)
        return UIImageOrientationRight; //return UIInterfaceOrientationLandscapeRight;
    else if (txf.tx == 0 && txf.ty == size.width)
        return UIImageOrientationDown;  //return UIInterfaceOrientationPortraitUpsideDown;
    else
        return UIImageOrientationUp;    //return UIInterfaceOrientationPortrait;
}

- (AVAssetExportSession *)applyCropToVideoWithAsset:(AVAsset *)asset
                                             AtRect:(CGRect)cropRect
                                        OnTimeRange:(CMTimeRange)cropTimeRange
                                        ExportToUrl:(NSURL *)outputUrl
                              ExistingExportSession:(AVAssetExportSession *)exporter
                                     WithCompletion:(void (^)(BOOL success, NSError *error, NSURL *videoUrl))completion
{
    //create an AVAssetTrack with our asset
    AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    //create a video composition and preset some settings
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 30);

    CGFloat cropOffX = cropRect.origin.x;
    CGFloat cropOffY = cropRect.origin.y;
    CGFloat cropWidth = cropRect.size.width;
    CGFloat cropHeight = cropRect.size.height;
    // NSLog(@"width: %f - height: %f - x: %f - y: %f", cropWidth, cropHeight, cropOffX, cropOffY);

    videoComposition.renderSize = CGSizeMake(cropWidth, cropHeight);

    //create a video instruction
    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = cropTimeRange;

    AVMutableVideoCompositionLayerInstruction *transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];

    UIImageOrientation videoOrientation = [self getVideoOrientationFromAsset:asset];

    CGAffineTransform t1 = CGAffineTransformIdentity;
    CGAffineTransform t2 = CGAffineTransformIdentity;

    switch (videoOrientation) {
        case UIImageOrientationUp:
            t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height - cropOffX, 0 - cropOffY);
            t2 = CGAffineTransformRotate(t1, M_PI_2);
            break;
        case UIImageOrientationDown:
            t1 = CGAffineTransformMakeTranslation(0 - cropOffX, clipVideoTrack.naturalSize.width - cropOffY); // not fixed: width is the real height when upside down
            t2 = CGAffineTransformRotate(t1, -M_PI_2);
            break;
        case UIImageOrientationRight:
            t1 = CGAffineTransformMakeTranslation(0 - cropOffX, 0 - cropOffY);
            t2 = CGAffineTransformRotate(t1, 0);
            break;
        case UIImageOrientationLeft:
            t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.width - cropOffX, clipVideoTrack.naturalSize.height - cropOffY);
            t2 = CGAffineTransformRotate(t1, M_PI);
            break;
        default:
            NSLog(@"no supported orientation has been found in this video");
            break;
    }

    CGAffineTransform finalTransform = t2;
    [transformer setTransform:finalTransform atTime:kCMTimeZero];

    //add the transformer layer instructions, then add to video composition
    instruction.layerInstructions = [NSArray arrayWithObject:transformer];
    videoComposition.instructions = [NSArray arrayWithObject:instruction];

    //remove any previous video at that path
    [[NSFileManager defaultManager] removeItemAtURL:outputUrl error:nil];

    if (!exporter) {
        exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
    }

    //assign all instructions for the video processing (in this case the transformation for cropping the video)
    exporter.videoComposition = videoComposition;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;

    if (outputUrl) {
        exporter.outputURL = outputUrl;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            switch ([exporter status]) {
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"crop Export failed: %@", [[exporter error] localizedDescription]);
                    if (completion) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            completion(NO, [exporter error], nil);
                        });
                        return;
                    }
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"crop Export canceled");
                    if (completion) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            completion(NO, nil, nil);
                        });
                        return;
                    }
                    break;
                default:
                    break;
            }

            if (completion) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    completion(YES, nil, outputUrl);
                });
            }
        }];
    }

    return exporter;
}

Then I call the crop and play the result like this:

AVAsset *asset = [AVAsset assetWithURL:self.videoURL];
NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *exportPath = [documentsPath stringByAppendingFormat:@"/croppedvideo.mp4"];
NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];

AVAssetExportSession *exporter = [AVAssetExportSession exportSessionWithAsset:asset presetName:AVAssetExportPresetLowQuality];

[self applyCropToVideoWithAsset:asset
                         AtRect:CGRectMake(self.view.frame.size.width/2 - 57.5 - 5, self.view.frame.size.height/2 - 140, 115, 85)
                    OnTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(asset.duration.value, 1))
                    ExportToUrl:exportUrl
          ExistingExportSession:exporter
                 WithCompletion:^(BOOL success, NSError *error, NSURL *videoUrl) {
    AVPlayer *player = [AVPlayer playerWithURL:videoUrl];
    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = CGRectMake(125, 365, 115, 115);

    UIView *view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 400, 400)];
    [view.layer addSublayer:layer];
    [self.view addSubview:view];

    [player play];
}];

If you want to test this, the code just needs to be copied and pasted; point it at a video and you will see what I'm talking about.

Thanks for taking the time to help me, I know this is quite a bit of code.

Either the iOS encoder or the video format itself has a width requirement. Try making your width even, or divisible by 4.

I'm not aware of a similar requirement for the height, but it's worth trying as well.

I've never found it documented, but requiring even dimensions makes some sense, because H.264 uses a 4:2:0 YUV color space in which the UV planes are half the size of the Y plane (in both dimensions), and the Y plane has the overall dimensions of the video. If those dimensions aren't even, the UV dimensions won't be whole numbers.

P.S. The hint here is the mysterious green; I believe it corresponds to 0,0,0 in YUV.
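To make that concrete: the crop rect in the question is 115 × 85, so the half-resolution chroma planes would have to be 57.5 × 42.5 samples; presumably the encoder pads the frame out to even dimensions instead, and that padding is what shows up as the green edge. One way to apply this advice inside the question's applyCropToVideoWithAsset: method is to round the render size down to even values before assigning it. This is only a sketch, and the evenRenderSize helper is a name I made up, not part of the original code:

#import <CoreGraphics/CoreGraphics.h>
#include <math.h>

// Hypothetical helper: round a size down to even pixel dimensions so the
// encoder's half-resolution chroma planes come out as whole numbers.
static CGSize evenRenderSize(CGSize size) {
    return CGSizeMake(floor(size.width  / 2.0) * 2.0,
                      floor(size.height / 2.0) * 2.0);
}

With that in place, videoComposition.renderSize = evenRenderSize(CGSizeMake(cropWidth, cropHeight)); would turn the 115 × 85 crop into a 114 × 84 render size, which satisfies the even-width/height suggestion above.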

The answer from @节奏 saved my day.

In my app I needed square videos sized to the screen width. For the iPhone 5 that is 320 pixels, and for the iPhone 6 it becomes 375 pixels.

So I ran into the same green-line problem at the iPhone 6 resolution, because its screen width of 375 pixels is not divisible by 2 or 4.

To get rid of it, we made these changes:

AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
MainInstruction.timeRange = range;
MainCompositionInst.frameDuration = VideoFrameDuration; //Constants
MainCompositionInst.renderScale = VideoRenderScale;     //Constants

if ((int)SCREEN_WIDTH % 2 == 0)
    MainCompositionInst.renderSize = CGSizeMake(SCREEN_WIDTH, SCREEN_WIDTH);
else // This does the trick
    MainCompositionInst.renderSize = CGSizeMake(SCREEN_WIDTH + 1, SCREEN_WIDTH + 1);

Hope this helps. You just need to add one pixel so that the size becomes divisible by 2 or 4.

Thanks
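A small variation on the same fix (my own rewrite, not code from the answer above) is to round the width up to the next even value instead of special-casing odd widths; it covers both the 320 and 375 cases:

// Round the screen width up to the next even pixel count:
// 375 -> 376, while an already-even 320 stays 320.
CGFloat side = ceil(SCREEN_WIDTH / 2.0) * 2.0;
MainCompositionInst.renderSize = CGSizeMake(side, side);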