Does GPUImageMovie not support an alpha channel?

I'm building a video effect with GPUImage:

    self.overlayerView = [[GPUImageView alloc] init];
    self.overlayerView.frame = self.view.frame;

    dispatch_queue_t queue = dispatch_queue_create("queue", NULL);
    dispatch_async(queue, ^{
        NSURL *sourceURL = [[NSBundle mainBundle] URLForResource:@"212121" withExtension:@"mp4"];
        GPUImageMovie *sourceMovie = [[GPUImageMovie alloc] initWithURL:sourceURL];
        sourceMovie.playAtActualSpeed = YES;
        sourceMovie.shouldRepeat = YES;
        sourceMovie.shouldIgnoreUpdatesToThisTarget = YES;

        NSURL *maskURL = [[NSBundle mainBundle] URLForResource:@"rose" withExtension:@"mp4"];
        GPUImageMovie *maskMovie = [[GPUImageMovie alloc] initWithURL:maskURL];
        maskMovie.playAtActualSpeed = YES;
        maskMovie.shouldRepeat = YES;

        NSURL *alphaURL = [[NSBundle mainBundle] URLForResource:@"rose_alpha" withExtension:@"mp4"];
        GPUImageMovie *alphaMovie = [[GPUImageMovie alloc] initWithURL:alphaURL];
        alphaMovie.playAtActualSpeed = YES;
        alphaMovie.shouldRepeat = YES;

        NSURL *topURL = [[NSBundle mainBundle] URLForResource:@"screen" withExtension:@"mp4"];
        GPUImageMovie *topMovie = [[GPUImageMovie alloc] initWithURL:topURL];
        topMovie.playAtActualSpeed = YES;
        topMovie.shouldRepeat = YES;

        // Composites the overlay over the background: video * (1 - alpha) + mv
        filter0 = [[GPUImageThreeInputFilter alloc] initWithFragmentShaderFromString:
            @"precision highp float;\n"
             "uniform sampler2D inputImageTexture;  // background video\n"
             "uniform sampler2D inputImageTexture2; // overlay (mv)\n"
             "uniform sampler2D inputImageTexture3; // alpha matte\n"
             "varying vec2 textureCoordinate;\n"
             "void main()\n"
             "{\n"
             "    vec4 video = texture2D(inputImageTexture, textureCoordinate);\n"
             "    vec4 mv    = texture2D(inputImageTexture2, textureCoordinate);\n"
             "    vec4 alpha = texture2D(inputImageTexture3, textureCoordinate);\n"
             "    gl_FragColor = video * (1.0 - alpha.r) + mv;\n"
             "}"];

        // Screen blend: white - (white - screen) * (white - video)
        filter1 = [[GPUImageTwoInputFilter alloc] initWithFragmentShaderFromString:
            @"precision highp float;\n"
             "uniform sampler2D inputImageTexture;  // video\n"
             "uniform sampler2D inputImageTexture2; // screen\n"
             "varying vec2 textureCoordinate;\n"
             "void main()\n"
             "{\n"
             "    vec4 video = texture2D(inputImageTexture, textureCoordinate);\n"
             "    vec4 screen = texture2D(inputImageTexture2, textureCoordinate);\n"
             "    mediump vec4 whiteColor = vec4(1.0);\n"
             "    gl_FragColor = whiteColor - ((whiteColor - screen) * (whiteColor - video));\n"
             "}"];

        [sourceMovie addTarget:filter0];
        [maskMovie addTarget:filter0];
        [alphaMovie addTarget:filter0];
        [filter0 addTarget:filter1];
        [topMovie addTarget:filter1];

        [sourceMovie startProcessing];
        [alphaMovie startProcessing];
        [maskMovie startProcessing];
        [topMovie startProcessing];

        [filter0 forceProcessingAtSize:CGSizeMake(480, 480)];
        [filter1 forceProcessingAtSize:CGSizeMake(480, 480)];

        dispatch_async(dispatch_get_main_queue(), ^{
            [filter1 addTarget:self.overlayerView];
        });
    });
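For reference, the three-input shader above computes `video * (1.0 - alpha.r) + mv`, which only composites correctly if the overlay movie (`rose.mp4`) is already premultiplied by the matte movie (`rose_alpha.mp4`). If the overlay is straight (unpremultiplied), a variant that multiplies the overlay by the matte explicitly would be needed. This is a sketch, not the original shader; the variable names match the code above:

    // Sketch: straight-alpha variant of filter0's fragment shader.
    // composite = overlay * matte + background * (1 - matte)
    NSString *straightAlphaShader =
        @"precision highp float;\n"
         "uniform sampler2D inputImageTexture;  // background video\n"
         "uniform sampler2D inputImageTexture2; // overlay (mv)\n"
         "uniform sampler2D inputImageTexture3; // alpha matte (white = show overlay)\n"
         "varying vec2 textureCoordinate;\n"
         "void main()\n"
         "{\n"
         "    vec4 video = texture2D(inputImageTexture, textureCoordinate);\n"
         "    vec4 mv    = texture2D(inputImageTexture2, textureCoordinate);\n"
         "    vec4 alpha = texture2D(inputImageTexture3, textureCoordinate);\n"
         "    gl_FragColor = mv * alpha.r + video * (1.0 - alpha.r);\n"
         "}";

With either formula, the overlay and matte movies must present the same frame at the same time; if they drift apart, the matte lets through black (or partially black) pixels from the wrong overlay frame.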

The code runs, and this is the video effect I get: (screenshot)

The video has a black background. Is this because alphaMovie doesn't play in sync with maskMovie?

This is what I want to create: (screenshot)

The desired effect video has no black background.

Questions:

1: How do I remove the black background?

2: Why does my effect video have a black background?

The GPUImage framework doesn't include support for alpha-channel functionality like this. It does have green-screen functionality, so if you prepare your footage against a green-screen background, you can key the subject out of that background. However, what you're describing here, an alpha-channel video plus a second video, won't work reliably, because you're pulling from two different video sources at the same time and they won't stay in sync. Note that even with the green-screen functionality there are edge problems, as described in this blog post (which includes source code): pixels near green, but not exactly green, can be handled oddly by the filter's threshold ramp. Another approach you might want to consider is pre-compositing the source videos into a single video before trying to play it back with the iOS video logic.
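If you go the green-screen route, GPUImage ships a `GPUImageChromaKeyBlendFilter` that keys one video and composites it over another in a single pass, avoiding the separate-matte synchronization problem. A minimal sketch, where the URLs are placeholders and the sensitivity/smoothing values are assumptions you would tune for your footage:

    // Sketch: composite a green-screen overlay onto a background video
    // with GPUImage's chroma-key blend filter.
    GPUImageMovie *greenScreenMovie = [[GPUImageMovie alloc] initWithURL:overlayURL];
    greenScreenMovie.playAtActualSpeed = YES;

    GPUImageMovie *backgroundMovie = [[GPUImageMovie alloc] initWithURL:backgroundURL];
    backgroundMovie.playAtActualSpeed = YES;

    GPUImageChromaKeyBlendFilter *chromaKey = [[GPUImageChromaKeyBlendFilter alloc] init];
    [chromaKey setColorToReplaceRed:0.0 green:1.0 blue:0.0]; // key out pure green
    chromaKey.thresholdSensitivity = 0.4; // how close to green counts as "key"; tune
    chromaKey.smoothing = 0.1;            // softens the keyed edge; tune

    // First input is keyed; the second shows through where the key color matches.
    [greenScreenMovie addTarget:chromaKey];
    [backgroundMovie addTarget:chromaKey];
    [chromaKey addTarget:gpuImageView];

    [backgroundMovie startProcessing];
    [greenScreenMovie startProcessing];

This still reads from two movie sources, so the synchronization caveat above applies; the edge artifacts from the threshold ramp are also why pre-compositing offline remains the most robust option.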