GPUImage's GPUImageOpacityFilter not behaving as expected, doesn't change the alpha channel

I am trying to do an Overlay Blend of a stock image with the output of the camera, where the stock image has less than 100% opacity. I figured I could place a GPUImageOpacityFilter in the filter stack and everything would be fine:

  1. GPUImageVideoCamera -> MY_GPUImageOverlayBlendFilter
  2. GPUImagePicture -> GPUImageOpacityFilter (opacity 0.1f) -> MY_GPUImageOverlayBlendFilter
  3. MY_GPUImageOverlayBlendFilter -> GPUImageView

However, the result was not a 0.1f-alpha version of the GPUImagePicture blended into the GPUImageVideoCamera output; instead it looked as if the GPUImagePicture's colors/contrast had been softened and then blended. So I did some searching and tried using imageFromCurrentlyProcessedOutput to get a UIImage out of the GPUImageOpacityFilter and send that into the blend filter:

  1. GPUImagePicture -> MY_GPUImageOpacityFilter (opacity 0.1f)
  2. [MY_GPUImageOpacityFilter imageFromCurrentlyProcessedOutput] -> MY_alphaedImage
  3. GPUImagePicture (MY_alphaedImage) -> MY_GPUImageOverlayBlendFilter
  4. GPUImageVideoCamera -> MY_GPUImageOverlayBlendFilter
  5. MY_GPUImageOverlayBlendFilter -> GPUImageView

This works exactly as I expect. So why do I have to go through imageFromCurrentlyProcessedOutput? Shouldn't the same thing just happen in-line? Here are code snippets for the two scenarios above:

First:

    //Create the GPUImagePicture
    UIImage *image = [UIImage imageNamed:@"someFile"];
    GPUImagePicture *textureImage = [[[GPUImagePicture alloc] initWithImage:image] autorelease];

    //Create the opacity filter with 0.5 opacity
    GPUImageOpacityFilter *opacityFilter = [[[GPUImageOpacityFilter alloc] init] autorelease];
    opacityFilter.opacity = 0.5f;
    [textureImage addTarget:opacityFilter];

    //Create the blend filter
    GPUImageFilter *blendFilter = [[[GPUImageOverlayBlendFilter alloc] init] autorelease];

    //Point the camera device's output at the blend filter
    [self._videoCameraDevice addTarget:blendFilter];

    //Point the opacity filter's output at the blend filter
    [opacityFilter addTarget:blendFilter];
    [textureImage processImage];

    //Point the output of the blend filter at our preview view
    GPUImageView *filterView = (GPUImageView *)self.previewImageView;
    [blendFilter addTarget:filterView];
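As a sanity check on this first pipeline, one can sample a pixel of the opacity filter's output and inspect the alpha byte directly. This is only a diagnostic sketch, not part of my pipeline; it assumes processImage has already run and an RGBA8888 bitmap layout:

    //Diagnostic sketch (assumptions: processImage already ran, RGBA8888 layout):
    //draw the filter's output into a 1x1 bitmap and read back the alpha byte.
    UIImage *check = [opacityFilter imageFromCurrentlyProcessedOutput];
    unsigned char pixel[4] = {0, 0, 0, 0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, 1.0f, 1.0f), check.CGImage);
    NSLog(@"alpha byte of sampled output: %u", pixel[3]); //expect roughly 128 for opacity 0.5
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);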

Second:

    //Create the GPUImagePicture
    UIImage *image = [UIImage imageNamed:@"someFile"];
    GPUImagePicture *textureImage = [[[GPUImagePicture alloc] initWithImage:image] autorelease];

    //Create the opacity filter with 0.5 opacity
    GPUImageOpacityFilter *opacityFilter = [[[GPUImageOpacityFilter alloc] init] autorelease];
    opacityFilter.opacity = 0.5f;
    [textureImage addTarget:opacityFilter];

    //Process the image so we get a UIImage with 0.5 opacity of the original
    [textureImage processImage];
    UIImage *processedImage = [opacityFilter imageFromCurrentlyProcessedOutput];
    GPUImagePicture *processedTextureImage = [[[GPUImagePicture alloc] initWithImage:processedImage] autorelease];

    //Create the blend filter
    GPUImageFilter *blendFilter = [[[GPUImageOverlayBlendFilter alloc] init] autorelease];

    //Point the camera device's output at the blend filter
    [self._videoCameraDevice addTarget:blendFilter];

    //Point the processed picture's output at the blend filter
    [processedTextureImage addTarget:blendFilter];
    [processedTextureImage processImage];

    //Point the output of the blend filter at our preview view
    GPUImageView *filterView = (GPUImageView *)self.previewImageView;
    [blendFilter addTarget:filterView];
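For what it's worth, GPUImage also ships a GPUImageAlphaBlendFilter whose mix property fades the second input in and out. That sidesteps the Overlay blend mode entirely, so it is not equivalent to what I am after, but here is a minimal sketch of that route in case it helps frame the question (the 0.1f mix mirrors the opacity I wanted):

    //Alternative sketch: fade the picture over the camera feed using
    //GPUImageAlphaBlendFilter's mix property, with no opacity filter involved.
    UIImage *image = [UIImage imageNamed:@"someFile"];
    GPUImagePicture *textureImage = [[[GPUImagePicture alloc] initWithImage:image] autorelease];

    GPUImageAlphaBlendFilter *alphaBlend = [[[GPUImageAlphaBlendFilter alloc] init] autorelease];
    alphaBlend.mix = 0.1f; //0.0 shows only the first input, 1.0 fully mixes in the second

    //First input: camera feed; second input: the still picture
    [self._videoCameraDevice addTarget:alphaBlend];
    [textureImage addTarget:alphaBlend];
    [textureImage processImage];

    GPUImageView *filterView = (GPUImageView *)self.previewImageView;
    [alphaBlend addTarget:filterView];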