Image bubble effect in iOS

I have some images that I'd like to "put inside a bubble". These images would then float around the screen as bubbles.

The best approach seems to be to composite the inner image with a bubble image, distorting the inner image somehow so that it looks like it is being reflected inside the bubble.

Does anyone know how to achieve this effect without using textures and meshes? Perhaps someone remembers an old project or has done something similar?

Here's an example of what I mean:

(example image: a photo wrapped inside a floating glass bubble)

You can use the GPUImageSphereRefractionFilter from my open source GPUImage framework to do this:

(example image: sphere refraction applied to a photo)

I describe how this works in some detail in my answer about a similar effect on Android. Basically, I use a fragment shader to refract the light passing through an imaginary sphere, and then use the refracted ray to look up a texture containing the source image. The background is blurred using a simple Gaussian blur.

If you want to match the exact look of the image you showed, you may need to tweak this fragment shader to add some grazing-angle coloring to the sphere, but this should get you fairly close.

Just for fun, I decided to try to replicate the glass sphere above more closely. I added grazing-angle lighting and a specular lighting reflection on the sphere, and kept the refracted texture coordinates from being inverted, leading to this result:

(example image: a glass sphere with grazing-angle lighting)

I used the following fragment shader for this version:

    varying highp vec2 textureCoordinate;

    uniform sampler2D inputImageTexture;

    uniform highp vec2 center;
    uniform highp float radius;
    uniform highp float aspectRatio;
    uniform highp float refractiveIndex;
    // uniform vec3 lightPosition;
    const highp vec3 lightPosition = vec3(-0.5, 0.5, 1.0);
    const highp vec3 ambientLightPosition = vec3(0.0, 0.0, 1.0);

    void main()
    {
        highp vec2 textureCoordinateToUse = vec2(textureCoordinate.x, (textureCoordinate.y * aspectRatio + 0.5 - 0.5 * aspectRatio));
        highp float distanceFromCenter = distance(center, textureCoordinateToUse);
        lowp float checkForPresenceWithinSphere = step(distanceFromCenter, radius);

        distanceFromCenter = distanceFromCenter / radius;

        highp float normalizedDepth = radius * sqrt(1.0 - distanceFromCenter * distanceFromCenter);
        highp vec3 sphereNormal = normalize(vec3(textureCoordinateToUse - center, normalizedDepth));

        highp vec3 refractedVector = 2.0 * refract(vec3(0.0, 0.0, -1.0), sphereNormal, refractiveIndex);
        refractedVector.xy = -refractedVector.xy;

        highp vec3 finalSphereColor = texture2D(inputImageTexture, (refractedVector.xy + 1.0) * 0.5).rgb;

        // Grazing angle lighting
        highp float lightingIntensity = 2.5 * (1.0 - pow(clamp(dot(ambientLightPosition, sphereNormal), 0.0, 1.0), 0.25));
        finalSphereColor += lightingIntensity;

        // Specular lighting
        lightingIntensity = clamp(dot(normalize(lightPosition), sphereNormal), 0.0, 1.0);
        lightingIntensity = pow(lightingIntensity, 15.0);
        finalSphereColor += vec3(0.8, 0.8, 0.8) * lightingIntensity;

        gl_FragColor = vec4(finalSphereColor, 1.0) * checkForPresenceWithinSphere;
    }

This filter can be run using the GPUImageGlassSphereFilter.

For the record, I ended up using GPUImage as @BradLarson suggested, but I had to write a custom filter, shown below. This filter takes an "inside" image and a bubble texture and blends the two, performing the same refraction calculation but without inverting the image coordinates. The effect:

(example image: the resulting bubble effect)

.h

    @interface GPUImageBubbleFilter : GPUImageTwoInputFilter

    @property (readwrite, nonatomic) CGFloat refractiveIndex;
    @property (readwrite, nonatomic) CGFloat radius;

    @end

.m

    #import "GPUImageBubbleFilter.h"

    NSString *const kGPUImageBubbleShaderString = SHADER_STRING
    (
        varying highp vec2 textureCoordinate;
        varying highp vec2 textureCoordinate2;

        uniform sampler2D inputImageTexture;
        uniform sampler2D inputImageTexture2;

        uniform highp vec2 center;
        uniform highp float radius;
        uniform highp float aspectRatio;
        uniform highp float refractiveIndex;

        void main()
        {
            highp vec2 textureCoordinateToUse = vec2(textureCoordinate.x, (textureCoordinate.y * aspectRatio + 0.5 - 0.5 * aspectRatio));
            highp float distanceFromCenter = distance(center, textureCoordinateToUse);
            lowp float checkForPresenceWithinSphere = step(distanceFromCenter, radius);

            distanceFromCenter = distanceFromCenter / radius;

            highp float normalizedDepth = radius * sqrt(1.0 - distanceFromCenter * distanceFromCenter);
            highp vec3 sphereNormal = normalize(vec3(textureCoordinateToUse - center, normalizedDepth));

            highp vec3 refractedVector = refract(vec3(0.0, 0.0, -1.0), sphereNormal, refractiveIndex);

            lowp vec4 textureColor = texture2D(inputImageTexture, (refractedVector.xy + 1.0) * 0.5) * checkForPresenceWithinSphere;
            lowp vec4 textureColor2 = texture2D(inputImageTexture2, textureCoordinate2) * checkForPresenceWithinSphere;

            gl_FragColor = mix(textureColor, textureColor2, textureColor2.a);
        }
    );

    @interface GPUImageBubbleFilter ()
    {
        GLint radiusUniform, centerUniform, aspectRatioUniform, refractiveIndexUniform;
    }

    @property (readwrite, nonatomic) CGFloat aspectRatio;

    @end

    @implementation GPUImageBubbleFilter

    @synthesize radius = _radius, refractiveIndex = _refractiveIndex, aspectRatio = _aspectRatio;

    - (id)init
    {
        self = [super initWithFragmentShaderFromString:kGPUImageBubbleShaderString];
        if (self)
        {
            radiusUniform = [filterProgram uniformIndex:@"radius"];
            aspectRatioUniform = [filterProgram uniformIndex:@"aspectRatio"];
            centerUniform = [filterProgram uniformIndex:@"center"];
            refractiveIndexUniform = [filterProgram uniformIndex:@"refractiveIndex"];

            self.radius = 0.5;
            self.refractiveIndex = 0.5;
            self.aspectRatio = 1.0;

            GLfloat center[2] = {0.5, 0.5};
            [GPUImageOpenGLESContext useImageProcessingContext];
            [filterProgram use];
            glUniform2fv(centerUniform, 1, center);

            [self setBackgroundColorRed:0 green:0 blue:0 alpha:0];
        }
        return self;
    }

    #pragma mark - Accessors

    - (void)setRadius:(CGFloat)radius
    {
        _radius = radius;
        [GPUImageOpenGLESContext useImageProcessingContext];
        [filterProgram use];
        glUniform1f(radiusUniform, _radius);
    }

    - (void)setAspectRatio:(CGFloat)aspectRatio
    {
        _aspectRatio = aspectRatio;
        [GPUImageOpenGLESContext useImageProcessingContext];
        [filterProgram use];
        glUniform1f(aspectRatioUniform, _aspectRatio);
    }

    - (void)setRefractiveIndex:(CGFloat)newValue
    {
        _refractiveIndex = newValue;
        [GPUImageOpenGLESContext useImageProcessingContext];
        [filterProgram use];
        glUniform1f(refractiveIndexUniform, _refractiveIndex);
    }

    @end
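The final line of the bubble shader, `mix(textureColor, textureColor2, textureColor2.a)`, is what composites the bubble texture over the refracted "inside" image: a standard alpha blend driven by the bubble's own alpha channel. As a small C illustration (the function name is mine, not part of GPUImage):

```c
#include <math.h>

/* Blend the bubble texture over the refracted inside image using the
   bubble's alpha, matching GLSL's mix(inside, bubble, bubble.a):
   out = inside * (1 - bubble.a) + bubble * bubble.a, per channel. */
void blend_over(const float inside[4], const float bubble[4], float out[4]) {
    for (int i = 0; i < 4; i++)
        out[i] = inside[i] * (1.0f - bubble[3]) + bubble[i] * bubble[3];
}
```

Where the bubble texture is fully transparent the refracted image shows through untouched; where it is opaque (rims, highlights) the bubble artwork wins, which is what gives the composite its glassy edge.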