OpenGL ES to video in iOS (rendering to a texture with the iOS 5 texture cache)

Do you know Apple's CameraRipple sample code? Well, I'm trying to record the camera output to a file after OpenGL has done all its cool water effects.

I did this with glReadPixels, where I read all the pixels into a void * buffer, create a CVPixelBufferRef, and append it to an AVAssetWriterInputPixelBufferAdaptor, but it's too slow because glReadPixels takes ages. For reference, that slow path looked roughly like the following (a minimal sketch only, assuming the FBO to read from is bound and that pixelAdapter, currentTime and frameLength are set up as in the snippets below):
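    // Rough sketch of the slow glReadPixels path; readbackBuffer is a
    // hypothetical name, not from the real code.
    int w = (int)_screenWidth, h = (int)_screenHeight;
    CVPixelBufferRef readbackBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, w, h,
                        kCVPixelFormatType_32BGRA, NULL, &readbackBuffer);
    CVPixelBufferLockBaseAddress(readbackBuffer, 0);
    // GL_BGRA readback is an Apple extension (GL_RGBA is the portable fallback);
    // this also assumes CVPixelBufferGetBytesPerRow(readbackBuffer) == w * 4.
    glReadPixels(0, 0, w, h, GL_BGRA, GL_UNSIGNED_BYTE,
                 CVPixelBufferGetBaseAddress(readbackBuffer));
    if ([pixelAdapter appendPixelBuffer:readbackBuffer withPresentationTime:currentTime]) {
        currentTime = CMTimeAdd(currentTime, frameLength);
    }
    CVPixelBufferUnlockBaseAddress(readbackBuffer, 0);
    CVPixelBufferRelease(readbackBuffer);

I found out that using an FBO and a texture cache you can do the same thing, only faster. Here is my code in the drawInRect method that Apple uses: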

    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                                (__bridge void *)_context,
                                                NULL, &coreVideoTextureCashe);
    if (err) {
        NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
    }

    CFDictionaryRef empty; // empty value for attr value.
    CFMutableDictionaryRef attrs2;
    empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                               NULL, NULL, 0,
                               &kCFTypeDictionaryKeyCallBacks,
                               &kCFTypeDictionaryValueCallBacks);
    attrs2 = CFDictionaryCreateMutable(kCFAllocatorDefault, 1,
                                       &kCFTypeDictionaryKeyCallBacks,
                                       &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(attrs2, kCVPixelBufferIOSurfacePropertiesKey, empty);

    //CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget);
    CVPixelBufferRef pixiel_bufer4e = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault,
                        (int)_screenWidth, (int)_screenHeight,
                        kCVPixelFormatType_32BGRA, attrs2, &pixiel_bufer4e);

    CVOpenGLESTextureRef renderTexture;
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                 coreVideoTextureCashe,
                                                 pixiel_bufer4e,
                                                 NULL, // texture attributes
                                                 GL_TEXTURE_2D,
                                                 GL_RGBA, // opengl format
                                                 (int)_screenWidth,
                                                 (int)_screenHeight,
                                                 GL_BGRA, // native iOS format
                                                 GL_UNSIGNED_BYTE,
                                                 0,
                                                 &renderTexture);
    CFRelease(attrs2);
    CFRelease(empty);

    glBindTexture(CVOpenGLESTextureGetTarget(renderTexture),
                  CVOpenGLESTextureGetName(renderTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                           CVOpenGLESTextureGetName(renderTexture), 0);

    CVPixelBufferLockBaseAddress(pixiel_bufer4e, 0);
    if ([pixelAdapter appendPixelBuffer:pixiel_bufer4e withPresentationTime:currentTime]) {
        float result = currentTime.value;
        NSLog(@"current time: %f", result);
        currentTime = CMTimeAdd(currentTime, frameLength);
    }
    CVPixelBufferUnlockBaseAddress(pixiel_bufer4e, 0);

    CVPixelBufferRelease(pixiel_bufer4e);
    CFRelease(renderTexture);
    CFRelease(coreVideoTextureCashe);

It records video quickly, but the video is just black. I think the texture cache ref is wrong, or that I'm filling it incorrectly.

As an update, here is another way I tried. I must be missing something. In viewDidLoad, after I set up the OpenGL context, I do this:

    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                                (__bridge void *)_context,
                                                NULL, &coreVideoTextureCashe);
    if (err) {
        NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
    }

    // creates the pixel buffer
    pixel_buffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, [pixelAdapter pixelBufferPool], &pixel_buffer);

    CVOpenGLESTextureRef renderTexture;
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                 coreVideoTextureCashe,
                                                 pixel_buffer,
                                                 NULL, // texture attributes
                                                 GL_TEXTURE_2D,
                                                 GL_RGBA, // opengl format
                                                 (int)screenWidth,
                                                 (int)screenHeight,
                                                 GL_BGRA, // native iOS format
                                                 GL_UNSIGNED_BYTE,
                                                 0,
                                                 &renderTexture);

    glBindTexture(CVOpenGLESTextureGetTarget(renderTexture),
                  CVOpenGLESTextureGetName(renderTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                           CVOpenGLESTextureGetName(renderTexture), 0);

Then in drawInRect I do this:

    if (isRecording && writerInput.readyForMoreMediaData) {
        CVPixelBufferLockBaseAddress(pixel_buffer, 0);
        if ([pixelAdapter appendPixelBuffer:pixel_buffer withPresentationTime:currentTime]) {
            currentTime = CMTimeAdd(currentTime, frameLength);
        }
        CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);
        CVPixelBufferRelease(pixel_buffer);
    }

However, it crashes with bad_access on renderTexture, which is not nil but 0x000000001.

UPDATE

With the code below I actually managed to pull out the video file, but it has some green and red flashes. I use the BGRA pixelFormatType.

Here is where I create the texture cache:

    CVReturn err2 = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                                 (__bridge void *)_context,
                                                 NULL, &coreVideoTextureCashe);
    if (err2) {
        NSLog(@"Error at CVOpenGLESTextureCacheCreate %d", err2);
        return;
    }

Then in drawInRect I call this:

    if (isRecording && writerInput.readyForMoreMediaData) {
        [self cleanUpTextures];

        CFDictionaryRef empty; // empty value for attr value.
        CFMutableDictionaryRef attrs2;
        empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                                   NULL, NULL, 0,
                                   &kCFTypeDictionaryKeyCallBacks,
                                   &kCFTypeDictionaryValueCallBacks);
        attrs2 = CFDictionaryCreateMutable(kCFAllocatorDefault, 1,
                                           &kCFTypeDictionaryKeyCallBacks,
                                           &kCFTypeDictionaryValueCallBacks);
        CFDictionarySetValue(attrs2, kCVPixelBufferIOSurfacePropertiesKey, empty);

        //CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget);
        CVPixelBufferRef pixiel_bufer4e = NULL;
        CVPixelBufferCreate(kCFAllocatorDefault,
                            (int)_screenWidth, (int)_screenHeight,
                            kCVPixelFormatType_32BGRA, attrs2, &pixiel_bufer4e);

        CVOpenGLESTextureRef renderTexture;
        CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                     coreVideoTextureCashe,
                                                     pixiel_bufer4e,
                                                     NULL, // texture attributes
                                                     GL_TEXTURE_2D,
                                                     GL_RGBA, // opengl format
                                                     (int)_screenWidth,
                                                     (int)_screenHeight,
                                                     GL_BGRA, // native iOS format
                                                     GL_UNSIGNED_BYTE,
                                                     0,
                                                     &renderTexture);
        CFRelease(attrs2);
        CFRelease(empty);

        glBindTexture(CVOpenGLESTextureGetTarget(renderTexture),
                      CVOpenGLESTextureGetName(renderTexture));
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                               CVOpenGLESTextureGetName(renderTexture), 0);

        CVPixelBufferLockBaseAddress(pixiel_bufer4e, 0);
        if ([pixelAdapter appendPixelBuffer:pixiel_bufer4e withPresentationTime:currentTime]) {
            float result = currentTime.value;
            NSLog(@"current time: %f", result);
            currentTime = CMTimeAdd(currentTime, frameLength);
        }
        CVPixelBufferUnlockBaseAddress(pixiel_bufer4e, 0);

        CVPixelBufferRelease(pixiel_bufer4e);
        CFRelease(renderTexture);
        // CFRelease(coreVideoTextureCashe);
    }

I know I could optimize this by not doing all of these things here every frame, but I just wanted to get it working first. In cleanUpTextures I flush the texture cache with:

  CVOpenGLESTextureCacheFlush(coreVideoTextureCashe, 0); 
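For reference, a fuller cleanUpTextures in the spirit of Apple's sample code would also release any texture ref still held from the previous frame before flushing. A sketch, where _lastRenderTexture is a hypothetical ivar holding the previous frame's CVOpenGLESTextureRef:

    - (void)cleanUpTextures
    {
        // _lastRenderTexture is hypothetical; the code above instead
        // CFReleases renderTexture inline at the end of each frame.
        if (_lastRenderTexture) {
            CFRelease(_lastRenderTexture);
            _lastRenderTexture = NULL;
        }
        // Flush so recycled pixel buffers don't serve stale textures.
        CVOpenGLESTextureCacheFlush(coreVideoTextureCashe, 0);
    }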

Something is probably wrong with the RGBA stuff, or I don't know what, but it seems it's still getting the wrong kind of cache.

For recording the video, this isn't the approach I'd use. You're creating a new pixel buffer for every rendered frame, which will be slow, and you never release it, so it's no surprise you're getting memory warnings.

Instead, do what I describe in this answer. I create a pixel buffer for the cached texture once, assign that texture to the FBO I'm rendering to, and then append that pixel buffer using the AVAssetWriter's pixel buffer input on every frame; a single pixel buffer is far faster to use than recreating one each frame. You also want to leave the pixel buffer associated with your FBO's texture target, rather than associating it on every frame. The pattern is sketched below.
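Roughly, it looks like this (a sketch only, reusing names from the snippets above; error handling is omitted, and note that the adaptor's pixelBufferPool is nil until the writer session has started):

    // One-time setup, after the AVAssetWriter session has started:
    CVPixelBufferRef renderTarget = NULL;
    CVOpenGLESTextureRef renderTexture = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, [pixelAdapter pixelBufferPool], &renderTarget);
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                 coreVideoTextureCashe,
                                                 renderTarget,
                                                 NULL,
                                                 GL_TEXTURE_2D,
                                                 GL_RGBA,
                                                 (int)_screenWidth,
                                                 (int)_screenHeight,
                                                 GL_BGRA,
                                                 GL_UNSIGNED_BYTE,
                                                 0,
                                                 &renderTexture);
    glBindTexture(CVOpenGLESTextureGetTarget(renderTexture),
                  CVOpenGLESTextureGetName(renderTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                           CVOpenGLESTextureGetName(renderTexture), 0);

    // Per frame, after rendering into that FBO: append the SAME buffer.
    if (isRecording && writerInput.readyForMoreMediaData) {
        glFinish(); // make sure the GPU has finished rendering the frame
        CVPixelBufferLockBaseAddress(renderTarget, 0);
        if ([pixelAdapter appendPixelBuffer:renderTarget withPresentationTime:currentTime]) {
            currentTime = CMTimeAdd(currentTime, frameLength);
        }
        CVPixelBufferUnlockBaseAddress(renderTarget, 0);
        // No CVPixelBufferRelease here; the buffer lives for the whole recording.
    }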

I encapsulate this recording in the GPUImageMovieWriter class within my open source GPUImage framework, if you want to see how this works in practice. As I indicated in the answer above, recording this way leads to extremely fast encoding.
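For illustration, driving GPUImageMovieWriter looks roughly like this (a sketch based on the GPUImage sample code; movieURL, the output size, and the filter chain are placeholders):

    // Sketch based on GPUImage's examples; movieURL and filter are placeholders.
    GPUImageMovieWriter *movieWriter =
        [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL
                                                 size:CGSizeMake(480.0, 640.0)];
    [filter addTarget:movieWriter]; // record whatever the filter outputs
    [movieWriter startRecording];
    // ... render / process frames ...
    [movieWriter finishRecording];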
