CVOpenGLESTextureCacheCreateTextureFromImage returns error -6683

I am currently trying to draw an image in OpenGL using the bi-planar YUV420 format. I receive raw data and try to parse it into a CVPixelBuffer, then pass that buffer to CVOpenGLESTextureCacheCreateTextureFromImage. While I get no errors when parsing into the CVPixelBuffer, I receive an error (-6683, kCVReturnPixelBufferNotOpenGLCompatible) when passing it to CVOpenGLESTextureCacheCreateTextureFromImage. I am following Apple's GLCameraRipple sample code as closely as I can, except that I am using raw image data rather than data from the camera.

Hopefully someone can explain what I am missing here; I suspect it is a missing attribute…

FYI, plane 0 is the Y plane and plane 1 is the UV plane, where the UV plane should be half the width and half the height of the Y plane.
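To make that layout concrete, here is a minimal C sketch of the plane geometry for a bi-planar 4:2:0 (NV12-style) image. The `PlaneInfo` struct and `plane_info` helper are illustrative names, not a CoreVideo API; the point is that the UV plane is half the width and height in samples, but because Cb and Cr are interleaved (2 bytes per column pair) a UV row still occupies the full luma width in bytes.

```c
#include <stddef.h>

/* Hypothetical helper: geometry of one plane of a bi-planar 4:2:0 image.
 * Plane 0 (Y) is full resolution at 1 byte per sample; plane 1 is the
 * interleaved CbCr plane at half resolution, 2 bytes per sample pair. */
typedef struct {
    size_t width;          /* plane width in samples */
    size_t height;         /* plane height in rows */
    size_t minBytesPerRow; /* minimum stride; the real stride may be padded */
} PlaneInfo;

static PlaneInfo plane_info(size_t imgWidth, size_t imgHeight, int plane) {
    PlaneInfo p;
    if (plane == 0) {                   /* Y plane */
        p.width = imgWidth;
        p.height = imgHeight;
        p.minBytesPerRow = imgWidth;    /* 1 byte per luma sample */
    } else {                            /* interleaved UV plane */
        p.width = imgWidth / 2;
        p.height = imgHeight / 2;
        p.minBytesPerRow = p.width * 2; /* Cb+Cr bytes per column pair */
    }
    return p;
}
```

For a 1280x720 image this gives a 1280x720 Y plane and a 640x360 UV plane, both with a minimum row length of 1280 bytes.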

    size_t numPlanes = image->GetNumPlanes();
    size_t planeWidth[numPlanes];
    size_t planeHeight[numPlanes];
    size_t scanWidth[numPlanes];
    void *planeIndex[numPlanes];

    for (int i = 0; i < numPlanes; i++) {
        planeWidth[i]  = (i < 1) ? image->GetWidth()  : image->GetWidth() / 2;
        planeHeight[i] = (i < 1) ? image->GetHeight() : image->GetHeight() / 2;
        scanWidth[i]  = image->GetScanWidth(i);
        planeIndex[i] = image->GetPlanePointer(i);
    }

    CVPixelBufferRef pixelBuffer;
    CFDictionaryRef empty;
    CFMutableDictionaryRef attrs;
    empty = CFDictionaryCreate(kCFAllocatorDefault, NULL, NULL, 0,
                               &kCFTypeDictionaryKeyCallBacks,
                               &kCFTypeDictionaryValueCallBacks);
    attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 1,
                                      &kCFTypeDictionaryKeyCallBacks,
                                      &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);

    CVReturn cvError = CVPixelBufferCreateWithPlanarBytes(kCFAllocatorDefault,
                                                          image->GetWidth(), image->GetHeight(),
                                                          kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                                                          nil, nil,
                                                          numPlanes, planeIndex,
                                                          planeWidth, planeHeight, scanWidth,
                                                          nil, nil, attrs, &pixelBuffer);
    if (cvError)
        NSLog(@"Error at CVPixelBufferCreateWithPlanarBytes: %d", cvError);

    CVReturn err;
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    if (!_videoTextureCache) {
        NSLog(@"No video texture cache");
        return;
    }

    if (_bModel == nil || width != _textureWidth || height != _textureHeight) {
        _textureWidth = width;
        _textureHeight = height;
        _bModel = [[BufferModel alloc] initWithScreenWidth:_screenWidth
                                              screenHeight:_screenHeight
                                                meshFactor:_meshFactor
                                              textureWidth:_textureWidth
                                             textureHeight:_textureHeight];
        [self setupBuffers];
    }

    [self cleanUpTextures];

    // CVOpenGLESTextureCacheCreateTextureFromImage will create GLES texture
    // optimally from CVImageBufferRef.
    // Y-plane
    glActiveTexture(GL_TEXTURE0);
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache,
                                                       pixelBuffer, NULL,
                                                       GL_TEXTURE_2D, GL_RED_EXT,
                                                       _textureWidth, _textureHeight,
                                                       GL_RED_EXT, GL_UNSIGNED_BYTE,
                                                       0, &_lumaTexture);
    if (err) {
        NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
    }

Thanks to anyone who can help. I know there is a similar (though not identical) question, but it is quite old and never received any answers, so I am hoping for better luck with my case.

The iosurface property is null in the CVPixelBuffer you created.

Created manually:

<CVPixelBuffer 0x1fd52790 width=1280 height=720 pixelFormat=420v iosurface=0x0 planes=2>

Created by CMSampleBufferGetImageBuffer:

<CVPixelBuffer 0x1fd521e0 width=1280 height=720 pixelFormat=420f iosurface=0x21621c54 planes=2>

As far as I can tell, there is no way around this.

Use CVPixelBufferCreate if you intend to use the CVPixelBufferRef with OpenGL. It creates an iosurface for you, unlike the WithBytes alternatives. The downside is that you cannot reuse your existing buffers: you have to copy the data from your existing buffers into the newly allocated ones.

    // set pixel buffer attributes so we get an iosurface
    NSDictionary *pixelBufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
                                           [NSDictionary dictionary], kCVPixelBufferIOSurfacePropertiesKey,
                                           nil];

    // create planar pixel buffer
    CVPixelBufferRef pixelBuffer = nil;
    CVPixelBufferCreate(kCFAllocatorDefault, bufferYUV.width, bufferYUV.height,
                        kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
                        (CFDictionaryRef)pixelBufferAttributes, &pixelBuffer);

    // lock pixel buffer
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // get image details
    size_t width = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);

    // get plane addresses
    unsigned char *baseAddressY  = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    unsigned char *baseAddressUV = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);

    //TODO: copy your data buffers to the newly allocated memory locations

    // unlock pixel buffer address
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    // initialize buffers if not already initialized (see GLCameraRipple example)
    if (!_buffersInitialized) {
        [self initializeBuffersWithTextureWidth:width textureHeight:height];
    }

    // always clean up last textures
    CVReturn err;
    [self cleanUpTextures];

    // Y-plane
    glActiveTexture(GL_TEXTURE0);
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache,
                                                       pixelBuffer, NULL,
                                                       GL_TEXTURE_2D, GL_RED_EXT, width, height,
                                                       GL_RED_EXT, GL_UNSIGNED_BYTE,
                                                       0, &_lumaTexture);
    if (err) {
        NSLog(@"Could not create Y texture from image. %d", err);
    }

    glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    // UV-plane
    glActiveTexture(GL_TEXTURE1);
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache,
                                                       pixelBuffer, NULL,
                                                       GL_TEXTURE_2D, GL_RG_EXT, width / 2, height / 2,
                                                       GL_RG_EXT, GL_UNSIGNED_BYTE,
                                                       1, &_chromaTexture);
    if (err) {
        NSLog(@"Could not create UV texture from image. %d", err);
    }

    glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
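The copy step left as a TODO above has one pitfall worth spelling out: CVPixelBufferGetBytesPerRowOfPlane may report a destination stride larger than the visible width (row padding), so a single big memcpy of the whole plane can misalign every row. A minimal C sketch of a stride-aware per-plane copy, assuming a contiguous source buffer (`copy_plane` is an illustrative helper name, not a CoreVideo API):

```c
#include <stddef.h>
#include <string.h>

/* Copy one image plane row by row, honoring possibly different source and
 * destination strides. rowBytes is the number of meaningful bytes per row
 * (e.g. width for the Y plane; also width for the interleaved UV plane,
 * since width/2 CbCr pairs take 2 bytes each). */
static void copy_plane(unsigned char *dst, size_t dstBytesPerRow,
                       const unsigned char *src, size_t srcBytesPerRow,
                       size_t rowBytes, size_t rows) {
    for (size_t r = 0; r < rows; r++) {
        memcpy(dst + r * dstBytesPerRow, src + r * srcBytesPerRow, rowBytes);
    }
}
```

In the answer's code this would be called between the lock and the unlock, e.g. `copy_plane(baseAddressY, CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0), srcY, srcStrideY, width, height)` for the Y plane and the same with plane index 1 and `height / 2` rows for the UV plane (`srcY`/`srcStrideY` being whatever your raw-data source provides).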

I did not try the following approach with YUV, but it works for the RGB case:

https://developer.apple.com/library/ios/qa/qa1781/_index.html

If ARC is enabled, add __bridge before CFDictionaryRef.