Converting a UIImage to a CVImageBufferRef

This code mostly works, but the resulting data seems to lose a color channel (or so I think), because the resulting image data, when displayed, is tinted blue!

Here is the code:

    UIImage *myImage = [UIImage imageNamed:@"sample1.png"];
    CGImageRef imageRef = [myImage CGImage];
    CVImageBufferRef pixelBuffer = [self pixelBufferFromCGImage:imageRef];

The method pixelBufferFromCGImage was grabbed from another Stack Overflow post, here: How do I export a UIImage array as a movie? (although that application is unrelated to what I'm trying to do):

    + (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
    {
        CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
        NSDictionary *options = @{
            (__bridge NSString *)kCVPixelBufferCGImageCompatibilityKey: @(NO),
            (__bridge NSString *)kCVPixelBufferCGBitmapContextCompatibilityKey: @(NO)
        };
        CVPixelBufferRef pixelBuffer;
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                              frameSize.width,
                                              frameSize.height,
                                              kCVPixelFormatType_32ARGB,
                                              (__bridge CFDictionaryRef)options,
                                              &pixelBuffer);
        if (status != kCVReturnSuccess) {
            return NULL;
        }

        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        void *data = CVPixelBufferGetBaseAddress(pixelBuffer);
        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(data,
                                                     frameSize.width,
                                                     frameSize.height,
                                                     8,
                                                     CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                     rgbColorSpace,
                                                     (CGBitmapInfo)kCGImageAlphaNoneSkipLast);
        CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);

        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

        return pixelBuffer;
    }

I think it has something to do with the relationship between kCVPixelFormatType_32ARGB and kCGImageAlphaNoneSkipLast, although I've tried every combination and gotten either the same result or an application crash. Once again, this does get the UIImage data into the CVImageBufferRef, but when I display the image on screen it appears to be missing a color channel and shows up tinted blue. The image is a PNG.
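For what it's worth, the pixel format and the bitmap info do have to agree byte-for-byte: kCVPixelFormatType_32ARGB puts the alpha/padding byte first in memory, so the matching CGBitmapInfo is kCGImageAlphaNoneSkipFirst, while kCGImageAlphaNoneSkipLast writes RGBX, and anything that then reads the buffer as ARGB sees the channels shifted. A hedged sketch of a consistent pairing (width and height are placeholders; only the format/info arguments differ from the method above):

    // Sketch only: channel order kept consistent with kCVPixelFormatType_32ARGB.
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32ARGB, NULL, &pixelBuffer);

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                             width, height, 8,
                                             CVPixelBufferGetBytesPerRow(pixelBuffer),
                                             rgb,
                                             // X-R-G-B in memory: alpha byte is skipped *first*,
                                             // which is what an ARGB consumer expects.
                                             (CGBitmapInfo)kCGImageAlphaNoneSkipFirst);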

It does sound like it could be that relationship. Could it be a JPG, which is RGB, rather than the indexed color you get with a PNG?

The solution is that this code works perfectly as intended. :) The problem was in using the data to create an OpenGL texture, which was completely unrelated to this code. For anyone searching for how to convert a UIImage to a CVImageBufferRef, your answer is in the code above!
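For anyone who hits the same blue tint in the OpenGL path: a common cause is uploading BGRA bytes while telling GL they are RGBA, which swaps the red and blue channels. A hypothetical sketch (not the original poster's code; on iOS, GL_BGRA is exposed through the GL_APPLE_texture_format_BGRA8888 extension):

    // Hypothetical upload of a locked kCVPixelFormatType_32BGRA buffer as a texture.
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
                 (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
                 0,
                 GL_BGRA,          // source channel order; passing GL_RGBA here swaps red and blue
                 GL_UNSIGNED_BYTE,
                 CVPixelBufferGetBaseAddress(pixelBuffer));
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);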

If anyone is still looking for a solution to this problem, I solved it by switching the BOOLs in the pixelBuffer's options:

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:NO], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:NO], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

from NO to YES:

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];

I ran into the same problem and found some samples here: http://www.cakesolutions.net/teamblogs/2014/03/08/cmsamplebufferref-from-cgimageref
Try changing

    CGBitmapInfo bitmapInfo = (CGBitmapInfo)(kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

This is what really works:

    + (CVPixelBufferRef)pixelBufferFromImage:(CGImageRef)image
    {
        // Not sure why this is even necessary; using CGImageGetWidth/Height directly
        // in the CVPixelBufferCreate/CGBitmapContextCreate calls seems to work fine too.
        CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));

        CVPixelBufferRef pixelBuffer = NULL;
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                              frameSize.width,
                                              frameSize.height,
                                              kCVPixelFormatType_32BGRA,
                                              nil,
                                              &pixelBuffer);
        if (status != kCVReturnSuccess) {
            return NULL;
        }

        CVPixelBufferLockBaseAddress(pixelBuffer, 0);
        void *data = CVPixelBufferGetBaseAddress(pixelBuffer);
        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(data,
                                                     frameSize.width,
                                                     frameSize.height,
                                                     8,
                                                     CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                     rgbColorSpace,
                                                     (CGBitmapInfo)(kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst));
        CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);

        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);
        CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

        return pixelBuffer;
    }

You can convert the pixel buffer back into a UIImage (and then display or save it) to confirm that this method works:

    + (UIImage *)imageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer
    {
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef myImage = [context createCGImage:ciImage
                                           fromRect:CGRectMake(0, 0,
                                                               CVPixelBufferGetWidth(pixelBuffer),
                                                               CVPixelBufferGetHeight(pixelBuffer))];
        UIImage *image = [UIImage imageWithCGImage:myImage];
        CGImageRelease(myImage); // createCGImage follows the Create rule, so release it here

        // Uncomment the following lines to save the image to your application's Documents directory:
        //NSString *imageSavePath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"myImageFromPixelBuffer.png"]];
        //[UIImagePNGRepresentation(image) writeToFile:imageSavePath atomically:YES];

        return image;
    }

Just to clarify the answer above: I ran into the same issue because my shader code expected two planar samples within the image buffer, while I was using a single-plane buffer.

This line takes the RGB values from a single sample and passes them on (to what, I'm not sure), and the end result is a full-color image.

  gl_FragColor = vec4(texture2D(SamplerY, texCoordVarying).rgb, 1);
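For comparison, with a bi-planar buffer (kCVPixelFormatType_420YpCbCr8BiPlanar*) the fragment shader has to sample both planes and convert to RGB itself. A sketch along the usual lines; the SamplerUV name and the video-range BT.601 matrix are assumptions, not the original code:

    precision mediump float;
    varying vec2 texCoordVarying;
    uniform sampler2D SamplerY;   // luma plane
    uniform sampler2D SamplerUV;  // interleaved chroma plane (assumed name)

    void main()
    {
        // Video-range BT.601 YUV -> RGB with the conventional coefficients.
        vec3 yuv;
        yuv.x  = texture2D(SamplerY, texCoordVarying).r - (16.0 / 255.0);
        yuv.yz = texture2D(SamplerUV, texCoordVarying).rg - vec2(0.5, 0.5);
        vec3 rgb = mat3(1.164,  1.164, 1.164,
                        0.0,   -0.392, 2.017,
                        1.596, -0.813, 0.0) * yuv;
        gl_FragColor = vec4(rgb, 1.0);
    }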