How to read RGB pixel data on the iPhone

I'd like to know how to scan an image on the iPhone and analyze the RGB values of each pixel, so that I can ultimately determine the average RGB value of the whole image. If someone could point me in the right direction it would be greatly appreciated. I'm new to image analysis and not sure where to start, or whether the iOS 5 API includes anything for this.

Just pasting what I use — here I'm detecting the color at the touched point.

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
        if (self.view.hidden == YES) {
            // Color wheel is hidden, so don't handle this as a color wheel event.
            [[self nextResponder] touchesEnded:touches withEvent:event];
            return;
        }
        UITouch *touch = [touches anyObject];
        CGPoint point = [touch locationInView:self.view]; // where the image was tapped
        UIColor *lastColor = [self getPixelColorAtLocation:point];
        NSLog(@"color %@", lastColor);

        UIImageView *lbl = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 100, 100)];
        lbl.layer.cornerRadius = 50;
        [imageView addSubview:lbl];
        lbl.backgroundColor = lastColor;
        lbl.center = CGPointMake(stillImageFilter.center.x * 320, (stillImageFilter.center.y * 320) - 125);
        NSLog(@"stillImageCenter = %f,%f", stillImageFilter.center.x, stillImageFilter.center.y);
    }

    - (UIColor *)getPixelColorAtLocation:(CGPoint)point {
        UIColor *color = nil;
        CGImageRef inImage = imageView.image.CGImage;
        CGContextRef cgctx = [self createARGBBitmapContextFromImage:inImage];
        if (cgctx == NULL) { return nil; /* error */ }

        size_t w = CGImageGetWidth(inImage);
        size_t h = CGImageGetHeight(inImage);
        CGRect rect = {{0, 0}, {w, h}};
        CGContextDrawImage(cgctx, rect, inImage);

        unsigned char *data = CGBitmapContextGetData(cgctx);
        if (data != NULL) {
            // 4 bytes per pixel, ARGB order (matches the bitmap context created below).
            int offset = 4 * ((w * round(point.y)) + round(point.x));
            int alpha = data[offset];
            int red   = data[offset + 1];
            int green = data[offset + 2];
            int blue  = data[offset + 3];
            NSLog(@"offset: %i colors: RGB A %i %i %i %i", offset, red, green, blue, alpha);
            color = [UIColor colorWithRed:(red / 255.0f)
                                    green:(green / 255.0f)
                                     blue:(blue / 255.0f)
                                    alpha:(alpha / 255.0f)];
        }
        CGContextRelease(cgctx);
        if (data) { free(data); }
        return color;
    }

    - (CGContextRef)createARGBBitmapContextFromImage:(CGImageRef)inImage {
        CGContextRef context = NULL;
        CGColorSpaceRef colorSpace;
        void *bitmapData;
        int bitmapByteCount;
        int bitmapBytesPerRow;

        size_t pixelsWide = CGImageGetWidth(inImage);
        size_t pixelsHigh = CGImageGetHeight(inImage);
        bitmapBytesPerRow = (pixelsWide * 4);
        bitmapByteCount = (bitmapBytesPerRow * pixelsHigh);

        colorSpace = CGColorSpaceCreateDeviceRGB();
        if (colorSpace == NULL) {
            fprintf(stderr, "Error allocating color space\n");
            return NULL;
        }

        bitmapData = malloc(bitmapByteCount);
        if (bitmapData == NULL) {
            fprintf(stderr, "Memory not allocated!");
            CGColorSpaceRelease(colorSpace);
            return NULL;
        }

        context = CGBitmapContextCreate(bitmapData,
                                        pixelsWide,
                                        pixelsHigh,
                                        8,                  // bits per component
                                        bitmapBytesPerRow,
                                        colorSpace,
                                        kCGImageAlphaPremultipliedFirst);
        if (context == NULL) {
            free(bitmapData);
            fprintf(stderr, "Context not created!");
        }
        CGColorSpaceRelease(colorSpace);
        return context;
    }
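Since the question asks for the average RGB of the whole image, here is a minimal sketch of that loop, assuming the same createARGBBitmapContextFromImage: helper above; the method name averageColorOfImage: is just illustrative:

    // Sketch: average RGB over every pixel of an image, using the ARGB context above.
    - (UIColor *)averageColorOfImage:(UIImage *)image {
        CGImageRef inImage = image.CGImage;
        CGContextRef cgctx = [self createARGBBitmapContextFromImage:inImage];
        if (cgctx == NULL) { return nil; }

        size_t w = CGImageGetWidth(inImage);
        size_t h = CGImageGetHeight(inImage);
        CGContextDrawImage(cgctx, CGRectMake(0, 0, w, h), inImage);

        unsigned char *data = CGBitmapContextGetData(cgctx);
        UIColor *average = nil;
        if (data != NULL) {
            // The context is ARGB with exactly 4 bytes per pixel and no row padding,
            // so the buffer can be walked as one flat run of pixels.
            // Alpha/premultiplication is ignored here for simplicity.
            unsigned long long rSum = 0, gSum = 0, bSum = 0;
            size_t pixelCount = w * h;
            for (size_t i = 0; i < pixelCount; i++) {
                size_t offset = 4 * i;
                rSum += data[offset + 1];
                gSum += data[offset + 2];
                bSum += data[offset + 3];
            }
            average = [UIColor colorWithRed:(rSum / (CGFloat)pixelCount) / 255.0f
                                      green:(gSum / (CGFloat)pixelCount) / 255.0f
                                       blue:(bSum / (CGFloat)pixelCount) / 255.0f
                                      alpha:1.0f];
        }
        CGContextRelease(cgctx);
        if (data) { free(data); }
        return average;
    }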

Take a look at the iOS Camera Programming Topics – Taking Pictures and Movies; that will get the image into your app.
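If you just need a still image to analyze, a minimal sketch using UIImagePickerController (the pickPhoto: action and the imageView property are placeholders, not from the linked docs):

    // Hypothetical action that presents the stock camera/photo-library picker.
    - (IBAction)pickPhoto:(id)sender {
        UIImagePickerController *picker = [[UIImagePickerController alloc] init];
        picker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary; // or ...Camera
        picker.delegate = self; // class adopts UIImagePickerControllerDelegate, UINavigationControllerDelegate
        [self presentViewController:picker animated:YES completion:nil];
    }

    - (void)imagePickerController:(UIImagePickerController *)picker
    didFinishPickingMediaWithInfo:(NSDictionary *)info {
        UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
        imageView.image = image; // hand the image to whatever does the pixel analysis
        [picker dismissViewControllerAnimated:YES completion:nil];
    }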

After that, have a look at something like this: How to get the RGB values for a pixel of an image on the iPhone

Getting the CGImage from a UIImage gives you access to that data:

    CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage));
    const UInt8 *data = CFDataGetBytePtr(pixelData);

    int pixelInfo = ((image.size.width * y) + x) * 4; // the image is a PNG (RGBA)
    UInt8 red   = data[pixelInfo];
    UInt8 green = data[pixelInfo + 1];
    UInt8 blue  = data[pixelInfo + 2];
    UInt8 alpha = data[pixelInfo + 3];
    CFRelease(pixelData);
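That snippet assumes a tightly packed RGBA layout with no row padding. A more defensive variant, wrapped as a small helper (the method name colorInImage:atX:y: is mine, not from the linked answers), reads the stride and pixel size from the CGImage instead of assuming width * 4:

    - (UIColor *)colorInImage:(UIImage *)image atX:(NSUInteger)x y:(NSUInteger)y {
        CGImageRef cgImage = image.CGImage;
        CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
        if (pixelData == NULL) { return nil; }

        const UInt8 *data = CFDataGetBytePtr(pixelData);
        size_t bytesPerRow = CGImageGetBytesPerRow(cgImage);           // may include row padding
        size_t bytesPerPixel = CGImageGetBitsPerPixel(cgImage) / 8;
        size_t pixelInfo = y * bytesPerRow + x * bytesPerPixel;

        // Assumes RGBA channel order; check CGImageGetBitmapInfo() if colors look swapped.
        UInt8 red   = data[pixelInfo];
        UInt8 green = data[pixelInfo + 1];
        UInt8 blue  = data[pixelInfo + 2];
        UInt8 alpha = data[pixelInfo + 3];
        CFRelease(pixelData);

        return [UIColor colorWithRed:red / 255.0f
                               green:green / 255.0f
                                blue:blue / 255.0f
                               alpha:alpha / 255.0f];
    }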

More here: Getting Pixel Data from UIImageView – works on the simulator, not on the device

And here: Getting the pixel color of a UIImage