How do I get the pixel color value of a custom image inside an image view in iOS?

I know similar questions have been asked before.

What I want is to read the RGB value of a pixel in the image shown inside the image view, and it should work for any image and any pixel we pick.

This is the code I use to get the point where the image was tapped:

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    if ([touch tapCount] == 2) {
        //self.imageView.image = nil;
        return;
    }
    CGPoint lastPoint = [touch locationInView:self.imageViewGallery];
    NSLog(@"%f", lastPoint.x);
    NSLog(@"%f", lastPoint.y);
}
```

And to read the pixels I have pasted this code:

```objc
+ (NSArray *)getRGBAsFromImage:(UIImage *)image atX:(int)xx andY:(int)yy count:(int)count {
    NSMutableArray *result = [NSMutableArray arrayWithCapacity:count];

    // First the image has to be drawn into a data buffer --
    // so how can we get the `ImageViewImage` here?
    CGImageRef imageRef = [image CGImage];
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *rawData = (unsigned char *)calloc(height * width * 4, sizeof(unsigned char));
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);

    // Now rawData contains the image data in the RGBA8888 pixel format.
    int byteIndex = (bytesPerRow * yy) + xx * bytesPerPixel;
    for (int ii = 0; ii < count; ++ii) {
        CGFloat red   = (rawData[byteIndex]     * 1.0) / 255.0;
        CGFloat green = (rawData[byteIndex + 1] * 1.0) / 255.0;
        CGFloat blue  = (rawData[byteIndex + 2] * 1.0) / 255.0;
        CGFloat alpha = (rawData[byteIndex + 3] * 1.0) / 255.0;
        byteIndex += 4;

        UIColor *acolor = [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
        [result addObject:acolor];
    }

    free(rawData);
    return result;
}
```

I am new to iOS, so please explain, and suggesting some tutorials would be great.

Use this example article. It walks through a color picker that works on an image, and you can easily pull out the information you need. It helped me in my own app. Let me know if you need any help/suggestions. 🙂

Edit:

Update your `getPixelColorAtLocation:` like this. It will give you the correct color.

```objc
- (UIColor *)getPixelColorAtLocation:(CGPoint)point {
    UIColor *color = nil;
    CGImageRef inImage = self.image.CGImage;
    // Create an off-screen bitmap context to draw the image into.
    // Format ARGB is 4 bytes per pixel: Alpha, Red, Green, Blue.
    CGContextRef cgctx = [self createARGBBitmapContextFromImage:inImage];
    if (cgctx == NULL) { return nil; /* error */ }

    size_t w = CGImageGetWidth(inImage);
    size_t h = CGImageGetHeight(inImage);
    CGRect rect = {{0, 0}, {w, h}};

    /* Extra code added for resized images */
    float xscale = w / self.frame.size.width;
    float yscale = h / self.frame.size.height;
    point.x = point.x * xscale;
    point.y = point.y * yscale;

    /* Extra code added for Retina resolution */
    CGFloat x = 1.0;
    if ([self.image respondsToSelector:@selector(scale)]) x = self.image.scale;

    // Draw the image into the bitmap context. Once we draw, the memory
    // allocated for the context will contain the raw image data in the
    // specified color space.
    CGContextDrawImage(cgctx, rect, inImage);

    // Now we can get a pointer to the image data associated with the
    // bitmap context.
    unsigned char *data = CGBitmapContextGetData(cgctx);
    if (data != NULL) {
        // offset locates the pixel in the data from (x, y):
        // 4 bytes of data per pixel, w is the width of one row of data.
        // int offset = 4*((w*round(point.y))+round(point.x));
        int offset = 4 * ((w * round(point.y)) + round(point.x)) * x; // replacement for resolution
        int alpha = data[offset];
        int red   = data[offset + 1];
        int green = data[offset + 2];
        int blue  = data[offset + 3];
        NSLog(@"offset: %i colors: RGB A %i %i %i %i", offset, red, green, blue, alpha);
        color = [UIColor colorWithRed:(red / 255.0f) green:(green / 255.0f)
                                 blue:(blue / 255.0f) alpha:(alpha / 255.0f)];
    }

    // When finished, release the context.
    CGContextRelease(cgctx);
    // Free the image data memory for the context.
    if (data) { free(data); }

    return color;
}
```

Let me know if this fix does not work. 🙂

Here is my code on GitHub. Use it to implement the image picker. Let me know if more information is needed.

Just use this method; it works for me:

```objc
- (UIColor *)getPixelColorAtLocation:(CGPoint)point {
    UIColor *color = nil;
    CGImageRef inImage = imgZoneWheel.image.CGImage;
    // Create an off-screen bitmap context to draw the image into.
    // Format ARGB is 4 bytes per pixel: Alpha, Red, Green, Blue.
    CGContextRef cgctx = [self createARGBBitmapContextFromImage:inImage];
    if (cgctx == NULL) { return nil; /* error */ }

    size_t w = CGImageGetWidth(inImage);
    size_t h = CGImageGetHeight(inImage);
    CGRect rect = {{0, 0}, {w, h}};

    // Draw the image into the bitmap context. Once we draw, the memory
    // allocated for the context will contain the raw image data in the
    // specified color space.
    CGContextDrawImage(cgctx, rect, inImage);

    // Now we can get a pointer to the image data associated with the
    // bitmap context.
    unsigned char *data = CGBitmapContextGetData(cgctx);
    if (data != NULL) {
        // offset locates the pixel in the data from (x, y):
        // 4 bytes of data per pixel, w is the width of one row of data.
        int offset = 4 * ((w * round(point.y)) + round(point.x));
        int alpha = data[offset];
        int red   = data[offset + 1];
        int green = data[offset + 2];
        int blue  = data[offset + 3];
        color = [UIColor colorWithRed:(red / 255.0f) green:(green / 255.0f)
                                 blue:(blue / 255.0f) alpha:(alpha / 255.0f)];
    }

    // When finished, release the context.
    CGContextRelease(cgctx);
    // Free the image data memory for the context.
    if (data) { free(data); }

    return color;
}
```

Use it like this:

```objc
UIColor *color = [self getPixelColorAtLocation:lastPoint];
```

If you read a color out of an image via `CGBitmapContextGetData` and then set it somewhere (for example, as a background), it will come out as a different color on iPhone 6 and later. On iPhone 5 everything is fine. There is more about this in Getting the right colors in your iOS app.

This solution gives you the correct color straight from a UIImage:

```objc
- (UIColor *)getColorFromImage:(UIImage *)image pixelPoint:(CGPoint)pixelPoint {
    // Crop a 1x1 image at the requested pixel and wrap it in a pattern color.
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage],
                                                       CGRectMake(pixelPoint.x, pixelPoint.y, 1.f, 1.f));
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return [UIColor colorWithPatternImage:croppedImage];
}
```

Do you want to know how to get the image out of the image view?

UIImageView has an `image` property. Just use that property.