How to get the pixel color at a point in a scaled UIImage inside a UIImageView

I'm currently using this technique to get the color of a pixel in a UIImage (on iOS):

    - (UIColor*) getPixelColorAtLocation:(CGPoint)point {
        UIColor* color = nil;
        CGImageRef inImage = self.image.CGImage;
        // Create off screen bitmap context to draw the image into. Format ARGB is 4 bytes for each pixel: Alpha, Red, Green, Blue
        CGContextRef cgctx = [self createARGBBitmapContextFromImage:inImage];
        if (cgctx == NULL) { return nil; /* error */ }
        size_t w = CGImageGetWidth(inImage);
        size_t h = CGImageGetHeight(inImage);
        CGRect rect = {{0,0},{w,h}};
        // Draw the image to the bitmap context. Once we draw, the memory
        // allocated for the context for rendering will then contain the
        // raw image data in the specified color space.
        CGContextDrawImage(cgctx, rect, inImage);
        // Now we can get a pointer to the image data associated with the bitmap
        // context.
        unsigned char* data = CGBitmapContextGetData(cgctx);
        if (data != NULL) {
            // offset locates the pixel in the data from x,y.
            // 4 for 4 bytes of data per pixel, w is width of one row of data.
            int offset = 4*((w*round(point.y))+round(point.x));
            int alpha = data[offset];
            int red = data[offset+1];
            int green = data[offset+2];
            int blue = data[offset+3];
            NSLog(@"offset: %i colors: RGB A %i %i %i %i",offset,red,green,blue,alpha);
            color = [UIColor colorWithRed:(red/255.0f) green:(green/255.0f) blue:(blue/255.0f) alpha:(alpha/255.0f)];
        }
        // When finished, release the context
        CGContextRelease(cgctx);
        // Free image data memory for the context
        if (data) { free(data); }
        return color;
    }

As described here:

http://www.markj.net/iphone-uiimage-pixel-color/
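
For reference, createARGBBitmapContextFromImage: is the helper from that article; roughly, it allocates a 4-bytes-per-pixel buffer and wraps it in an ARGB bitmap context, something like this (a sketch along those lines, not the article's exact code):

    - (CGContextRef) createARGBBitmapContextFromImage:(CGImageRef)inImage {
        // Allocate a buffer large enough for 4 bytes (ARGB) per pixel.
        size_t width = CGImageGetWidth(inImage);
        size_t height = CGImageGetHeight(inImage);
        size_t bytesPerRow = width * 4;
        void* bitmapData = malloc(bytesPerRow * height);
        if (bitmapData == NULL) { return NULL; }
        // Create an RGB bitmap context backed by that buffer, alpha first (ARGB).
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(bitmapData, width, height,
                                                     8, bytesPerRow, colorSpace,
                                                     (CGBitmapInfo)kCGImageAlphaPremultipliedFirst);
        CGColorSpaceRelease(colorSpace);
        if (context == NULL) { free(bitmapData); }
        return context;
    }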

It works great, but it fails when the image is larger than the UIImageView. I tried setting the image and changing the content mode so that it scales to fit the view. How can I modify the code so that it can still sample the pixel color when the image is scaled?

Try this (Swift 3):

    func getPixelColor(image: UIImage, x: Int, y: Int, width: CGFloat) -> UIColor {
        guard let cgImage = image.cgImage,
              let pixelData = cgImage.dataProvider?.data else {
            return UIColor.clear
        }
        let data: UnsafePointer<UInt8> = CFDataGetBytePtr(pixelData)
        // 4 bytes per pixel (RGBA); `width` is the width of the bitmap in pixels.
        let pixelInfo: Int = ((Int(width) * y) + x) * 4
        let r = CGFloat(data[pixelInfo])     / 255.0
        let g = CGFloat(data[pixelInfo + 1]) / 255.0
        let b = CGFloat(data[pixelInfo + 2]) / 255.0
        let a = CGFloat(data[pixelInfo + 3]) / 255.0
        return UIColor(red: r, green: g, blue: b, alpha: a)
    }

Here's a pointer:

 0x3A28213A //sorry, I couldn't resist the joke 

Now for real: after reading through the comments on the markj.net page, someone called James suggested the following change:

    size_t w = CGImageGetWidth(inImage);            //Written by Mark
    size_t h = CGImageGetHeight(inImage);           //Written by Mark
    float xscale = w / self.frame.size.width;
    float yscale = h / self.frame.size.height;
    point.x = point.x * xscale;
    point.y = point.y * yscale;

(Thanks to http://www.markj.net/iphone-uiimage-pixel-color/comment-page-1/#comment-2159)
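
Note that this only holds if the view stretches the image to fill its whole bounds (UIViewContentModeScaleToFill). For the "scale to fit" mode mentioned in the question, the letterbox offset also has to be removed before dividing by the scale; a rough sketch of that conversion (the helper name pixelPointForViewPoint: is just illustrative, not from the comment):

    // Sketch, assuming contentMode == UIViewContentModeScaleAspectFit:
    // map a point in the image view's coordinate space to pixel coordinates in the CGImage.
    - (CGPoint) pixelPointForViewPoint:(CGPoint)point {
        CGImageRef inImage = self.image.CGImage;
        CGFloat w = CGImageGetWidth(inImage);
        CGFloat h = CGImageGetHeight(inImage);
        // Aspect fit uses one uniform scale factor and centers the scaled image.
        CGFloat scale = MIN(self.bounds.size.width / w, self.bounds.size.height / h);
        CGFloat offsetX = (self.bounds.size.width - w * scale) / 2.0;
        CGFloat offsetY = (self.bounds.size.height - h * scale) / 2.0;
        return CGPointMake((point.x - offsetX) / scale,
                           (point.y - offsetY) / scale);
    }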

James' change didn't actually work for me... not that I did a lot of testing, and I'm not the world's greatest programmer (yet)...

My solution was to scale the UIImageView so that each pixel of the image was the same size as a standard CGPoint on screen; then I sampled my color as usual (with getPixelColorAtLocation:(CGPoint)point), and afterwards I scaled the image back to the size I wanted.
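
In code, that idea is roughly the following (just a sketch of what I described, not tested; sampleColorAtImagePixel: is an illustrative name, and it assumes it lives next to getPixelColorAtLocation: on the image view):

    // Sketch: temporarily resize the UIImageView so one image pixel == one point,
    // sample with the unmodified getPixelColorAtLocation:, then restore the frame.
    - (UIColor*) sampleColorAtImagePixel:(CGPoint)pixel {
        CGRect originalFrame = self.frame;
        CGImageRef inImage = self.image.CGImage;
        self.frame = CGRectMake(originalFrame.origin.x, originalFrame.origin.y,
                                CGImageGetWidth(inImage), CGImageGetHeight(inImage));
        UIColor* color = [self getPixelColorAtLocation:pixel];
        self.frame = originalFrame; // scale the view back to the size we want
        return color;
    }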

Hope this helps!

Using the UIImageView's layer:

    - (UIColor*) getPixelColorAtLocation:(CGPoint)point {
        UIColor* color = nil;
        UIGraphicsBeginImageContext(self.frame.size);
        CGContextRef cgctx = UIGraphicsGetCurrentContext();
        if (cgctx == NULL) { return nil; /* error */ }
        [self.layer renderInContext:cgctx];
        unsigned char* data = CGBitmapContextGetData(cgctx);
        /* ... */
        UIGraphicsEndImageContext();
        return color;
    }
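
A related variant is to render the layer into a 1×1 bitmap that has been translated so the point of interest lands in that single pixel, which avoids allocating a full-size context and reading back a whole buffer. Roughly (untested sketch, method name is illustrative):

    // Sketch: sample one pixel of the view's layer by rendering it into a 1x1 RGBA bitmap.
    - (UIColor*) colorOfLayerAtPoint:(CGPoint)point {
        unsigned char pixel[4] = {0};
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef ctx = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
        CGColorSpaceRelease(colorSpace);
        if (ctx == NULL) { return nil; }
        // Shift the layer so that `point` is the pixel that gets drawn.
        CGContextTranslateCTM(ctx, -point.x, -point.y);
        [self.layer renderInContext:ctx];
        CGContextRelease(ctx);
        return [UIColor colorWithRed:pixel[0] / 255.0f
                               green:pixel[1] / 255.0f
                                blue:pixel[2] / 255.0f
                               alpha:pixel[3] / 255.0f];
    }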