UIImageJPEGRepresentation uses a huge amount of memory

I have tried to track this problem down, but after trying everything I found on SO and Google I am still stuck. The problem is that when I use UIImageJPEGRepresentation or UIImagePNGRepresentation to convert a UIImage to NSData, memory usage jumps by about 30 MB (believe it or not).
Here is my code:

    myImage = image;
    LoginSinglton *loginObj = [LoginSinglton sharedInstance];
    NSError *error;
    NSData *pngData = UIImageJPEGRepresentation(image, scaleValue); // scaleValue is 1.

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsPath = [paths objectAtIndex:0]; // Get the docs directory

    self.imageCurrentDateAndTime = [self getTimeAndDate];
    self.filePathAndDirectory = [documentsPath stringByAppendingPathComponent:@"Photos Dir"];
    NSLog(@"Documents path %@", self.filePathAndDirectory);

    if (![[NSFileManager defaultManager] createDirectoryAtPath:self.filePathAndDirectory
                                   withIntermediateDirectories:NO
                                                    attributes:nil
                                                         error:&error])
    {
        NSLog(@"Create directory error: %@", error);
    }

    self.imageName = [NSString stringWithFormat:@"photo-%@-%@.jpg", loginObj.userWebID, self.imageCurrentDateAndTime];
    NSString *filePath = [self.filePathAndDirectory stringByAppendingPathComponent:self.imageName];
    [pngData writeToFile:filePath atomically:YES]; // Write the file

    [self writeImageThumbnailToFolder:image];
    [self writeImageHomeViewThumbnailToFolder:image];

I have also tried the solutions suggested in UIImageJPEGRepresentation – memory release issue, namely:
1- Wrapping the call in @autoreleasepool
2- Setting pngData = nil once I am done with it
But I am still facing this memory issue.
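For reference, a minimal sketch of what that @autoreleasepool attempt looks like (the exact wrapping here is an assumption; the variables are the same ones used in the code above):

    @autoreleasepool {
        // Convert and write inside the pool so the temporary buffers can be drained
        NSData *pngData = UIImageJPEGRepresentation(image, scaleValue);
        [pngData writeToFile:filePath atomically:YES];
        pngData = nil;
    }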

EDIT: I think I haven't communicated my problem clearly. It is fine if UIImageJPEGRepresentation temporarily takes a lot of memory, but after the image has been saved the memory should drop back to where it was before. I hope this explains the issue in more detail.

Use a scaleValue of less than 1. Even 0.9 will dramatically reduce the memory footprint with very little loss of quality.
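For example (a minimal sketch; `image` is just a placeholder for your UIImage):

    // 0.9 = 90% JPEG quality; lower values compress harder and produce smaller NSData
    NSData *jpegData = UIImageJPEGRepresentation(image, 0.9);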

Try this:

    UIGraphicsBeginImageContext(newSize);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
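Note that `newSize` is not defined in the snippet above; one way to choose it (an assumption on my part, not part of the original answer) is to halve the original dimensions:

    // Hypothetical way to pick newSize: half the original image's dimensions
    CGSize newSize = CGSizeMake(image.size.width / 2.0, image.size.height / 2.0);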

If that does not give you the result you expect, don't worry, try this:

 UIImage *small = [UIImage imageWithCGImage:original.CGImage scale:0.25 orientation:original.imageOrientation]; 

You can get a sample app for editing or scaling images from here: http://mattgemmell.com/2010/07/05/mgimageutilities/

To resize with minimal memory usage, try using CoreGraphics.

So, in answer to @Mina Nabil, see the full answer below for more details:

    #import <ImageIO/ImageIO.h>

    // Scales inImage into a bitmap context the size of thumbRect and returns the result.
    // (inImage was not declared in the original snippet, so it is taken as a parameter here.)
    - (UIImage *)resizedImage:(UIImage *)inImage toRect:(CGRect)thumbRect
    {
        CGImageRef imageRef = [inImage CGImage];
        CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);

        if (alphaInfo == kCGImageAlphaNone)
            alphaInfo = kCGImageAlphaNoneSkipLast;

        // Build a bitmap context that's the size of the thumbRect
        CGContextRef bitmap = CGBitmapContextCreate(
                NULL,
                thumbRect.size.width,                  // width
                thumbRect.size.height,                 // height
                CGImageGetBitsPerComponent(imageRef),  // really needs to always be 8
                4 * thumbRect.size.width,              // rowbytes
                CGImageGetColorSpace(imageRef),
                alphaInfo);

        // Draw into the context; this scales the image
        CGContextDrawImage(bitmap, thumbRect, imageRef);

        // Get an image from the context and a UIImage
        CGImageRef ref = CGBitmapContextCreateImage(bitmap);
        UIImage *result = [UIImage imageWithCGImage:ref];

        CGContextRelease(bitmap); // ok if NULL
        CGImageRelease(ref);

        return result;
    }
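A possible way to call this helper (the aspect-ratio math and the 640 px / 0.8 quality values are assumptions for illustration, not part of the original answer):

    // Scale a large photo down to at most 640 px on its longest side, preserving aspect ratio,
    // then produce JPEG data from the smaller image.
    CGFloat maxSide = 640.0;
    CGFloat ratio = MIN(maxSide / image.size.width, maxSide / image.size.height);
    CGRect thumbRect = CGRectMake(0, 0,
                                  round(image.size.width * ratio),
                                  round(image.size.height * ratio));

    UIImage *thumb = [self resizedImage:image toRect:thumbRect];
    NSData *jpegData = UIImageJPEGRepresentation(thumb, 0.8);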