Uploading a large image with Base64 and JSON

I am using this function to upload an image to a server as JSON. To do that, I first convert the image to NSData and then to an NSString with Base64. The method works fine when the image is not very large, but when I try to upload a 2 MB image it crashes.
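For reference, the same image-to-string conversion could also be written with Foundation's built-in Base64 API (available since iOS 7); this is only a sketch of the idea, not the Base64 helper class used in the code below:

    // Sketch: PNG data -> Base64 NSString using the iOS 7+ Foundation API.
    NSData *imageData = UIImagePNGRepresentation([UIImage imageWithContentsOfFile:path]);
    NSString *imageString = [imageData base64EncodedStringWithOptions:0]; // no line breaks, JSON-safe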

The problem is that even though the didReceiveResponse method is called, and didReceiveData returns (null), the server never receives my image. At first I thought it was a timeout issue, but even with it set to 1000.0 it still does not work. Any ideas? Thanks for your time!

This is my current code:

    - (void)imageRequest {
        NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://www.myurltouploadimage.com/services/v1/upload.json"]];

        NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        NSString *path = [NSString stringWithFormat:@"%@/design%i.png", docDir, designNum];
        NSLog(@"%@", path);
        NSData *imageData = UIImagePNGRepresentation([UIImage imageWithContentsOfFile:path]);

        [Base64 initialize];
        NSString *imageString = [Base64 encode:imageData];

        NSArray *keys = [NSArray arrayWithObjects:@"design", nil];
        NSArray *objects = [NSArray arrayWithObjects:imageString, nil];
        NSDictionary *jsonDictionary = [NSDictionary dictionaryWithObjects:objects forKeys:keys];
        NSError *error;
        NSData *jsonData = [NSJSONSerialization dataWithJSONObject:jsonDictionary options:kNilOptions error:&error];

        [request setHTTPMethod:@"POST"];
        [request setValue:[NSString stringWithFormat:@"%d", [jsonData length]] forHTTPHeaderField:@"Content-Length"];
        [request setValue:@"application/json" forHTTPHeaderField:@"Accept"];
        [request setValue:@"application/json" forHTTPHeaderField:@"Content-Type"];
        [request setHTTPBody:jsonData];

        [[NSURLConnection alloc] initWithRequest:request delegate:self];
        NSLog(@"Image uploaded");
    }

    - (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
        NSLog(@"didReceiveResponse");
    }

    - (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)data {
        NSLog(@"%@", [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:nil]);
    }

I finally decided to upload the Base64 image by splitting it into smaller substrings. To do that, and because I needed many NSURLConnections, I created a subclass called TagConnection that gives each connection a tag so they do not get mixed up with one another.

Then I created a TagConnection property in MyViewController so it can be accessed from any function. As you can see, there is a -startAsyncLoad:withTag: function that allocates the TagConnection, and the -connection:didReceiveData: delegate method releases it when the response comes back from the server.

As for the -uploadImage function, it first converts the image to a string, then splits the string into chunks and puts each one into a JSON request. It is called repeatedly until the offset variable reaches the length of the string, which means all of the chunks have been uploaded.
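In isolation, the chunking arithmetic looks roughly like this (the same logic used in -uploadImage below, with a chunk size of 1000 characters):

    // Take the next chunk of at most chunkSize characters, starting at the current offset.
    NSUInteger length = [imageString length];
    NSUInteger chunkSize = 1000;
    NSUInteger thisChunkSize = MIN(chunkSize, length - offset);
    NSString *chunk = [imageString substringWithRange:NSMakeRange(offset, thisChunkSize)];
    offset += thisChunkSize;      // when offset reaches length, every chunk has been sent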

You can also verify that each chunk was uploaded successfully by checking the server response each time, and only calling -uploadImage again when it reports success (see the sketch after the listings below).

I hope this is a useful answer. Thanks.

TagConnection.h

    @interface TagConnection : NSURLConnection {
        NSString *tag;
    }

    @property (strong, nonatomic) NSString *tag;

    - (id)initWithRequest:(NSURLRequest *)request
                 delegate:(id)delegate
         startImmediately:(BOOL)startImmediately
                      tag:(NSString *)tag;

    @end

TagConnection.m

 #import "TagConnection.h" @implementation TagConnection @synthesize tag; - (id)initWithRequest:(NSURLRequest *)request delegate:(id)delegate startImmediately:(BOOL)startImmediately tag:(NSString*)tag { self = [super initWithRequest:request delegate:delegate startImmediately:startImmediately]; if (self) { self.tag = tag; } return self; } - (void)dealloc { [tag release]; [super dealloc]; } @end 

MyViewController.h

 #import "TagConnection.h" @interface MyViewController : UIViewController @property (strong, nonatomic) TagConnection *conn; 

MyViewController.m

 #import "MyViewController.h" @interface MyViewController () @end @synthesize conn; bool stopSending = NO; int chunkNum = 1; int offset = 0; - (IBAction) uploadImageButton:(id)sender { [self uploadImage]; } - (void) startAsyncLoad:(NSMutableURLRequest *)request withTag:(NSString *)tag { self.conn = [[[TagConnection alloc] initWithRequest:request delegate:self startImmediately:YES tag:tag] autorelease]; } - (void) uploadImage { NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"http://www.mywebpage.com/upload.json"] cachePolicy:NSURLRequestUseProtocolCachePolicy timeoutInterval:1000.0]; NSString *docDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0]; NSString *path = [NSString stringWithFormat:@"%@/design%i.png", docDir, designNum]; NSLog(@"%@",path); NSData *imageData = UIImagePNGRepresentation([UIImage imageWithContentsOfFile:path]); [Base64 initialize]; NSString *imageString = [Base64 encode:imageData]; NSUInteger length = [imageString length]; NSUInteger chunkSize = 1000; NSUInteger thisChunkSize = length - offset > chunkSize ? chunkSize : length - offset; NSString *chunk = [imageString substringWithRange:NSMakeRange(offset, thisChunkSize)]; offset += thisChunkSize; NSArray *keys = [NSArray arrayWithObjects:@"design",@"design_id",@"fragment_id",nil]; NSArray *objects = [NSArray arrayWithObjects:chunk,@"design_id",[NSString stringWithFormat:@"%i", chunkNum],nil]; NSDictionary *jsonDictionary = [NSDictionary dictionaryWithObjects:objects forKeys:keys]; NSError *error; NSData *jsonData = [NSJSONSerialization dataWithJSONObject:jsonDictionary options:kNilOptions error:&error]; [request setHTTPMethod:@"POST"]; [request setValue:[NSString stringWithFormat:@"%d",[jsonData length]] forHTTPHeaderField:@"Content-Length"]; [request setValue:@"application/json" forHTTPHeaderField:@"Accept"]; [request setValue:@"application/json" forHTTPHeaderField:@"Content-Type"]; [request setHTTPBody:jsonData]; [self startAsyncLoad:request withTag:[NSString stringWithFormat:@"tag%i",chunkNum]]; if (offset > length) { stopSending = YES; } } - (void) connection:(NSURLConnection *)connection didReceiveData:(NSData *)data { NSError *error; NSArray *responseData = [NSJSONSerialization JSONObjectWithData:data options:kNilOptions error:&error]; if (!responseData) { NSLog(@"Error parsing JSON: %@", error); } else { if (stopSending == NO) { chunkNum++; [self.conn cancel]; self.conn = nil; [self uploadImage]; } else { NSLog(@"---------Image sent---------"); } } } @end 

Please do not take this as the final word; it is just my observation.

I think you should send the data in chunks rather than as one complete block. I have seen this approach used for YouTube video uploads: they send a large NSData (the video file's NSData) as many smaller chunks of NSData.

They use the same approach to upload large data.

So you should google the YouTube data upload API and look into the method the YouTube uploader uses.
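As a rough illustration of that idea, a resumable-style upload sends raw byte ranges of the NSData instead of one huge Base64 body. The endpoint URL and the use of a Content-Range header here are assumptions for the sketch; the real YouTube API defines its own protocol:

    // Sketch: upload an NSData in raw byte-range chunks (hypothetical endpoint and headers).
    NSData *fileData = UIImagePNGRepresentation([UIImage imageWithContentsOfFile:path]); // 'path' as in the code above
    NSUInteger total = [fileData length];
    NSUInteger chunkSize = 256 * 1024;                         // 256 KB per request
    for (NSUInteger offset = 0; offset < total; offset += chunkSize) {
        NSUInteger thisSize = MIN(chunkSize, total - offset);
        NSData *chunk = [fileData subdataWithRange:NSMakeRange(offset, thisSize)];

        NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
            [NSURL URLWithString:@"http://www.example.com/upload"]];       // placeholder URL
        [request setHTTPMethod:@"PUT"];
        [request setValue:@"application/octet-stream" forHTTPHeaderField:@"Content-Type"];
        [request setValue:[NSString stringWithFormat:@"bytes %lu-%lu/%lu",
                              (unsigned long)offset,
                              (unsigned long)(offset + thisSize - 1),
                              (unsigned long)total]
          forHTTPHeaderField:@"Content-Range"];
        [request setHTTPBody:chunk];
        // Each chunk would be sent with its own connection; error handling and
        // waiting for the server's acknowledgement are omitted in this sketch.
    }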

I hope it helps you.
