How to fetch Camera Roll images with the Photos Framework

The following code also loads images that live in iCloud or in Photo Stream. How can we restrict the fetch to images in the Camera Roll?

var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: nil) 

With the "Camera Roll" and "Photo Stream" albums added back, Apple introduced the following PHAssetCollectionSubtype values in iOS 8.1:

  1. PHAssetCollectionSubtypeAlbumMyPhotoStream (together with PHAssetCollectionTypeAlbum) – fetches the Photo Stream album.

  2. PHAssetCollectionSubtypeSmartAlbumUserLibrary (together with PHAssetCollectionTypeSmartAlbum) – fetches the Camera Roll album (a sketch follows below).

I have not tested whether this is backwards compatible with iOS 8.0.x, though.
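For what it's worth, a minimal sketch of that approach in Swift (Swift 3 naming; fetchCameraRollImages is just an illustrative name):

    import Photos

    // Minimal sketch: fetch the Camera Roll smart album via its subtype,
    // then fetch only the image assets it contains.
    func fetchCameraRollImages() -> PHFetchResult<PHAsset>? {
        let collections = PHAssetCollection.fetchAssetCollections(with: .smartAlbum,
                                                                  subtype: .smartAlbumUserLibrary,
                                                                  options: nil)
        guard let cameraRoll = collections.firstObject else { return nil }

        let options = PHFetchOptions()
        options.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        return PHAsset.fetchAssets(in: cameraRoll, options: options)
    }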

Through some experimentation we discovered a hidden property that is not listed in the documentation (assetSource). Basically you do a regular fetch request and then use a predicate to filter for the ones from the Camera Roll. The value should be 3.

Sample code:

    // A plain fetch-options object and a mutable array to collect assets into.
    let fetchOptions = PHFetchOptions()
    var results = NSMutableArray()

    // fetch all assets, then sub fetch only the range we need
    var assets = PHAsset.fetchAssetsWithMediaType(PHAssetMediaType.Image, options: fetchOptions)
    assets.enumerateObjectsUsingBlock { (obj, idx, bool) -> Void in
        results.addObject(obj)
    }

    // Filter on the undocumented assetSource property (3 == Camera Roll).
    var cameraRollAssets = results.filteredArrayUsingPredicate(
        NSPredicate(format: "assetSource == %@", argumentArray: [3]))
    results = NSMutableArray(array: cameraRollAssets)

If you use your own PHCachingImageManager instead of the shared PHImageManager instance, then when you call requestImageForAsset:targetSize:contentMode:options:resultHandler: you can set an option in PHImageRequestOptions to specify that the image must be local.

networkAccessAllowed property

A Boolean value that specifies whether Photos can download the requested image from iCloud.

networkAccessAllowed

Discussion

If YES, and the requested image is not stored on the local device, Photos downloads the image from iCloud. To be notified of the download's progress, use the progressHandler property to provide a block that Photos calls periodically while downloading the image. If NO (the default), and the image is not on the local device, the PHImageResultIsInCloudKey value in the result handler's info dictionary indicates that the image is not available unless you enable network access.
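Tying that together, a rough sketch (Swift 3 naming; requestLocalImage is just an illustrative helper) of requesting an image with network access disabled and checking PHImageResultIsInCloudKey:

    import Photos
    import UIKit

    // Sketch: ask for local-only delivery and detect assets that are only in iCloud.
    let cachingManager = PHCachingImageManager()

    func requestLocalImage(for asset: PHAsset, targetSize: CGSize,
                           completion: @escaping (UIImage?) -> Void) {
        let options = PHImageRequestOptions()
        options.isNetworkAccessAllowed = false   // never download from iCloud

        cachingManager.requestImage(for: asset,
                                    targetSize: targetSize,
                                    contentMode: .aspectFill,
                                    options: options) { image, info in
            // With network access disabled, an iCloud-only asset is reported
            // via PHImageResultIsInCloudKey instead of being downloaded.
            if let inCloud = info?[PHImageResultIsInCloudKey] as? NSNumber, inCloud.boolValue {
                completion(nil)
            } else {
                completion(image)
            }
        }
    }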

If, like me, you were searching for Objective-C code and had not found an answer for the new Photos framework yet because you kept running into the deprecated AssetsLibrary code, this will help you:

Swift

Global variables:

    var imageArray: [AnyObject] = []
    var mutableArray: [AnyObject] = []

    func getAllPhotosFromCamera() {
        imageArray = [AnyObject]()
        mutableArray = [AnyObject]()

        let requestOptions = PHImageRequestOptions()
        requestOptions.resizeMode = .Exact
        requestOptions.deliveryMode = .HighQualityFormat
        requestOptions.synchronous = true

        let result = PHAsset.fetchAssetsWithMediaType(.Image, options: nil)
        NSLog("%d", Int(result.count))

        let manager = PHImageManager.defaultManager()
        var images: [AnyObject] = []

        // result contains PHAsset objects.
        for i in 0 ..< result.count {
            let asset = result[i] as! PHAsset
            // Request the full-size image; the handler runs synchronously
            // because requestOptions.synchronous is true.
            manager.requestImageForAsset(asset,
                targetSize: PHImageManagerMaximumSize,
                contentMode: .Default,
                options: requestOptions,
                resultHandler: { (image, info) -> Void in
                    if let image = image {
                        images.append(image)
                    }
            })
        }
        imageArray = images // You can also use the images array directly.
    }

Objective-C

Global variables:

    NSArray *imageArray;
    NSMutableArray *mutableArray;

The method below will help you:

    - (void)getAllPhotosFromCamera {
        imageArray = [[NSArray alloc] init];
        mutableArray = [[NSMutableArray alloc] init];

        PHImageRequestOptions *requestOptions = [[PHImageRequestOptions alloc] init];
        requestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
        requestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeHighQualityFormat;
        requestOptions.synchronous = YES;

        PHFetchResult *result = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
        NSLog(@"%d", (int)result.count);

        PHImageManager *manager = [PHImageManager defaultManager];
        NSMutableArray *images = [NSMutableArray arrayWithCapacity:[result count]];

        // result contains PHAsset objects.
        __block UIImage *ima;
        for (PHAsset *asset in result) {
            // Request the full-size image for each asset (synchronous, see requestOptions).
            [manager requestImageForAsset:asset
                               targetSize:PHImageManagerMaximumSize
                              contentMode:PHImageContentModeDefault
                                  options:requestOptions
                            resultHandler:^void(UIImage *image, NSDictionary *info) {
                                ima = image;
                                [images addObject:ima];
                            }];
        }
        imageArray = [images copy]; // You can also use the NSMutableArray images directly.
    }

This may help. You can use your own data model instead of the AlbumModel I used.

    func getCameraRoll() -> AlbumModel {
        var cameraRollAlbum: AlbumModel!

        let cameraRoll = PHAssetCollection.fetchAssetCollections(with: .smartAlbum,
                                                                 subtype: .smartAlbumUserLibrary,
                                                                 options: nil)
        cameraRoll.enumerateObjects({ (collection, _, _) in
            let fetchOptions = PHFetchOptions()
            fetchOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
            fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)

            let assets = PHAsset.fetchAssets(in: collection, options: fetchOptions)
            if assets.count > 0 {
                cameraRollAlbum = AlbumModel(name: collection.localizedTitle!,
                                             count: assets.count,
                                             collection: collection,
                                             assets: assets)
            }
        })
        return cameraRollAlbum
    }
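AlbumModel is the answerer's own type and is not shown; assuming it just wraps what the initializer above passes in, a minimal stand-in could look like this:

    import Photos

    // Hypothetical minimal data model matching the AlbumModel(...) call above;
    // the memberwise initializer provides the name/count/collection/assets labels.
    struct AlbumModel {
        let name: String
        let count: Int
        let collection: PHAssetCollection
        let assets: PHFetchResult<PHAsset>
    }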

Here is the Objective-C version provided by Apple.

    - (NSMutableArray *)getNumberOfPhotoFromCameraRoll:(NSArray *)array {
        PHFetchResult *fetchResult = array[1];
        int index = 0;
        unsigned long pictures = 0;
        for (int i = 0; i < fetchResult.count; i++) {
            unsigned long temp = 0;
            temp = [PHAsset fetchAssetsInAssetCollection:fetchResult[i] options:nil].count;
            if (temp > pictures) {
                pictures = temp;
                index = i;
            }
        }

        PHCollection *collection = fetchResult[index];
        if (![collection isKindOfClass:[PHAssetCollection class]]) {
            // return;
        }

        // Configure the AAPLAssetGridViewController with the asset collection.
        PHAssetCollection *assetCollection = (PHAssetCollection *)collection;
        PHFetchResult *assetsFetchResult = [PHAsset fetchAssetsInAssetCollection:assetCollection options:nil];
        self.assetsFetchResults = assetsFetchResult;
        self.assetCollection = assetCollection;

        self.numberOfPhotoArray = [NSMutableArray array];
        for (int i = 0; i < [assetsFetchResult count]; i++) {
            PHAsset *asset = assetsFetchResult[i];
            [self.numberOfPhotoArray addObject:asset];
        }
        NSLog(@"%lu", (unsigned long)[self.numberOfPhotoArray count]);
        return self.numberOfPhotoArray;
    }

This is where you can grab the following details:

    PHFetchResult *fetchResult = self.sectionFetchResults[1];
    PHCollection *collection = fetchResult[6];

    // value 1,6 used to get camera images
    // value 1,0 used to get screen shots
    // value 1,1 used to get hidden
    // value 1,2 used to get selfies
    // value 1,3 used to get recently added
    // value 1,4 used to get videos
    // value 1,5 used to get recently deleted
    // value 1,7 used to get favorites

Apple demo link

Declare your properties:

    @property (nonatomic, strong) NSArray *sectionFetchResults;
    @property (nonatomic, strong) PHFetchResult *assetsFetchResults;
    @property (nonatomic, strong) PHAssetCollection *assetCollection;
    @property (nonatomic, strong) NSMutableArray *numberOfPhotoArray;
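For context, a hedged Swift sketch of how sectionFetchResults is typically populated in Apple's sample code; this is an assumption about the demo being referenced (makeSectionFetchResults is just an illustrative name), not something stated in the answer:

    import Photos

    // Assumed layout from Apple's sample: index 0 = all photos,
    // index 1 = smart albums, index 2 = top-level user collections.
    func makeSectionFetchResults() -> (PHFetchResult<PHAsset>,
                                       PHFetchResult<PHAssetCollection>,
                                       PHFetchResult<PHCollection>) {
        let allPhotosOptions = PHFetchOptions()
        allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
        let allPhotos = PHAsset.fetchAssets(with: allPhotosOptions)

        let smartAlbums = PHAssetCollection.fetchAssetCollections(with: .smartAlbum,
                                                                  subtype: .albumRegular,
                                                                  options: nil)
        let topLevelUserCollections = PHCollectionList.fetchTopLevelUserCollections(with: nil)
        return (allPhotos, smartAlbums, topLevelUserCollections)
    }

    // sectionFetchResults[1] in the answer above would then be the smart-album result,
    // and subscripts like fetchResult[6] pick one smart album out of whatever order the
    // system happens to return them in, which is why those magic indices are fragile.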

This has been bugging me as well. I found no way to filter for on-device assets using fetchAssetsWithMediaType or fetchAssetsInAssetCollection. I can use requestContentEditingInputWithOptions or requestImageDataForAsset to figure out whether an asset is on the device, but that is asynchronous and seems to use far too many resources to do for every asset in the list. There must be a better way.

    PHFetchResult *fetchResult = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
    for (int i = 0; i < [fetchResult count]; i++) {
        PHAsset *asset = fetchResult[i];
        [asset requestContentEditingInputWithOptions:nil
                                   completionHandler:^(PHContentEditingInput *contentEditingInput, NSDictionary *info) {
            if ([[info objectForKey:PHContentEditingInputResultIsInCloudKey] intValue] == 1) {
                NSLog(@"asset is in cloud");
            } else {
                NSLog(@"asset is on device");
            }
        }];
    }

If you don't want to rely on an undocumented API, look at [asset canPerformEditOperation:PHAssetEditOperationContent]. It only returns true if the full original is available on the device.

Admittedly this is also fragile, but testing shows it works for all of the assetSource types (Photo Stream, iTunes synced, etc.).
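A hedged sketch of that check in Swift (filterLocalAssets is just an illustrative helper name):

    import Photos

    // Keep only assets whose full original is available on the device, using
    // canPerformEditOperation(.content) as the (admittedly fragile) signal.
    func filterLocalAssets(_ fetchResult: PHFetchResult<PHAsset>) -> [PHAsset] {
        var localAssets: [PHAsset] = []
        fetchResult.enumerateObjects({ asset, _, _ in
            if asset.canPerformEditOperation(.content) {
                localAssets.append(asset)
            }
        })
        return localAssets
    }

    // Usage:
    // let allImages = PHAsset.fetchAssets(with: .image, options: nil)
    // let onDevice = filterLocalAssets(allImages)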