
Swift Photos Framework

June 4, 2018 10:40 am

Some concepts I’ve learnt and need to remember about using the Photos framework for iOS. Thought I’d jot them down whilst they were fresh, after spending the weekend moving some legacy code from ALAssetsLibrary to the Photos framework, which was introduced a good few years ago.

A photo as a container

The first thing to remember is that you aren’t retrieving individual images. With PHAsset you are dealing with a box of stuff related to the image. The image itself is just one item in that box.

What does this mean in practice? Well, for example, when saving an image from a UIImagePickerController, the Exif data isn’t immediately saved. This is because the metadata isn’t written into the image file when using the Photos framework. The metadata is actually a separate entity, some of which is exposed as properties on the asset.
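As a quick sketch of what I mean by metadata living on the asset rather than in the image bytes, these are all properties PHAsset exposes directly:

```swift
import Photos
import CoreLocation

// Reading metadata the Photos framework exposes as PHAsset properties,
// rather than parsing Exif out of the image data yourself.
func logMetadata(for asset: PHAsset) {
    if let created = asset.creationDate {
        print("Created: \(created)")
    }
    if let location = asset.location {
        print("Location: \(location.coordinate.latitude), \(location.coordinate.longitude)")
    }
    print("Dimensions: \(asset.pixelWidth) x \(asset.pixelHeight)")
    print("Favourite: \(asset.isFavorite)")
}
```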

Another example would be a video asset. You can return a poster image frame for a video from the PHAsset, no need to create your own.

You deal with the Photos framework a lot like Core Data

You interact with the Photos framework via fetch requests that return collections (PHAssetCollection) as objects. Collections can be the user’s own albums from the Photos app, for example. You then fetch your PHAssets from that collection, utilising predicates to filter by media type, location and so on. So, syntactically it’s very similar.

import Photos

func findAlbum(albumName: String) -> PHAssetCollection? {
    let fetchOptions = PHFetchOptions()
    fetchOptions.predicate = NSPredicate(format: "title = %@", albumName)
    let fetchResult = PHAssetCollection.fetchAssetCollections(with: .album, subtype: .any, options: fetchOptions)
    return fetchResult.firstObject
}

func fetchPhotoAssets(album: PHAssetCollection) {
    // Fetch every asset in the album; pass PHFetchOptions here to sort or filter.
    self.photoAssets = PHAsset.fetchAssets(in: album, options: nil)
}

You ask the imageManager to return an image

This allows the system to dynamically generate assets and serve them back to you. This is handy as the actual image may not be on the user’s device; it may be in their iCloud Photo Library. The same method is used to return a poster frame from a video asset.

let imageManager = PHImageManager.default()
imageManager.requestImage(for: asset, targetSize: CGSize(width: 100, height: 100), contentMode: .aspectFill, options: nil) { image, _ in
    if let image = image {
        cell.imageView.image = image
    }
}

You ask the shared photo library to make change requests to images

Well, to a PHAsset container. You don’t work with an image directly. This keeps you working with the image in a thread-safe way, in case it was altered in a different app or elsewhere. It also means changes are reflected anywhere the Photos framework is used, which is pretty handy.
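A minimal sketch of what a change request looks like, here toggling an asset’s favourite flag through the shared photo library rather than mutating anything directly:

```swift
import Photos

// Changes go through PHPhotoLibrary's change block, not the asset itself.
func toggleFavourite(_ asset: PHAsset) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetChangeRequest(for: asset)
        request.isFavorite = !asset.isFavorite
    }, completionHandler: { success, error in
        if !success {
            print("Change failed: \(error?.localizedDescription ?? "unknown error")")
        }
    })
}
```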

You don’t make image editing changes directly onto the image

The PHAsset container tracks changes to the base image so that edits can be reverted, assuming you create and provide a recipe (adjustment data) or use Apple’s defaults. Editing is non-destructive, so the user can always revert to the base original image.
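A rough sketch of how that recipe fits in. The "com.example.sepia" identifier and the recipeData are hypothetical placeholders; the adjustment data is whatever your app needs to recreate the edit from the original:

```swift
import Photos

// Non-destructive edit: render the edited image, attach adjustment data
// (the recipe), and commit both via a change request.
func applyEdit(to asset: PHAsset, renderedJPEG: Data, recipeData: Data) {
    asset.requestContentEditingInput(with: nil) { input, _ in
        guard let input = input else { return }
        let output = PHContentEditingOutput(contentEditingInput: input)
        output.adjustmentData = PHAdjustmentData(formatIdentifier: "com.example.sepia",
                                                formatVersion: "1.0",
                                                data: recipeData)
        // The full-size rendered image must be written to renderedContentURL.
        try? renderedJPEG.write(to: output.renderedContentURL)
        PHPhotoLibrary.shared().performChanges({
            let request = PHAssetChangeRequest(for: asset)
            request.contentEditingOutput = output
        }, completionHandler: nil)
    }
}
```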

Photos framework takes care of a lot of stuff for you

Assets in the Photos framework may not be on the device; they may be stored in the iCloud Photo Library. Changes you make to an asset are reflected across the iCloud Photo Library, and you get this for free. Similarly, if you were to make a photo editing extension, any changes made on one device would be updated across the user’s library. Again, this is with no faffing about trying to sync it yourself.

There is probably an awful lot more. No, scratch that, there is definitely an awful lot more but I’ve just realised in June I didn’t finish writing this post and I’m deep in another project and can’t even remember what project I was doing this for. So, I’m just going to post it.



This joint was penned by @elmarko