• Using PHPickerViewController Images in a Memory-Efficient Way

    PHPickerViewController is (in my opinion) one of the more exciting parts of iOS 14. We developers now have a fully-fledged photo picker that we can just use, rather than having to spend a bunch of our time building our own (much like SFSafariViewController saved developers from writing their own in-app web browsers). Similar to SFSafariViewController, it also has terrific privacy benefits: previously, in order to show the pictures to choose from in our custom UIs, we had to request access to all of the user’s photos, which isn’t something users or developers really wanted to contend with. PHPickerViewController works differently: iOS throws up the picker in a separate process, and the host app only ever sees the pictures the user explicitly gave it access to, and not a single one more. Much nicer!

    (Note we did/still do have UIImagePickerController, but many of us didn’t use it due to missing functionality, like selecting multiple photos, which PHPickerViewController handles brilliantly.)

    Apollo uses this API in iOS 14 to power its image uploader, so you can upload images directly into your comments or posts.

    How to Use

    The API is also really nice and simple to integrate. The only hitch I ran into is that the callback you get when the user finishes selecting photos hands you a bunch of PHPickerResult objects that wrap NSItemProvider objects, which seemed a little intimidating at first glance versus something “simpler” like a bunch of UIImage objects (but there’s good reason they don’t do the latter).

    Presenting the picker in the first place is easy:

    var configuration = PHPickerConfiguration()
    configuration.selectionLimit = 10
    configuration.filter = .images
    configuration.preferredAssetRepresentationMode = .current // Don't bother modifying how they're represented since we're just turning them into Data anyway
    
    let picker = PHPickerViewController(configuration: configuration)
    picker.delegate = self
    present(picker, animated: true, completion: nil)
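
    One quick note: the snippet above (and the ones below) assume the usual setup is in place, namely importing PhotosUI (plus UniformTypeIdentifiers for the UTType bits later on) and conforming to the delegate protocol. A minimal sketch, with MyViewController standing in as a placeholder name for whatever your view controller is actually called:

    import UIKit
    import PhotosUI
    import UniformTypeIdentifiers // for UTType, used further down

    class MyViewController: UIViewController, PHPickerViewControllerDelegate {
        // The presentation code above lives in here, e.g. inside a button action

        func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
            // Covered below
        }
    }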
    

    But acting on the user’s selections is where you can have some trouble:

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        /// What do I do here?! 👉🥺👈
    }
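
    One bit of housekeeping first: the picker doesn’t dismiss itself when the user finishes, so whatever else you end up doing in that delegate method, you’ll want to dismiss it yourself:

    func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true, completion: nil)

        // ...then get to work on the results, as covered below
    }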
    

    In reality though, it’s not too hard.

    What Not to Do

    My first swing at bat was… not great. If the user selected a bunch of photos to upload and the images were decently sized (say, straight off a modern iPhone camera) the memory footprint of the app could temporarily swell to multiple gigabytes. Yeah, with a g. Caused some crashing and user confusion, understandably, and was quite silly of me.

    At first my naive solution was something along the lines of (simplified):

    var images: [UIImage] = []
            
    for result in results {
        result.itemProvider.loadObject(ofClass: UIImage.self) { (object, error) in
            guard let image = object as? UIImage else { return }
    
            let newSize = CGSize(width: 2_000, height: 2_000)
            let resizedImage = UIGraphicsImageRenderer(size: newSize).image { (context) in
                image.draw(in: CGRect(origin: CGPoint.zero, size: newSize))
            }
    
            images.append(resizedImage)
        }
    }
    

    Long story short, decoding a potentially large image into a full-fledged UIImage, and especially then going and re-drawing it to resize it, is a very memory-expensive operation, and that cost is multiplied with each image. To put a rough number on it, a single 12 megapixel photo (4,032 × 3,024 pixels) decodes to about 4 bytes per pixel, or roughly 48 MB of bitmap data, before you even get to the resize pass, so a handful of selections adds up fast. Bad. Don’t do this. I know better. You know better.

    (If you’re curious for more information, Jordan Morgan has a great overview with his try! Swift NYC talk on The Life of an Image and there’s also an excellent WWDC session from 2018 called Image and Graphics Best Practices that goes even more in depth.)

    What You Should Do

    It’s a tiny bit longer because we have to dip down into Core Graphics, but don’t fret, it’s really not that bad. I’ll break it down.

    let dispatchQueue = DispatchQueue(label: "com.christianselig.Apollo.AlbumImageQueue")
    var selectedImageDatas = [Data?](repeating: nil, count: results.count) // Awkwardly named, sure
    var totalConversionsCompleted = 0
    
    for (index, result) in results.enumerated() {
        result.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.image.identifier) { (url, error) in
            guard let url = url else {
                dispatchQueue.sync { totalConversionsCompleted += 1 }
                return
            }
            
            let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
            
            guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else {
                dispatchQueue.sync { totalConversionsCompleted += 1 }
                return
            }
            
            let downsampleOptions = [
                kCGImageSourceCreateThumbnailFromImageAlways: true,
                kCGImageSourceCreateThumbnailWithTransform: true,
                kCGImageSourceThumbnailMaxPixelSize: 2_000,
            ] as CFDictionary
    
            guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, downsampleOptions) else {
                dispatchQueue.sync { totalConversionsCompleted += 1 }
                return
            }
    
            let data = NSMutableData()
            
            guard let imageDestination = CGImageDestinationCreateWithData(data, UTType.jpeg.identifier as CFString, 1, nil) else {
                dispatchQueue.sync { totalConversionsCompleted += 1 }
                return
            }
            
            // Don't compress PNGs, they're too pretty
            let isPNG: Bool = {
                guard let utType = cgImage.utType else { return false }
                return (utType as String) == UTType.png.identifier
            }()
    
            let destinationProperties = [
                kCGImageDestinationLossyCompressionQuality: isPNG ? 1.0 : 0.75
            ] as CFDictionary
    
            CGImageDestinationAddImage(imageDestination, cgImage, destinationProperties)
            CGImageDestinationFinalize(imageDestination)
            
            dispatchQueue.sync {
                selectedImageDatas[index] = data as Data
                totalConversionsCompleted += 1
            }
        }
    }
    

    Break it Down Now

    There’s a bit to unpack here, but I’ll try to hit everything.

    The core concept is that we’re no longer loading the full UIImage and/or drawing it into a context for each photo (which can be monstrously large, and is why PHPicker doesn’t just hand us UIImage objects). In my case I’m just uploading the Data and getting a resulting URL back, so I never need an actual image. But if you do, creating a UIImage from the much smaller CGImage is still far better than decoding the original.
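
    If you do need an image to put on screen rather than just Data to upload, a minimal sketch (reusing the downsampled cgImage from the code above, with a purely hypothetical imageView) would be something like:

    // Wraps the already-downsampled CGImage; the full-size original is never decoded
    let thumbnail = UIImage(cgImage: cgImage)

    DispatchQueue.main.async {
        imageView.image = thumbnail // imageView is a stand-in for wherever you display it
    }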

    Okay! So we start off with a queue, and somewhere to collect the data. loadFileRepresentation calls its completion handler on a background queue, and the docs don’t say whether those callbacks execute serially (in practice they do, but that could change), so we create a queue of our own to ensure we’re not writing to the array of Data from multiple threads at once. Also note that the array is pre-sized so we can preserve the order of the images; otherwise the order the user selected the photos in and the order they finish processing in may not line up 1:1. Lastly, we keep a separate counter so we know when we’re done.

    let dispatchQueue = DispatchQueue(label: "com.christianselig.Apollo.AlbumImageQueue")
    var selectedImageDatas = [Data?](repeating: nil, count: results.count) // Awkwardly named, sure
    var totalConversionsCompleted = 0
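
    The code above never actually acts on that counter, but the way I’d use it (a rough sketch, with a hypothetical handleCompletedImages(_:) function standing in for whatever comes next) is to check it inside the same dispatchQueue.sync block every time it’s incremented, including in the early-return paths:

    dispatchQueue.sync {
        selectedImageDatas[index] = data as Data
        totalConversionsCompleted += 1

        // Once every result is accounted for (the failure paths increment the counter
        // too), hand the ordered Data off to the next step
        if totalConversionsCompleted == results.count {
            DispatchQueue.main.async {
                handleCompletedImages(selectedImageDatas.compactMap { $0 }) // hypothetical
            }
        }
    }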
    

    Moving on to the main loop: instead of asking NSItemProvider to serve us up a potentially enormous UIImage, we approach more cautiously by requesting a URL to a copy of the image in the tmp directory (note that the system deletes that temporary file when the completion handler returns, so all of the work happens inside the handler). More freedom.

    result.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.image.identifier) { (url, error) in
    

    We then go on to create a CGImage, but with certain requirements around the image size so as to not create something larger than we need. These Core Graphics functions can seem a little intimidating, but between their names and the corresponding docs they paint a clear picture of what they’re doing.

    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else {
        dispatchQueue.sync { totalConversionsCompleted += 1 }
        return
    }
    
    let downsampleOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: 2_000,
    ] as CFDictionary
    
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, downsampleOptions) else {
        dispatchQueue.sync { totalConversionsCompleted += 1 }
        return
    }
    

    Lastly, we convert this into Data with a bit of compression (only if it’s not a PNG though, PNGs are typically screenshots and whatnot, and I personally don’t want to hurt the quality of those).

    let data = NSMutableData()
    
    guard let imageDestination = CGImageDestinationCreateWithData(data, UTType.jpeg.identifier as CFString, 1, nil) else {
        dispatchQueue.sync { totalConversionsCompleted += 1 }
        return
    }
    
    // Don't compress PNGs, they're too pretty
    let isPNG: Bool = {
        guard let utType = cgImage.utType else { return false }
        return (utType as String) == UTType.png.identifier
    }()
    
    let destinationProperties = [
        kCGImageDestinationLossyCompressionQuality: isPNG ? 1.0 : 0.75
    ] as CFDictionary
    
    CGImageDestinationAddImage(imageDestination, cgImage, destinationProperties)
    CGImageDestinationFinalize(imageDestination)
    

    Now we have much smaller, compressed Data objects kicking around, rather than our previously enormous UIImage objects, and we can POST those to an API endpoint for upload or do whatever you’d like! Thanks to everyone on Twitter who gave me pointers here as well. In the end this went from spiking in excess of 2 GB to a small blip of around 30 MB for a few seconds.
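
    The upload step itself is outside the scope of this post, but if you’re curious, a minimal sketch (assuming a purely hypothetical https://example.com/upload endpoint that accepts a raw JPEG body, not Apollo’s actual API) might look like:

    func upload(_ imageData: Data) {
        // Hypothetical endpoint, purely for illustration
        var request = URLRequest(url: URL(string: "https://example.com/upload")!)
        request.httpMethod = "POST"
        request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")

        URLSession.shared.uploadTask(with: request, from: imageData) { (data, response, error) in
            // Parse the returned URL or handle the error however your API dictates
        }.resume()
    }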

    Adopt this API! It’s great!

  • Apollo for Reddit 1.9

    Apollo 1.9’s a massive update to Apollo that’s taken months and months to complete, but I’m really happy with the result, and it brings together a ton of ideas from the community to make Apollo even nicer to use. The update includes a variety of features around crossposts, flair, new app icons, translation, and quality of life improvements. Thanks to everyone who writes in via email or via the ApolloApp subreddit, your suggestions for what you want to see in Apollo help immensely and really motivate me to keep making Apollo better and better.

    Without further ado, here are the changes included in this 1.9 update to Apollo:

    Crosspost Viewing

    Crossposting (taking an existing post and reposting it to a similar subreddit) has been a big part of Reddit for ages, but recently it became a full-fledged feature where you can see exactly which subreddit it came from, and quickly jump to the original post. Apollo now supports this fully, so you can see the interesting content of the post, but also quickly jump over to read the original discussion! Often it’s like getting two interesting discussions in one!

    Viewing a crosspost in Apollo

    Crossposting

    Similar to being able to view crossposts, you can also easily perform a crosspost if you want as well! Simply select the post you want to crosspost, write a title, select the subreddit to crosspost it to, and bam, you’re off to the races.

    performing a crosspost in Apollo

    Image Flair

    Flair is a little “tag” users can add to their usernames in a subreddit, and some subreddits even allow small images/icons to be added in addition to text, like the icon for your favorite sports team, or a character from your favorite TV show. Apollo now shows these beautifully!

    Viewing flair with images in Apollo

    Setting Your Flair

    In addition to being able to view the flair as discussed in the previous item, you can now set your own flair! Simply go to the subreddit of your choosing, and you can choose from a list of customizable flairs so you can add a little personality to your comments, showing which language you’re learning, your username in a video game the subreddit is about, your fitness goals, etc.

    Setting your flair in Apollo

    View Long Flair

    Some users set loooong flair, and as a result it can get cut off, which can be annoying when you’re trying to figure out what it says. Well, be annoyed no longer, for you can simply tap on the long flair to bring up a window that expands it fully!

    Viewing long flair in Apollo

    Find Posts with Same Flair

    If the subreddit lets users tag their posts with individual flairs (say, being able to tag whether your question is about a certain character, or a certain topic), you can now simply tap on that flair and Apollo will show you all the other posts in the subreddit that have been tagged with that same flair.

    Filtering posts with the same flair in Apollo

    5 (Yeah, Five!) New App Icons!

    This update has taken a ton of time to work on, so I fell slightly behind on the Ultra icons I wanted to include, but the upside is that there’s now a proper Icon Bonanza, with five new icons being included in this update. The first three are Ultra icons, all made by the same incredibly talented designer, Matthew Skiles, who I’ve been a fan of for a long time. I love how these turned out: we have our beloved Apollo mascot reimagined as an angel, a devil, as well as a zany pilot, all in gorgeous, colorful iconography. But those three icons aren’t all! Next up, we have a beautiful new Apollo icon representing the trans pride flag (originally created by Monica Helms), which came out really awesome and is a great addition. And last but not least, our incredible community designer, FutureIncident, makes his second appearance with the Japanese-inspired Apollo-san icon! I love this set of icons so much, it’s going to be really hard to choose.

    5 new app icons available in this Apollo update

    Easy Language Translation

    Reddit is home to a diverse set of communities that have a variety of fascinating conversations, but sometimes it’s tricky to understand what’s being said if the conversation is in a language you’re not familiar with. Heck, you might even have no idea what the language is! Now Apollo will be able to detect if the language of a comment or post is different than the language of your iOS device, and if so, offer to quickly translate it so you can understand the conversation! It is so handy, whether you’re following a fascinating conversation or even trying to learn a new language!

    Post/comment translation in Apollo

    Fast Subreddit Selector

    Whethering you’re trying to add a single subreddit to a filter, or adding multiple subreddits at a time to a multireddit, Apollo is now even faster at doing these tasks, with an auto-completing window that makes it super fast to search and add subreddits.

    Fast subreddit selector in Apollo

    Total Collapsed Comments & Remembering Collapsed Comments

    Two handy new additions to collapsing comments in Apollo. First, Apollo will show you at a glance how many comments are in a collapsed conversation, which can be super handy when skimming a comment thread. Second, if you collapse a bunch of comments and then come back to that same comment section later, Apollo will now remember which comments you had collapsed, and keep them collapsed for you!

    Total collapsed comments in Apollo

    New Settings, Filters Tweaks, Bug Fixes, and More!

    A bunch of awesome new settings have been added to Apollo, like being able to disable the auto-looping of videos with audio, or being able to make it so translation options always show up. Filtering is also even more powerful, with your filters being able to target flair and links as well (in addition to the title), and fixes a few filtering bugs. Apollo also now shows videos from Reddit’s experimental ‘RPAN’ service, which is essentially a kind of live stream post that you can now view within Apollo. Of course there’s a bunch of other small bug fixes around Apollo, from the occasional account accidentally signing out, to video bugs, to Apollo quitting in the background when it shouldn’t, as well as a bunch of other small tweaks across the app to improve your quality of life while browsing!

    Thank You!

    I really hope you enjoy the update, thank you for using Apollo! More great things to come!

  • The Case for Getting Rid of TestFlight Review

    I tweeted today about how I think TestFlight review should become a thing of the past and many developers seemed to agree, but some had questions so I wanted to expand on my thoughts a little.

    TestFlight’s awesome. But like App Store submissions, TestFlight betas also require a review by Apple. At first blush, such a review sounds sensical. TestFlight can distribute apps to up to 10,000 users. If that were to run completely unchecked you could have potentially mini-App Stores running around with sketchy apps being distributed to lots of people.

    But the point I’ll try to make in this article is that the current system TestFlight employs doesn’t do much to prevent this, and further creates a lot of friction for legitimate developers.

    The Review Process

    For TestFlight, when you submit a new version number, it requires a new review. But new build numbers do not (build numbers are like a secondary ID for a version as it goes through development). For instance, I could push a new version of Apollo to TestFlight, version 1.8 (build number 50) and it would need review, but builds 51, 52, 53, etc. of the same version do not require any review.

    The Problem

    Do you see the issue here? There’s not really any oversight into what you can change in those new builds. You could completely change your app into something different, upload it under a different build number, and so long as the previous version was approved and you don’t change the version number, you could send the new one out to thousands of people.

    Someone looking to distribute, say a console emulator (that Apple doesn’t allow in the App Store), could upload their app as a fun turtle themed calculator app (TurtleCalc™) and get approved on TestFlight, only to update it into that emulator for build 2 and send it out to thousands of people.

    As a Developer

    On the flip side, for an actual developer with an app on the App Store, it causes a ton of friction, because the other rule of TestFlight is that once a version goes live on the App Store, you can’t push any new builds to TestFlight without bumping to a new version and starting the review process again.

    So if you find a bug in the public version of your app and want to beta test the fix, you have to wait a day or two for it to be reviewed by Apple before it can even go into beta testing. A three-lines-of-code bug fix requires re-review; meanwhile, if you’re a bad actor and you just leave the app in TestFlight without ever pushing it to the App Store, you can update it endlessly without any review whatsoever.

    That means as a developer you’re stuck in this gamble of “Should I just release it to the App Store without any testing? It’s just a bug fix after all, what could go wrong?” versus “Should I let it keep crashing and wait for the TestFlight review to occur so I can test this new build first, even if it means crashing for days more?”

    In a perfect world, you could push that fix out to testers immediately, validate the fix, then submit it to the App Store.

    As a result you have this system that A) doesn’t seem to do anything to stop people from submitting nefarious updates but B) introduces a ton of friction for legitimate developers.

    “It Serves as an Early Review for the App Store Before Continued Development”

    Some argue that it lets you “test the waters” with an app or an update before submitting it to the App Store at large. For instance you have an idea that you’re not sure will get through app review, so you build a quick version of the app, and submit it to TestFlight, and the review will let you know if Apple will approve it.

    Unfortunately it doesn’t work like that. Getting through TestFlight review has no bearing on getting through the eventual App Store review. I’ve had builds go through TestFlight review, get the stamp of approval, test it in TestFlight for months, and then when I ultimately submit it the update gets rejected.

    TestFlight review is not at all an accurate way to gauge what the App Store reviewers will think; it’s far more lax.

    It Often Requires Double Review

    Even more confusingly, if I decide to take the gamble and just release the bug fix to the App Store and hope all goes well, it’ll go through a quick review, then it will go live on the App Store.

    But if I want the TestFlight users to use that same version that just got approved, they straight up can’t. Even though it went through the stricter public App Store review, the exact same build has to be reviewed separately for TestFlight. This adds a confusing delay for testers (not to mention extra work for Apple) and is very weird.

    TestFlight Review Takes Longer than App Store Review

    Despite being the more lax review process (as shown above), it takes longer. This kinda makes sense, as you would hope the majority of staff would be focused on the public App Store review, which affects the most users, but it feels bizarre to submit an app to the App Store and TestFlight at the same time (because double review) and have the App Store version go out the same day while the TestFlight version takes a day or two.

    This greatly disincentivizes testing builds when the process to actually get them out takes so long.

    There’s Already Workarounds

    A lot of developers, aware of the above constraints, employ strategies for getting around this process almost completely.

    • As soon as you submit the version to the App Store, you can immediately submit the same version plus one (so 1.8.3 on the App Store, 1.8.4 on TestFlight) even without any changes (just a bumped version number), get it through review, and then the next time you need to test a beta build you have an approved version you can start shoveling new builds onto.
    • An even more clever method some employ is to just have an astronomically high version number only for TestFlight. So if your App Store version is 1.8, your TestFlight version is 1000.0. That way your TestFlight build is always ahead of the App Store version, and once that version gets approved the first time, you can indefinitely add new builds onto it. A lot of developers do this, and it’s clever, but I personally fear angering the App Store folks.

    You might be asking, “Okay… why not just do one of those methods then?” And you totally can, but in neither case is the app actually being reviewed: in the first it’s an identical version that’s tweaked “secretly” later, and in the second it’s a single version that gets tweaked forever. This effectively shows how little the review process actually contributes.

    Getting Rid of TestFlight Review Could Speed up Normal Review

    If TestFlight review were to go away for the reasons outlined above, all the awesome folks on that team could be reallocated to the “normal” App Store review team, which could make for an even faster review process. The review process is so much better now than it has been in the past, typically under a day (it used to be over a week!), but can you imagine it being the norm to submit a build and have it available within a few hours? That would be fantastic!

    Solution

    I think just getting rid of it completely is fair. As shown, the current process does next to nothing to prevent people from distributing questionable builds, and instead is just a pain for legitimate developers.

    Is it possible that behind the scenes Apple re-reviews builds and might yank them if they find out they break their rules, say a game console app that’s been getting new builds but no new reviews from Apple for a year? Totally! And I think that’s the system they should simply extend everywhere.

    Do away with the review system altogether, and have a random review process that occurs after the fact, every so often, perhaps transparently and based on the number of testers in the beta (a beta with 8,000 users is more dangerous than one with three people).

    So you submit your version, it immediately goes out to all testers, and then a little while after Apple might flag it for random review. If it passes, it’s completely transparent to you. If it gets rejected, it’ll be pulled.

    End

    TestFlight’s great and I love it, but decreasing friction in beta testing would be a massive help.

  • Announcing Apollo: a new Reddit app for iPhone

    I’m really excited to unveil a project I’ve been working on for the last year or so. It’s called Apollo and it’s a new Reddit app for iPhone.

    I’ve been a Reddit user for about four years now, and the site is a constant source of interesting discussion, hilarity and news for me every day. I’ve never been completely happy with the current Reddit apps out there today, so I set out to scratch an itch and build the best Reddit experience on the iPhone that I could. And I’m really proud of the result.

    Apollo went through a really long design phase, and I sweated every detail. Last spring I was lucky enough to get an offer to work at Apple as an intern for the summer, which meant no time for developing apps for a few months. But coming out of that summer, I had learned so much from so many smart people, had a really cool new language to experiment with, and my motivation to build something incredible had never been higher.

    Since then I’ve been working super hard to build this app, and today I’m finally at a stage where I can comfortably announce it. It’s not available yet, and won’t be for a little while yet, but it’s getting close and I’d love to have some input. (I made a Reddit thread here) I’ll also be launching a public beta in the coming weeks, so keep an eye out for that if you want to get an early look at what’s to come.

    Apollo's frontpage and inbox

    I really put an emphasis on making Apollo feel at home on the iPhone with a super comfortable browsing experience. It has beautiful, large images, smooth gestures, really nicely organized comments and I baked in a lot of the great features that iOS 8 brought about. There’s a ton more as well. I also made sure that it took advantage of as many of Reddit’s great features as possible.

    From a technical standpoint, it’s built for the most part in Swift. I’ve been really happy with the language so far (bar a few issues) and it was awesome to build an app with it.

    I’ve made a page where you can find out more about it, and if you’d like sign up to be notified when it’s released: https://apolloapp.io

    I’d love to hear input. You can reach me on Twitter, post in the Reddit thread or email me if you’d like. I’ll also be posting updates on my dribbble page.

    Can’t wait to share more in the coming weeks!