On Camera Comparisons

I’ve done quite a few camera comparisons over the past couple of years, and they continue to be one of the more popular things on the site. I find them fascinating because they can lead to genuinely surprising results…in certain situations. Let me explain.

So whenever I share a photo comparison and don’t tell people which photo came from which camera, the results tend to be very split. People vote for the iPhone shot in one comparison and then choose the Pixel one in the next. Or they are die-hard iPhone fans, but choose the Pixel photo in every single case (and vice versa). The results are unexpected at times, and always fascinating.

But when I do a comparison and I do tell people which is which, then people who I know to be big Apple/Google fans will always vote for their preferred brand, even if they had chosen the other company’s photos in a blind comparison.

Now not everyone does this, but there are certainly people who I notice doing it. Also, in aggregate, I see different results when I tell people which camera took each photo than when I simply let people pick the photo they like better. What this really comes down to is that photography is very subjective, and when we’re looking at the best cameras on the market, it’s just as much about feelings as it is about quality.

How to Record and Edit a Podcast with Ferrite for iOS


I’ve been recording, editing, and posting my podcast straight from iOS for about a year now. The magical elements that let me do this are:

The video should give you a general idea for how the app works, but it does not dive into every single element of the app because frankly, I don’t use a ton of the features available. Hopefully it gets you what you need to get started with the app!

One Slick Android Feature

Much like iOS, Android has an app store that lets people easily download apps for their devices. The differences between the Play Store and App Store are more in execution than concept, and these stores are how the vast majority of people get apps.

But Android also has the ability to simply install apps like you can on a Mac or PC. If I manually turn on the feature, I can install .APK files from anywhere I’d like. This has been a feature of Android since the beginning, and many people likely did this for the first time recently with Fortnite, which is not being distributed via the Play Store at all.

Why this is risky

For one, this means people can install software that does not meet the standards Google has set for what should be allowed on Android. This can be good, but it can also lead to malicious software getting onto your device and doing things Google’s guidelines would never allow. This risk is inherent to any software distribution, and the curated app stores of the modern era are guards against this danger.

This also means that piracy is more of an issue on Android, as people can more easily get free versions of paid apps if they would like. Developers can mitigate this, but it’s additional work and clearly does not stop this from being a thing1.

Why this is excellent

But then there’s the best part of this functionality: installing apps meant for one device (usually Pixels) and using them on whatever phone you happen to own. The most recent example of this is the new Google Camera app, which introduced Night Sight. I have a OnePlus 6, which has a pretty good camera, but Google does not intend to give this camera app to me since I didn’t buy their phone. Fair enough, but the Android community got their hands on the APK for the Google Camera app with Night Sight and modified it to run on other phones, including mine.

And just like that, I’m using Google’s camera app with Night Sight on my OnePlus 6. And the best part: it works great! Google is using pretty standard hardware in their camera, so the magic of Night Sight is all done in software. So when I try to use Night Sight, I get results like this:

👇 Regular Mode

👇 Night Sight

The Night Sight feature works great and is proving to be a nice feature to have in a pinch. And it’s all possible because Android lets me go around the official app store if I’d like.

I don’t know if Apple will ever allow this on iOS, but it is certainly one of the things that I enjoy about Android whenever I am experimenting with the platform.

  1. No links from me, since I do not want to encourage this behavior.

Comparing HDR RAW capture on Lightroom, Halide, and Obscura

I love to shoot photos in RAW on my iPhone. The stock camera app does some excellent magic to get incredible photos, but in some cases I really just want to be old school and get a ton of image data and process things how I want.

One of the situations where I appreciate RAW the most is when taking photos with a large range of brightness in the shot. When shooting in the stock camera app, iOS will take a number of photos at different exposures and stitch them together to get an image it thinks looks good, and normally it does. But when you shoot RAW, you get a lot more data in the bright and dark sections of an image, so you can boost or lower them without introducing grain or distortion to the image. Basically, you have control, not the computer. Maybe I’m old school, but I like that sometimes.

Below are 4 camera apps (stock Camera, Adobe Lightroom, Halide, and Obscura 2) taking the same photo. It’s early morning, my living room has no lights on, and it’s bright outside. Let’s see how they all do.

Apple Camera.app

This was a big disappointment, as the iPhone 8 Plus really took a bad picture here. Even with HDR, everything outside the window is blown out. Meanwhile, the interior is quite grainy.

Again, the disadvantage of shooting in this format is that I’m basically stuck with what the camera spits out. It throws away a lot of image data to keep the file size smaller, so only minor edits can be made if I don’t like the decisions Apple’s app made.

So let’s shoot this in RAW and see what happens.

Adobe Lightroom

Lightroom is my go-to RAW camera app for iOS, in part because I pay for Creative Cloud and want to get my money’s worth, but more so because I think it gets the best photos of any app I’ve tried.

The difference between this image and what the stock camera app produced is night and day. This is a much more satisfying shot with little noise, good color, properly exposed highlights, and zero artifacts.

Also, and this will be true for the other 2 camera apps too, because this is RAW I can further modify this shot to be brighter, darker, have more/less contrast, and a whole host of other changes without degrading image quality. This simply isn’t as possible with a JPEG/HEIC image.


Halide

Similar to Lightroom, this image looks better than what the stock camera app produced, but not by as much. I got a lot of grain in the dark areas and wasn’t able to reduce the noise enough without also blurring out the whole photo. So better, but not quite as nice as Lightroom.

Obscura 2

This one was shockingly similar to the Halide image, and has basically the same result: better than Apple’s app, but not as clear as Lightroom.

Surprise! Pixel 2

What camera comparison on BirchTree would be complete without including the Pixel 2? I’m actually quite happy with this shot and think it’s a marked improvement over the iPhone’s stock camera. This is also using the Pixel’s stock camera app and this really shows off Google’s excellent HDR processing.

The outside is a little more blown out than I’d like, but it’s way better than the iPhone. Meanwhile, the interior is dark, but has less noise than you’d expect. This also shows off the notably colder color temperature the Pixel defaults to compared to the iPhone. iPhone shots tend to be warmer overall, while Pixel shots look cooler1.


The big takeaway for me is that I plan on happily taking RAW photos for the foreseeable future. The flexibility I get in editing images, especially with tough HDR shots, is invaluable and is something I don’t want to lose.

For an even more explicit demo of how much data is lost when shooting in a non-RAW format, check out the video in this tweet.


  1. Temperature-wise, at least. 

Using an iPad for “Real Work.” No, Not Blogging, my Actual 9-5 Job.

Can I let you in on a little secret? When it comes to “iPads can do real work,” I’m putting my money where my mouth is. I have been using an iPad for 99% of my website and design work at home (including the redesign of this website a few months ago), but I have been working on a Dell laptop at my 9-5 job. In the past week, that’s starting to change.

We have a 2017 iPad at the office that we use for testing1, and I decided a week ago that since I’m the only person who uses this device, I should just load my email, calendar, Slack, and a few other work apps onto it. So for the last few days I’ve been using the iPad for most things.

The Good

Meetings are a million times better with the iPad than my Windows laptop. Instead of using OneNote or Evernote for my notes, I’m using Apple’s own Notes app and it’s going much better for me. Maybe it’s what I’m used to, but I feel like Notes introduces less friction than any of the apps I’ve tried on Windows. I would love to have a new iPad with Pencil support so I could draw inline in my notes, but this still works.

The reduced bulk is a game-changer. Part of this is because I have a 15” laptop, which is pretty substantial, but going down to a 10” screen and like 1/3 the weight is so liberating. As I move from meeting to meeting throughout the day, it’s great to carry something smaller.

Most of my workflow is just as good as it is on the desktop, and some elements are even better. Jira, for example, is a slow-as-molasses experience on even my quad core Kaby Lake laptop, but the in-app experience on iOS is smooth and actually kind of enjoyable sometimes. Slack is also far better on iOS than the desktop, and so is most of my web browsing. Likewise, Office documents are just as easy to work with on iOS, and I even SSH into our dev server many times throughout the day and it all works just as well as it does on the desktop.

Email is more manageable for me. We use Gmail at work and I have to use it in the browser because I can’t find an email app for Windows that is both secure and worth a damn. On iOS I’m using the Gmail app, and it’s a better experience than the browser. I get about 50-100 emails per day and I feel like I have a better grasp on them on iOS.

Notifications crush Windows, which means I get the information I need right when I need it. Slack notifications are essentially broken on Windows (for me and everyone I know), and since more things are happening in native apps, I am able to control exactly what notifications hit me. Windows has notifications too, but the control just isn’t there.

Speed is an unexpected win as well. Despite being a $329 tablet, iOS feels faster for most tasks than my nearly $1,000 Windows laptop.

Unlocking is way easier since I have Touch ID and not a keyed password like I do on Windows. Like on all iOS devices (pre-iPhone X) I just press the home button for a moment and I’m logged in and ready to work.

The Bad

As much as I wish they were, not all the apps I need are on iOS. Photoshop is the big one right now, as it’s something I spend an hour or two in every day2.

Some apps are worse on iOS than Windows, which just makes me sad. Google Sheets is a good example, as it is much slower to use with a keyboard than it is on Windows. This is really the only service I can think of that’s notably worse than it is on the desktop, which is good, but it’s still a step in the wrong direction.

10” is a smaller canvas than is ideal. My Windows setup is a 15” laptop screen with two 27” monitors attached. Dropping down to 10” simply changes how I work. I’m pretty sure the 12.9” iPad Pro would make this better, but sometimes it’s nice to just be able to see things on a larger canvas, and outside of AirPlaying to a Mac or Apple TV, there’s no way to do this3.

I miss my clipboard manager. Only being able to copy and paste one thing at a time is a pain for my workflow. Sometimes I copy 3-5 things in succession and then paste them somewhere else. Or maybe I copied something yesterday and want to get it back; I can do this on Windows and macOS, but not iOS.


So far the experience has been very positive overall, with only a few minor issues. This is going better than I expected, and I will be sure to report back once I’ve been doing this for a few more weeks. Maybe there are problems I just haven’t run into yet. There also could be more benefits that I’m not noticing yet, but will become clear in time.

  1. Not a pro, mind you, just a $329 one with a simple Logitech keyboard case. 
  2. Which is why I was very happy to hear that Adobe is bringing a full version of Photoshop to iOS next year. 
  3. Not that AirPlaying is even the same thing. 

My Siri Shortcuts (in iOS 12 beta 1)

I think the most exciting thing Apple announced at WWDC last week was Siri Shortcuts. It looks to me like a brand new way to interact with our iOS devices and I am way into it. It’s as if Workflow and Launch Center Pro had a baby and Apple built that into the operating system. Apple is also aiming to make Shortcuts more than just the sum of its parts by adding some AI to figure out when you want to run these shortcuts.

In the first iOS 12 beta, Shortcuts isn’t really implemented, but there are a few apps (mostly Apple apps) that will appear in the Siri settings already. This only scratches the surface of what will be possible, but I’m trying out a few shortcuts already and the benefit is pretty amazing.

Being able to say “scale image” in the middle of writing a blog post for BirchTree, select an image to scale to 1080p, and paste it straight into my blog post is amazing.

Saying “show me my mail” opens Mail to my universal inbox, no matter where I left off last time I was using the app.

Saying “show me my stats” shows me BirchTree’s daily stats, saving me a few taps in Safari.

There’s not a ton here yet, but the first few simple shortcuts I have been able to make have made iOS feel more personal than ever. I’m really excited to see what else is possible when iOS 12 actually comes out this fall and app developers have built in the hooks needed to make more possible with Shortcuts.

Understanding Shortcuts in iOS 12

I’m seeing a bunch of people saying Shortcuts are a power user-only feature. “No normal person will ever use these!” In their WWDC keynote, I think Apple leaned too hard into what Shortcuts can do for power users, and people lost the message that these are something users will benefit from even if they do zero work to set them up. Let me explain.

Example App

First up, let’s take a look at Tweetbot, a Twitter app with a few main functions:

So when you launch Tweetbot, there are a few actions you can take: send a new tweet, tweet the last photo on your camera roll, or show you mentions (or activity). There’s more the app can do, but let’s focus on these 3 actions for now.

If I’m the app’s developer, I can assign “intents” to these actions. This is a code change only, and requires nothing from the user. Once this intent “tag” (for lack of a better word) is applied to the action of composing a tweet with the most recent photo attached, iOS 12 will keep track of how often, when, and where you tend to do this.
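In iOS 12 terms, the donation side of this could look like the following Swift sketch. Everything here (the activity type, title, and identifier) is invented for illustration; this is not Tweetbot’s actual code.

```swift
import UIKit
import Intents

// Hypothetical donation for a "tweet last photo" action.
// Call this each time the user actually performs the action.
func donatePostLastPhoto(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.tweetbot.post-last-photo")
    activity.title = "Tweet my last photo"
    activity.isEligibleForSearch = true       // required alongside prediction eligibility
    activity.isEligibleForPrediction = true   // lets iOS suggest this action proactively
    activity.persistentIdentifier = "post-last-photo"  // so donations can be deleted later

    // Assigning the activity to a view controller keeps it alive and
    // makes it "current," which is what registers the donation with iOS.
    viewController.userActivity = activity
}
```

Each donation is one data point; iOS watches when and where these pile up to decide when to surface the suggestion.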

Basic Shortcuts (zero user investment)

If you tend to use the “post last photo” Tweetbot feature around noon every day as you share your lunch on Twitter, iOS will start to show you a passive notification to do just that. This will either appear on your lock screen or when you pull down on the home screen to search. Here are some basic ones already working in the first iOS 12 beta:

For a lot of people, this will be how they interact with Shortcuts. Apps will just tag their functions as intents and iOS will try to show you what actions you want to take before you even have to ask. Again, all of this will happen without the user needing to do anything; they will just find they can do the things they do all the time with fewer taps.

Siri Shortcuts (mild user investment)

The next step up in complexity is making Siri trigger these intents inside apps. iOS 12 beta users can already test this a little bit by going to the Settings app and choosing “Siri & Search.” In here, you can set up Siri to trigger Shortcuts for Apple’s own apps. You take an action (intent) you’ve done recently, record a phrase1, and use that phrase to trigger that action/intent whenever you want.

In our example of Tweetbot tweeting my last photo, I could set something up that lets me say “Hey Siri, tweet my last photo” and it would open Tweetbot, start a new tweet, and automatically attach the most recent photo from my photo library. Then I don’t need iOS to guess that I want to do this at a specific time, I can just do it whenever I want.

And if going to the Settings app isn’t convenient enough, apps can build buttons into their interfaces that let you set up Siri Shortcuts for things as you’re doing them. Here’s a mockup for Tweetbot tweeting the last photo:

It’s not a great mockup, but the added Siri button above the keyboard could let the user set up a Siri Shortcut phrase right from inside the app they’re already using.
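Apple actually ships a stock view controller in the IntentsUI framework for this exact “Add to Siri” flow, so a button like this could present it with just a few lines. This Swift sketch uses made-up identifiers for the hypothetical Tweetbot action:

```swift
import UIKit
import IntentsUI

// Presents Apple's stock "Add to Siri" sheet for a hypothetical
// "tweet last photo" action. Activity type and title are placeholders.
func presentAddToSiri(from viewController: UIViewController & INUIAddVoiceShortcutViewControllerDelegate) {
    let activity = NSUserActivity(activityType: "com.example.tweetbot.post-last-photo")
    activity.title = "Tweet my last photo"

    let addShortcutVC = INUIAddVoiceShortcutViewController(shortcut: INShortcut(userActivity: activity))
    addShortcutVC.delegate = viewController  // delegate receives the recorded phrase (or a cancellation)
    viewController.present(addShortcutVC, animated: true)
}
```

The sheet handles recording the user’s custom phrase itself, so the app only has to say which action the phrase should trigger.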

Shortcuts App (maximum user investment)

And then there is the Shortcuts app, which is the rebirth of the Workflow app that us nerds know and love. This new app is essentially an interface for creating multi-step Shortcuts of your own. Maybe I like to include some stock text in the tweet when I post that photo, so I could add some variables to generate that text and have Tweetbot include it in the tweet automatically. Maybe I also want to text someone, turn on my desk lamp, or do some other madness at the same time…Shortcuts lets me combine these elements, each of which could be a Shortcut on its own, into one action that runs all at once.

We saw this on stage when they set up a Shortcut that would check their drive time to get home, text their spouse they were coming home, open navigation, start a podcast, turn on their lights, and set the thermostat to 70° all by saying “Hey Siri, I’m going home.” It was slick, but definitely something that will take some work to get right. Most people will not do this, but many will and they’ll get some serious benefit from it. I personally find it difficult to use non-iOS devices because Workflow has enabled me to do so much, but I know I’m a niche user.

The good news is that the library of automations that we have in Workflow today appears to be living on in the official Shortcuts app, and normal people will be able to use Shortcuts created by other users.


Shortcuts are a big part of iOS 12 and watchOS 5. Instead of opening apps and then doing things, Apple is trying to make it so people can more easily do the things they want to do without having to find an app icon first. Whether it’s seeing a shortcut you didn’t even set up yourself, asking Siri to do something you do all the time, or setting up complex automations, Shortcuts will be a big part of iOS and watchOS starting this fall for everybody.

If iOS is not good at predicting what users want to do, then yeah, Shortcuts will lean more heavily towards the power user who wants to do more with their devices. But if they do it right, this is going to be a reasonably large shift in how people use their devices going forward. Changing iOS from an app-centric platform to an action-centric one is a big change, but it could pay dividends for them.

Side note: consider the supposed home screen changes we’re expecting in iOS 13 next year. I would put a lot of money on Shortcuts being a big piece of that redesign as well.

For more information about Shortcuts directly from Apple, check out these developer sessions from WWDC18:

  1. Or even a single word. 

Grading Apple on My watchOS 5 Suggestions

Apple doesn’t answer to me, so this is of no concern to them, but I wanted to take a look back at my watchOS requests to see how close Apple came to meeting them.

Siri Watch Face

I asked for: Let third party devs integrate to it.

Apple delivered: ✅ They did exactly this!

Watch Faces in General

I asked for: Always on faces and third party dev support.

Apple delivered: ❌ They did nothing at all here.


Workouts

I asked for: Automatic workout detection, more winter activities, more badges, rest days, and sleep tracking.

Apple delivered: 🤷🏻‍♂️ They added automatic workout detection and some activities that weren’t shoveling, but time will tell if they do more badges. They definitely didn’t do any rest days for streaks or sleep tracking though.


Podcasts

I asked for: A Podcast app.

Apple delivered: ✅ A Podcast app. I need to test this to see how good it is, but they finally did it!

“Hey Siri”

I asked for: Don’t make me raise my wrist.

Apple delivered: 🤷🏻‍♂️ Don’t make me say “Hey Siri.”

Improve Dev Tools

I asked for: Let developers use better tools to make apps.

Apple delivered: 🤷🏻‍♂️ Hard to tell from the keynote, so the sessions should have better info in this regard.

Breathe app

I asked for: Make any change to this app.

Apple delivered: ❌ No update.

iPhone Battery

I asked for: Show me my phone’s battery.

Apple delivered: ❌ They will not show this.

Apple News

I asked for: Let me read full articles.

Apple delivered: ❌ No.

Theater Mode

I asked for: Also set me to DND.

Apple delivered: ❌ Nope.

The Dock

I asked for: Make this show more info at once.

Apple delivered: ❌ No.


Notification Sounds

I asked for: Third parties can set their own notification sounds.

Apple delivered: ❌ No.

Apple did not do well here, even though I think the watchOS 5 update is pretty decent. Podcasts and third parties on the Siri watch face are great additions, and the workout enhancements somewhat line up with my wants, but everything else was just “too bad, so sad.”

Grading Apple on My iOS 12 Suggestions

Apple doesn’t answer to me, so this is of no concern to them, but I wanted to take a look back at my iOS requests to see how close Apple came to meeting them.


Performance

I asked for: Performance improvements, especially on older devices.

Apple delivered: ✅ Exactly this! iOS 12 is notably faster on all devices and was the first new feature they talked about on stage.


Battery Life

I asked for: Better battery life.

Apple delivered: ❌ No mention of battery life at all.


Notifications

I asked for: Grouped notifications, more compact display, and better controls from the notification itself.

Apple delivered: ✅ Basically exactly this! They grouped notifications on a per-app basis and they allowed for some settings to be changed (including blocking apps entirely) straight from the notification screen. I have my concerns with the specifics of this implementation, but it’s a great move.

Work and Home Apps

I asked for: The ability to mute apps when they are not relevant to me.

Apple delivered: ❌ Nope, all apps are treated the same.


Siri

I asked for: Siri in the cloud, so it works the same on all my devices.

Apple delivered: ❌ Not this at all. They enhanced Siri a bit, but not like this.

iCloud and Backups

I asked for:  Better iCloud behavior when it comes to device backups.

Apple delivered: ❌ You’ll need to upgrade your storage to back up a single iPhone and you’ll like it.


Files

I asked for: More power, especially when it comes to external storage.

Apple delivered:  ❌ Literally not a single mention of Files.app.

Picture-in-Picture on the iPhone

I asked for: Do it.

Apple delivered:  ❌ They didn’t.


Messages

I asked for: Better suggestions and RCS support.

Apple delivered:  ❌ They did neither. I even checked the expanded release notes to see if RCS was there, but to no avail.


OLED Support

I asked for: Support for OLED features such as always on screens and dark UIs.

Apple delivered: ❌ Nope, nothing. This was a big letdown for me since they could totally do this and everyone would be happy, but they put their energy elsewhere this year.

Stay tuned for a piece on how they did compared to my watchOS 5 proposal.

Work Apps in iOS 12

I haven’t mentioned this in any of my notification articles yet, and that’s a damn shame because I think this would be a great feature. Without further ado, here’s what I want.

Apps should be able to be marked as “work” apps. This would simply mean they are apps that are useful to you when you are at work, but either useless or annoying when you are out of the office. As a real world example, I have Jira notifications turned on so that I can keep a pulse on what is happening with my board at work. This is great when I’m in the office and keeping track of things, but it was annoying as hell as I was driving down the interstate today and these notifications came pouring in even though I didn’t want to think about work.

The current solution to this problem is to either go to the Settings app and turn off notifications, only to turn them back on again tomorrow morning when I’m back at the office. That’s a pain and not worth the effort. The other option is to turn on Do Not Disturb, but that’s not ideal either because I want other notifications to come through, just not my work ones.

By marking apps as work apps, it would mean the phone would only give me these notifications when I’m working, and not when I’m out of work mode.

Because work is different for everyone, I’d suggest that once an app is tagged as a “work app,” the user would choose what it means for them to be at work. They could say “only show me this app’s notifications when I’m at this location” or “only show these notifications on weekdays between 9am and 5pm.” Or even a combo of the two: “only show these if it’s between 9am and 5pm, and I’m at work.”
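No such API exists, so this is purely hypothetical, but the decision logic for a rule like that is simple enough to sketch in a few lines of Swift (all names here are invented):

```swift
import Foundation

// Hypothetical "work app" rule: not a real iOS API, just a sketch of the idea.
struct WorkAppRule {
    var weekdaysOnly = true
    var startHour = 9              // 9am
    var endHour = 17               // 5pm
    var requiredLocation: String?  // e.g. "Office"; a real system would use a geofence

    // Returns true if a notification from this work app should be shown right now.
    func shouldDeliverNotification(at date: Date,
                                   currentLocation: String?,
                                   calendar: Calendar = .current) -> Bool {
        let weekday = calendar.component(.weekday, from: date)
        if weekdaysOnly && (weekday == 1 || weekday == 7) {  // 1 = Sunday, 7 = Saturday
            return false
        }
        let hour = calendar.component(.hour, from: date)
        guard (startHour..<endHour).contains(hour) else { return false }
        if let required = requiredLocation, required != currentLocation {
            return false
        }
        return true
    }
}
```

A real implementation would hook into the system’s notification pipeline and match location with a geofence rather than a string, but the core check is no more complicated than this.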

I’m totally open to whatever UI works best for this, but being able to quarantine my work apps on my phone a little bit would be hugely helpful for me, and I bet a lot of other people who are trying to improve their work-life balance.