
Purposely Breaking the iPhone 7 Plus's Portrait Mode

Posted by Matt Birchler
– 3 min read

Apple shipped the iOS 10.1 beta this week with its much anticipated "portrait mode" on the iPhone 7 Plus. Lots of people are sharing the beautiful pictures this feature can create. Here are a few of my favorites:

https://twitter.com/oliverames/status/779689241999802368

https://twitter.com/imyke/status/779484838910910464

https://twitter.com/iamkory/status/779482406738857984

So I wondered what the limits of this feature were, since Apple seems to have designed it for shots like the ones above. What if you took shots it wasn't really designed for? To find out, I took a few shots during my run this morning. I spent very little time composing anything and haven't done any post-processing; these are exactly as they came out of the camera app.

Side note: I don't know why literally every time I get a new phone the weather turns to shit. It's a cloudy, rainy day here in Illinois, so none of these shots are particularly photogenic.

First up is a simple street sign. This doesn't look terrible, but it does look a little artificial. It got the depth right, but the straight lines make the halo effect around the sign extremely pronounced. This shot is definitely worse than the standard shot you would get without the blur.

Next up were some flowers, which turned out pretty well. I took this one in haste and could have focused better, but it shows that the depth effect doesn't just apply one blanket blur to everything in the background. You can clearly see different levels of blur on different parts of the plant.

The orange flowers in front are the subject, while the other orange flowers are slightly blurred, and the purple ones in back are considerably more blurred.
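If you're curious what "levels of blur" means in practice, here's a rough sketch in Python of the general idea: a per-pixel depth map drives progressively stronger blur the farther something is from the subject. This is purely my guess at the concept, not Apple's actual pipeline, and the depth values and scene here are made up for illustration.

```python
# Toy sketch of depth-dependent blur (an assumption about the general idea,
# not Apple's actual pipeline): blend progressively blurrier copies of the
# image based on a per-pixel depth map.
import numpy as np
from scipy.ndimage import gaussian_filter

def fake_portrait_blur(image, depth, max_sigma=8.0, levels=4):
    """image: (H, W, 3) floats in [0, 1]; depth: (H, W), 0 = in focus, 1 = farthest."""
    # Bucket every pixel into one of `levels` depth bands.
    bands = np.digitize(depth, np.linspace(0.0, 1.0, levels + 1)[1:-1])
    result = np.empty_like(image)
    for band in range(levels):
        # Farther band -> stronger Gaussian blur.
        sigma = max_sigma * band / max(levels - 1, 1)
        blurred = image if sigma == 0 else np.stack(
            [gaussian_filter(image[..., c], sigma) for c in range(3)], axis=-1
        )
        mask = bands == band
        result[mask] = blurred[mask]
    return result

# Hypothetical scene: sharp subject on the left, mid-ground strip, background strip.
img = np.random.rand(120, 160, 3)
depth = np.zeros((120, 160))
depth[:, 60:110] = 0.5   # mid-ground: some blur
depth[:, 110:] = 1.0     # background: most blur
out = fake_portrait_blur(img, depth)
```

That's the effect you can see in the flower shot: the mid-ground flowers get a little blur, the back ones get a lot.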

I dared to try a selfie mid-run, and this is easily the weirdest shot I got. I have to mention this was mid-stride and I hadn't shaved yet for the day. I'm a scrub, but stick with me here.

Portrait mode did a great job of separating me from the background, as the hard halo effect we saw on the street sign above is not an issue. The blur level is much higher than I would want, but I think this has to do with how close I am to the camera. Apple didn't tell us to take selfies in this mode, and I don't think I ever will again.

The real problem, and I'm sure you noticed immediately, is that my headphones are too thin to register and get lost in the background. I almost look like I'm wearing AirPods with EarPods thrown over my shoulder just in case. Obviously this is one of those cases where fake depth of field is not as good as the real deal; a nice camera would be able to blur the background while knowing that my headphones are a part of the foreground.

Finally we have a complex object that really puts the depth perception to the test. This picture looked better than expected on the phone, but it's definitely artificial-looking when you blow it up on a larger screen.

The real test on this one was to see how it handles an object where the background is not just around the object but also mixed into the middle of it. Let's zoom in to 100% to see how it looks up close.

It's very interesting (and impressive) that it understands that much of what shows through the body of the tree is actually behind it, and tries to blur those sections. It's not 100%, but it's doing better than I expected.


Despite pushing the mode beyond what it's meant for during this beta period, I think this is a game changer for mobile photography. The shots people are already getting are absolutely stunning, and this feature separates the iPhone from the rest of the smartphone industry. Everyone else can fight over having the most megapixels, but this changes the entire conversation.