Recorder on the Pixel 4 vs Dictation on iOS

I read this sequence from a recent blog post on this site and had the just-released Dictation for iOS and Recorder for Pixel 4 listening at the same time. Here are the exact words I read:

It’s the end of the decade and I love lists, so today I’m starting a series of lists about my favorite things. Today we’re looking at my favorite albums from the past 10 years. What I learned more than anything with making this list is that 2010 and 2011 were more amazing than I gave them credit for at the time. 7 of the 10 albums below came out in those two years! Anyway, I hope you like the list and check out one or two of these that you haven’t listened to yet.

And here is what Google Recorder captured:

It's the end of the decade and I love lists so today. I'm starting a series of lists about my favorite things. Today we're looking at my favorite albums from the past 10 years. What I learned more than anything making this list is that 2010 and 2011 were more amazing than I gave them credit for at the time. Seven of the 10 albums below came out in those two years. Anyway, I hope you like the list and check out one or two of these you haven't listened to yet.

And Dictation for iOS:

It's the end of the decade and I love lists so today I'm starting a series of less about my favorite things today we're looking at my favorite albums from the past 10 years what I learned more than anything making this list is that 20 10 and 20 11 were more amazing than I gave them credit for at the time seven of the 10 albums below came out in those two years anyway I hope you like the list and check out one or two of these two haven't listened to yet.

Both apps captured the general message well, but Google is so far ahead here it’s not even funny. I counted 5 errors in Dictation’s transcript and zero in the Recorder transcript.

Not only are the words more accurate in that transcript, but it also includes punctuation that’s not too far off from the script. Dictation could only give me one loooong sentence, while Google mostly understood where each sentence ended.

I’m glad Dictation is out on the iPhone because the functionality is very nice, but Google should rightly be proud of what they have been able to do with speech recognition on the Pixel 4.

When It Comes to Performance, iPhone Is Still King

Finding things you can test on both iPhones and Android devices is kinda tricky. App launch times, while relevant to the overall sense of speed, don’t really test performance, and many of the apps I could use to test an iPhone’s performance don’t run on Android.

Given this, there are 6 total tests I felt I could run that were a fair comparison of speed. I broke the tests into 3 segments:

Segment 1: I ran 3 browser benchmarks back-to-back, and then immediately ran Geekbench 5’s CPU benchmark. This was in an effort to get a sustained load on the phones and see which held up better.

  1. Run JetStream 2
  2. Run MotionMark
  3. Run Speedometer
  4. Run Geekbench 5 CPU benchmark

Segment 2: Take 10 HDR photos in Adobe Lightroom, add a preset profile to each one, and export the photos to the camera roll. I only timed the export part.

Segment 3: Export a 31-second video in Adobe Rush using all of the sample clips included in the app when you first download it.

Segment 1: Benchmark apps

In all cases, the iPhone 11 Pro demolished the Pixel 4. I’ll also add that the Pixel got noticeably warm by the third run, while the iPhone never felt like it was breaking a sweat.

Also relevant: I let both phones sit for a while after this test run and then tried the Geekbench test again, wondering how much better they would be if they were running that benchmark from a cool state. The iPhone was less than 1% better, basically within the margin of error, while the Pixel was 21% faster in single-core and 10% faster in multicore.
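For the record, here's the percent-change math behind those "21% faster" numbers, sketched in Python. The scores below are made up for illustration; they are not my actual Geekbench results.

```python
# Hypothetical Geekbench 5 scores illustrating the percent-change math
# used above; these numbers are illustrative, not real test results.
def percent_faster(cool: float, warm: float) -> float:
    """How much faster the cool-state run was, relative to the warm run."""
    return (cool - warm) / warm * 100

# e.g. a warm single-core score of 600 vs 726 after cooling down:
print(round(percent_faster(726, 600)))  # 21
```

The same formula gives the "less than 1%" figure for the iPhone: when the cool and warm scores are nearly identical, the ratio lands inside normal run-to-run variance.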

Segment 2: Lightroom Export

The process for this was:

  1. Take 10 photos in Lightroom using the HDR camera mode
  2. Let those process, then apply one of my presets to each, which adjusts about 15 sliders per photo
  3. Export to the camera roll

The export was 2x as fast on the iPhone, and the processing of the HDR photos was at least 2x as fast, but I didn’t time that so I’m not sure of the exact number.

Segment 3: Adobe Rush Export

And finally, I installed Adobe Rush, selected all 5 of the sample videos, and exported the resulting 31-second video. Again, the iPhone destroyed the Pixel in this test, exporting 3x faster.


None of this is anything we didn’t already know: the iPhone continues to have a massive lead in CPU and GPU performance over the competition. And while tests like this can be dismissed as not representative of real-world use, I think the differences can be seen throughout the phone experience.

Take night mode on each phone as an example. These phones take very comparable shots, but the Pixel requires you to hold the camera still for 2-10x longer than the iPhone. Similarly, the iPhone previews portrait effects while you align your shot and the photo is ready immediately. Meanwhile, the Pixel can’t preview the effect in real time, and after you take the shot it requires a good 3-5 seconds before the effect is rendered. And finally, there’s video, where the iPhone can do 4k HDR with 120 samples per second at 60fps, all while the Pixel caps out at 4k30 and doesn’t get HDR in that mode.

Obviously, many of the things we do on our phones don’t take advantage of all the power we have today, so many things are just as fast on either phone, but if you’re looking for a device that will last you years and still feel good, then the more headroom you can have on day one, the better.

Deep Fusion is Legit

Apple released iOS 13.2 to the world today and it includes a great new bit of tech that will make your low light photos look even better (as long as you have an iPhone 11 or 11 Pro). It's called "Deep Fusion" and while you won't see this as a new mode in the Camera app, you'll start to get better looking photos in medium to low light.

As an example, take a look at the photo at the top of this post; that was taken with iOS 13.2 and Deep Fusion. It might not look wild to you yet, but here's a 100% crop of my precious, in-between-the-pillows-sleeping dog's fur without Deep Fusion (shot in RAW with Halide):

And here's that same shot moments later with the Camera app and Deep Fusion working its magic:

And because I know you want to know, here's the Pixel 4 doing the same shot:

The difference is pretty remarkable, and even the mighty Pixel 4 can't keep up with this level of detail (although it does fine in its own right). It's a relatively subtle update that you might not even notice all the time, but when you do, it's striking.

Oh, and another reason you might not notice it: it's probably turned off for you! Go to the Settings app, then the camera settings, and turn off the "capture outside of frame" option for still photos; then Deep Fusion will work. Here's hoping next year they don't make you choose between one of these features and the other.

Computational Zoom on the iPhone 11 Pro

Google talks a big game about how they enhance zoomed photos in a feature they call Super Res Zoom, which I have tested in the past and found to be less effective than the additional hardware found in the iPhone XS. The Pixel 4 will be in my hands soon and I'll be testing out zoom there, but in the meantime I got a question on Twitter I wanted to check out.

Michael Stanclift

So I was always taught not to use any digital zoom but crop it later, is that not good?

Well, Google says they're doing magic to make digital zooms better, so let's see if Apple is doing the same thing. Here's the picture from the top of this post zoomed waaaaay in on the one way sign:

The zooms on the ultra-wide and normal lenses are hilariously bad, but they show how much benefit the 2x telephoto lens offers. But comparing the 2x optical zoom to the 5x digital zoom doesn't show much difference to my eyes. It looks like the 5x photo is softened a bit to reduce the hard edges, but it's not adding any actual detail.

And here's another example, this time going from 1x, 2x, 5x, and then 10x.

And here are the crops in on the center of each photo:

Again, we see vast improvements over the 1x lens, but the 2x, 5x, and 10x zooms all look really, really similar. The 5x one looks a little fuzzy, but I'll chalk that up to minor focus differences between shots, as the 2x and 10x photos are clearer.


On the iPhone, it appears that digital zoom does nothing to add detail you would not have gotten with the standard 2x optical zoom. It softens the image a bit to make it look better, but that's not actually adding anything; it just makes the full 12MP image it spits out a little less harsh.

So if you need to zoom in on something, feel free to stick to the 2x zoom when taking the photo and crop in later; you're not going to get better results by digitally zooming in more.
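If you want intuition for why a plain digital zoom can't beat cropping, here's a toy sketch in Python. It uses nearest-neighbor upscaling, which is cruder than the smarter resampling and softening a real camera pipeline does, and the 2x2 "sensor" grid is made-up data, but the core point holds: every output pixel is derived from pixels you already captured.

```python
# Toy illustration of why digital zoom can't add detail: upscaling a
# crop only resamples the pixels the sensor already captured.
def digital_zoom(pixels, factor):
    """Nearest-neighbor upscale: every output pixel copies an input pixel."""
    h, w = len(pixels), len(pixels[0])
    return [[pixels[y // factor][x // factor]
             for x in range(w * factor)]
            for y in range(h * factor)]

sensor = [[10, 20], [30, 40]]      # what the optical lens captured (made up)
zoomed = digital_zoom(sensor, 2)   # what a further 2x digital zoom yields

# The zoomed image is bigger, but contains no values the crop didn't have.
assert {v for row in zoomed for v in row} == {10, 20, 30, 40}
```

A real pipeline interpolates between neighbors instead of copying them, which is exactly the softening effect described above: it smooths edges without inventing new detail.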

I Want An “Exciting” Phone: or An Appreciation for Iteration

This is the Estwing 16 oz. Curved-Claw Rip Hammer:

Doesn’t look that crazy, right? Surely this is just a random hammer I found at Home Depot, right? Nope, according to The Wirecutter, this is the best hammer money can buy.

There are plenty of other hammers you can buy that are bigger, have crazy materials, and more interesting designs. These exist and that’s fine, but if you’re just looking for the best tool for the job, this “boring” hammer is where The Wirecutter says you should spend your money.

I think about stuff like this when I see reviews for the new iPhone 11 and 11 Pro. These phones took everything good about the iPhone XR and XS from last year and…

  1. Added lots more battery
  2. Improved the camera system a ton
  3. Improved the build quality
  4. Shipped the fastest chip in any smartphone (or most PCs)
  5. Added WiFi 6 and a billion other wireless bands that help people today

And of course these phones run the mobile operating system that has the best library of first and third party apps that will actually take advantage of these new features.

So what they did was take a very good phone in the iPhone XS and made everything about it better in meaningful ways people will appreciate every day they use these phones.

And yet, many reviews have a tinge of boredom to them. Snazzy Labs said he was almost embarrassed to say he loved the phone and thought it looked like it came from another decade next to the just-released Note 10+, which is patently insane, because this is what an iPhone from another decade looks like:

But I digress. It totally makes sense for enthusiasts like Snazzy Labs, me, and probably you (yeah, you) to get excited about new hardware and unique devices, but when we review products and want to even pretend that we’re talking to a mainstream audience, I think a lot of us come up short. We talk about these things as enthusiasts, not as people trying to help readers make informed buying decisions.

Is any real person besides the jackals in the YouTube comments section going to pick one phone over another because one has a 30% smaller notch? Hell no! Is someone going to buy the OnePlus 7 Pro over the iPhone 11 Pro because the screen-to-body ratio is higher on the OnePlus? Nope.

Smartphones are a hobby to us, but they are a tool to most people. A tool they’re proud of and a tool that they would like to look nice as well, but a tool nonetheless. And so when Apple takes a very good phone and makes everything about it better while adding features that will specifically make its users’ lives better every single day, all while staying at the bleeding edge with 90% of what other phone makers offer, and while maintaining its huge lead in first and third party software quality…well, that’s a pretty good update in most people’s eyes.

I’m sure the 2020 iPhone will indeed have a new design and a whole host of new goodies, but there’s something to be said for the last version of an iPhone design style. The iPhone 11 and 11 Pro are as good as these designs could really be and there’s value in that. Just because it’s not the flashiest phone in the world (although I’d contend this thing is gorgeous) doesn’t mean it’s not also the best phone for many people and you certainly don’t need to feel guilty or embarrassed for liking it.

My Quick iOS 13.2 Video Options UI Revamp

iOS 13.2 beta 2 came out today for developers and Apple finally added the ability to change the video settings for resolution and frame rate from within the Camera app itself. This is great! We’ve been asking for this for years and it’s wonderful that it’s coming right around the corner.

However, the UI for this feature is extremely minimal and not discoverable at all. Also, once you know how it works it’s still hard to use because the touch targets are quite small. At the top of this post is my minimal effort UI change I’d suggest they make before releasing this to the public.

I’m not arguing this is gorgeous or anything, but if Apple intends for these buttons to actually be used then they should make these look like the buttons they are. Maybe Apple’s product team believes these should be there and those who know about the controls (probably nerdier users) can use them, but most people will never use them. If that’s the case, then I’ll give it to them that their design looks nicer. It’s all about priorities and what problems they’re trying to solve.

Another Look at Deep Fusion

I happened to be wearing a sweater yesterday so I decided to take a selfie in low light and see if there were any differences.

Here's another couple shots where I got Deep Fusion to turn on and off.

Long story short, I noticed a few differences, but you really need to look close to see the improvements. I think I need to play around more in even lower light to try and eke out a little more of Deep Fusion's power.

The Best iPhone and Case Combo

It goes without saying you can choose whatever you’d like, and this is all subjective, but the combination of the gold iPhone 11 Pro and the green Apple leather case is the hottest look I’ve ever seen in a smartphone.

The green and gold look not only makes the Packers fan in me happy, but is just a damn classy look.