Above is a photo of my desk setup at home. It’s a 13″ 2015 MacBook Pro attached to a 1TB external hard drive. That’s it! No external monitor, laptop stand, keyboard, or mouse. This is what my work space looks like basically all the time.
Meanwhile, at my office at work I have a nice Windows laptop, a 28″ 4k external monitor, and a keyboard and mouse. I can arrange my setup however I’d like; the only requirement is that I use the Windows laptop the company provides (it’s a nice one, so I’m not going to complain). The major difference between my work and home setups, besides Mac vs PC, is that at work I have a mouse and keyboard hooked up at almost all times.
But why? At home I do video editing in Final Cut Pro X, audio editing in Logic Pro X, photo editing in Lightroom and Photoshop, create mockups in Sketch, and do web development in a slew of other apps. At work I mostly use Chrome, spreadsheets, and the command line. Why on earth do I feel like I need an external mouse at work, when the trackpad built right into my Mac works great for all of those far more input-intensive tasks at home?
The answer is simply that Windows trackpad support is still light years behind macOS, and there are a few specific reasons why:

1. Trackpad hardware and responsiveness
I have a Dell at work, and the trackpad could best be described as adequate. It’s fine, and it’s certainly better than what most Windows machines had 10 years ago, but it still feels like a secondary input method to me. It feels like something they put on there for you to use in a pinch, but you’ll usually use a mouse.
Even on laptops with better trackpads, like the Surface line, the OS doesn’t play as nicely with touch input as macOS. Momentum is different (and sometimes just absent) and there is a delay between my fingers moving on the trackpad and the UI on screen updating.
Mac users like me have a hard time wrapping our heads around this; Macs had better trackpad responsiveness a decade ago.
2. Multitouch gestures
A big reason I can get more done on my Mac with a trackpad is that the gesture support is simply far beyond what Windows offers. Three-finger swipes to switch between desktops are way faster than two-handed keyboard shortcuts on Windows, and swiping three fingers up or down to reveal certain collections of windows is lovely as well. Two fingers scroll or navigate forward and back in just about any document or browser, and pinching two fingers zooms or scales just about anything in the system to your desired size.
There’s more, but in short, macOS’s gestures feel like natural parts of the user interface, not like something that was tacked onto the system.
3. App support
All of this would be worthless without app support, and thankfully Apple and third party developers do a generally great job of making their apps work with trackpads in mind.
As I said at the top, I use a trackpad to do video editing. That may seem like an absurd statement to some video editors out there, but for me using a trackpad feels more natural than using a mouse with Final Cut Pro. Sure, a mouse works well too, but I prefer the trackpad for being able to pinch and zoom in the timeline, which I happen to do all the time.
Other apps take advantage of the trackpad too. Tweetbot, Ulysses, Safari, Chrome, Logic, Mail, and many more apps let you navigate their UIs with multi-finger swipes. It’s usually easy to discover this functionality as well, as certain actions (especially two finger side swipes) have become pretty standard.
People will disagree with this assessment, and that’s fine; everyone is entitled to their opinion. Others will tell me Windows’ touch screen support is more important than trackpad support, but based on the untouched touchscreens in my office, it doesn’t seem like that’s been a valuable upgrade to our laptops and convertibles. But I feel strongly that there is something deeply flawed with trackpads on Windows, and it doesn’t seem like they’re ever going to catch up.
The Samsung Galaxy S8 and LG G6 just came out, which of course means the camera comparisons are hitting the web faster than you can check them out. So why not jump on this train and do my own camera comparison?
Here are two photos I took today, and I want you to decide which ones you prefer.
In this case, we have two very nice shots. Both images are in sharp focus, and the background has a nice blur to it, which helps the flower really pop. The image on the left is definitely more saturated, and the contrast looks higher. This makes the image pop a little more, and it’s the better image in my eyes. You may prefer the right-hand image, though, and I wouldn’t blame you. It’s more subdued and more level, but it’s just as sharp.
Here’s the other shot:
We have a very similar situation here, with one image with a much flatter look, and the other with a lot more contrast and saturation. I think the top image is a lot more realistic, but the bottom one sure looks more exciting. Again, they’re both good, but you probably prefer one over the other.
Okay, so which do you like better? Ultimately it doesn’t really matter, because they were all taken with the same phone. My iPhone 7 Plus took all of these shots.
The less contrasty, flatter images are the exact image that came out of Apple’s built-in Camera app, and the more colorful ones were taken with Adobe’s Lightroom app. The difference really comes from the fact that I took the RAW files from Lightroom and edited them to punch up the colors, contrast, and clarity a bit.
So essentially what we’re looking at here is a difference in post-processing. Apple’s Camera app takes the same sensor data I got in the RAW file and does some manipulation to produce as pleasing an image as possible. Meanwhile, the RAW image I get from Lightroom lets me set everything how I like it.
Because I don’t think most people appreciate what a RAW image from a smartphone looks like, here’s what the Camera app and I (with Lightroom) started working with:
Now, I don’t think any of us would say either of those images is really what you’d like to share on social media, so it’s a really good thing that Apple and I did some processing on them to make them look reasonable. I mean, look at these side by side (edited on left, RAW image on right):
The RAW images that come out of the camera sensor are not ready for prime time, and it’s the job of software to translate that image to something that looks like reality. To Apple’s credit, both of the images that I took straight from the Camera app look great. They are incredibly accurate representations of what these things looked like in real life. Apple’s decisions around image processing appear to be focused on recreating reality as closely as possible, and they do a really good job of it. I personally like my images to pop a little more, and to evoke a heightened reality. I want them to reflect what my brain remembers that flower, or that tree looking like. So I move a few dials back and forth until I get an image I’m happy with. Based on the number of people who use filters on their images on social media, I don’t think I’m alone.
Additionally, I edited a RAW file using Lightroom because RAW has more data, which essentially means I can make bigger changes to the file without things getting all wibbly wobbly. For some context, the RAW file Lightroom produces is 32.7MB, while the JPG image the Camera app exports is 4.1MB. That’s a lot more data to manipulate.
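To make that “more data” point concrete, here’s a toy sketch in Python (with illustrative numbers, not real camera math) of why a deeper bit depth gives you editing headroom: pushing the shadows of an 8-bit JPEG leaves visible gaps between tonal levels (banding), while a 12-bit RAW still has enough distinct levels to render a smooth gradient after the same edit.

```python
# Toy sketch: why more bits per channel mean more editing headroom.
# An 8-bit JPEG stores 256 tonal levels per channel; a 12-bit RAW
# stores 4096. The numbers below are illustrative, not real camera math.

def push(levels, factor, max_level):
    """Brighten by multiplying each tonal level, clipping at the maximum."""
    return sorted({min(v * factor, max_level) for v in levels})

def to_8bit(levels, max_level):
    """Scale any bit depth down to the 0-255 range a screen displays."""
    return sorted({round(v * 255 / max_level) for v in levels})

# The same patch of deep shadow, as each format records it.
jpeg_shadows = list(range(10))    # 8-bit: levels 0..9 out of 255
raw_shadows = list(range(160))    # 12-bit: levels 0..159 out of 4095

# Push the shadows up two stops (multiply by 4), then display as 8-bit.
jpeg_out = to_8bit(push(jpeg_shadows, 4, 255), 255)
raw_out = to_8bit(push(raw_shadows, 4, 4095), 4095)

print(len(jpeg_out))  # 10 distinct levels, with gaps of 4: visible banding
print(len(raw_out))   # 41 distinct levels, no gaps: a smooth gradient
```

The file-size gap above (32.7MB vs 4.1MB) isn’t purely bit depth, since RAW is also uncompressed, but the headroom idea is the same: the edit happens before the data gets squeezed down to 8 bits.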
That said, the fact that the Camera app outputs a relatively neutral image that doesn’t make too many stylistic choices for you means that you have more flexibility in making your own changes. You can add contrast, remove saturation, or simply apply an Instagram filter, and the result is likely going to look really good.
Ultimately, the question you need to answer is whether you want your smartphone maker to make these decisions for you. I personally like the freedom the iPhone camera gives me: photos I know will look good and realistic, which I can then stylize with ease. The Samsung Galaxy line of phones has long gotten high marks in camera comparisons in large part because people like the over-saturation its camera app applies to images. As you can hopefully see now, it’s not really a matter of the Galaxy having a better lens than any other phone; Samsung just made different choices in post-processing. This also explains why you sometimes see different phones with the same lens produce different-looking images. The companies using those lenses made different post-processing choices.
The next time you watch a camera comparison, watch it knowing that anything having to do with color is mostly down to decisions manufacturers made in regards to post-processing rather than a difference in lens quality. Things like noise, poor focus, and blurry action shots (due to slower shutter speeds) are better comparisons between cameras. Honestly, it’s probably best to just pick the camera that has the fewest straight up bad photos. You can almost always make a good photo great with a tiny tweak here or there, but it’s much harder to take bad photos and make them good.
Two years ago today, I was sitting at home, waiting for UPS to deliver the device I had spent years waiting for Apple to make. After years of using a Pebble and being very impressed with Apple’s September 2014 demo of the Apple Watch, I was amped up for this. It was a Friday, and I took the day off of work so I could be at home all day to make sure I didn’t miss the delivery. UPS’s website assured me it was “Out for delivery” all day, but the hours ticked by, my Twitter feed filling up with people unboxing their Apple Watches. I was growing a little worried.
But finally at around 6PM I saw the UPS truck pull up, and I knew it was time! The UPS guy handed me a peculiarly-shaped box and asked me “is this the Apple Watch?” He told me he’d delivered about 50 of these on his route that day.
At 6:28PM I took my first glee-filled picture of my new toy:
And it didn’t stop there:
After 2 weeks with my new toy, I published my review, and I can say with confidence it’s the best review for any hardware product I’ve ever written. Frankly, I think it’s the best review I’ve seen of the original Apple Watch.
“It’s nice, but what do you do with it?” This is by far the most common question I get from people. It’s a hard question to answer, and it makes me sympathize with Apple’s struggles in marketing the Watch. There isn’t just a couple of things it does incredibly well that legitimize the $350+ expense; there are dozens of little things that add up to an overall experience I find incredibly useful. Because of this, the Apple Watch is impossible to properly demo in 30 seconds, so I’m going to cover what I’m using my Watch for thus far.
And on using Apple Pay:
Apple Pay with the Watch is a delight. If Apple Pay on the iPhone is 50% faster than using a credit card, then Apple Pay on the Watch is another 50% faster than the phone. It’s obscenely easy to do, and is one of the things on the Watch that really makes me feel like I’m living in the future.
Commenting on battery life:
The lowest I ever got the battery was 16%, and that was on a day when I woke up at 3AM and was awake until midnight. That’s 21 hours of use with plenty of juice left. Because the battery life has been so good, I haven’t even used the power reserve mode on the device.
To this day, the only times I’ve ever used power reserve mode are when I’ve worn my watch for more than 24 hours straight.
By the dollars-to-value metric, the Apple Watch is probably a bad buy right now. I don’t think it’s a bad product by any means; far from it. I hope you can tell from my review that I love my Apple Watch and I don’t want to go back to a world without it. It absolutely makes my life a little better and easier in many specific ways. It was worth the $400 for me, but that’s far from an impulse purchase for most people. If you read this review and saw some use cases you find appealing and you have $400 to spend on a luxury like this, then I think you should go for it. You won’t be disappointed.
My overall impression was that it was a good product, but at $400 for the 42mm model, you really needed to have a good reason to get one of these.
I have continued to write extensively about the Apple Watch, and here are a couple of choice samples.
First, did you know that getting podcasts onto the Apple Watch has sucked, really sucked, forever? Marco Arment just fixed this a couple of days ago when Overcast became the first podcast app that lets you load podcasts directly onto the Apple Watch. From my complain-fest in May 2015:
My 2 episodes amounted to just over 50MB, but they took almost 15 full minutes to sync.
I made a mock up of what I thought we might see from 3rd party app complications in watchOS 2. I was a little over-enthusiastic about what apps would have the ability to do…
Running with Apple Watch is a piece that brought out all of the “serious runners” who were very grumpy that the Apple Watch was not suitable for run tracking. I understand the complaints (mostly that cell phone GPS isn’t accurate enough), but respectfully disagree.
The Apple Watch is my favorite all-around fitness device I have ever owned. I absolutely love that my regular watch is also my step counter as well as my workout tracker. That said, I do think there is a lot of room for the device to grow as a workout companion. The Watch falls short specifically on long-term goal planning, which can be just as important as, if not more important than, your single-workout stats. I also find it shocking that your workout history is not saved to iCloud, only to your local device.
I’ve fallen in love with the red, green, and blue activity rings introduced in the first Apple Watch. It’s the best system I’ve found for making sure I am as active as I need to be. It’s also the first system I’ve seen where your health goals are not based on steps. It’s all about calories burned, how long you’re active, and getting off your butt to move around.
I wear my Apple Watch every day, and these three rings are a big reason I do. They’ve spoiled me for other fitness trackers like the Fitbit, which I used up until I got the Apple Watch. Judging my fitness progress based on steps seems so archaic now that I can’t see myself ever using a fitness tracker that just uses that as a metric.
In March 2016 I wrote a love letter to my Apple Watch that got a surprising amount of play in the blogosphere.
There is no shortage of blog posts and podcast episodes with people discussing at length why they stopped wearing their Apple Watch or what drastic changes they think Apple needs to do to make the Apple Watch the least bit useful. I won’t suggest that the Apple Watch is perfect or that it doesn’t have big steps it can take to improve, but I will say that I love my Apple Watch and have not gone a single day since April 24, 2015 without wearing it. It is an absolutely essential part of my life and is something I never want to be without in the future. If Apple threw in the towel today and said they’re never making another Apple Watch, I’d keep wearing this one until it stopped working.
I had the only major full-length review of watchOS 3 on launch day back in September 2016. It was my most popular post of 2016, and again, I would argue it was the best review of watchOS 3 on the web.
This is an update that feels almost as much like a hardware upgrade as a software update. Everything is faster, and the software is more capable. I certainly have complaints about how far Apple went with some features and wish they had moved forward a little more, but I don’t think there is any place in this update where Apple has taken a step back. This isn’t a minor update you can wait on; it’s an update you’ll want to rip your watch off for, run to your charger, and wait for it to install. You are about to start loving your Apple Watch a whole lot more.
And finally, I have not been hesitant to give Apple (I hope you’re reading!) specific suggestions about how to improve their software on the Apple Watch. In February 2016 I made a pitch for watchOS 3, and I followed it up in January 2017 with another pitch for what they should do with watchOS 4.
The Apple Watch has proven to be an essential part of my life, and it’s not something I see myself giving up any time soon. The Apple Watch has problems, it absolutely does, but none of those problems keep me from wearing my Watch every day. I feel naked when I’m not wearing it, and it feels downright barbaric to look at my phone every time a notification comes in to see what’s going on in the world. Long live the Apple Watch!
Popular podcast app Overcast was updated yesterday and it was the most exciting update I’ve seen in a long time to a podcast app. This update added the ability for users to load podcast episodes directly onto the Apple Watch. This is a big deal, and is something I’ve been waiting years for at this point. This means I can finally bring my Apple Watch with me on runs, and keep my iPhone at home!
Adding episodes is relatively simple: just tap the queue button and tap “Send to Watch”.
Once an episode is loaded onto the Apple Watch, it can then be played directly from the Watch. The Overcast app essentially has 2 screens now: one for playing back from your phone, and one for playing back from your watch. Note that playing from the watch itself means you need to be using Bluetooth headphones and they need to be connected to your Apple Watch, and not your iPhone.
Playback from the Apple Watch is solid, and easily the best experience I’ve had with any solution that plays podcasts directly on the Apple Watch. Overcast’s advanced audio features like Smart Speed and Voice Boost do not seem to be possible on the Apple Watch, so playback is a little less silky smooth than it is on the phone, but it’s basically in line with apps like Castro and Apple’s own Podcasts app, which don’t do as much to smooth out audio above 1x speed. That said, the fact that the Watch app does play back episodes at my desired speed is wonderful, and again, something other apps that have tried this haven’t allowed.
The biggest issue with this feature is that it takes too long to load episodes onto the Watch. The transfer seems to happen over the Bluetooth connection to the Watch, and it is just a slooooow process. On short episodes it is not an issue (I loaded this morning’s episode of The Daily in about 30 seconds), but longer podcasts, especially those over the 1-hour mark, are more laborious. It took 7 minutes to transfer the latest Ctrl-Walt-Delete episode (a 57-minute episode).
I’d love to see Overcast find a faster way to load episodes onto the Apple Watch, and I would love for Apple to give apps like Overcast the ability to tap into SiriKit so I could more easily control the Watch app with my voice. Even with those limitations, I’m quite happy with this initial implementation of this feature, and hope to see it continue to improve.
This Slack backend means that your users get a nice chat interface on your website, but you get all the benefits of Slack on your end. It’s really nice, and while it’s not something I personally need for my website today, I’d almost certainly use it if that ever changed.