Understanding Shortcuts in iOS 12

I’m seeing a bunch of people saying Shortcuts are a power-user-only feature. “No normal person will ever use these!” In the WWDC keynote, I think Apple leaned too hard into what Shortcuts can do for power users, and people lost the message that users will benefit from these even if they do zero work to set them up. Let me explain.

Example App

First up, let’s take a look at Tweetbot, a Twitter client with a few main functions.

When you launch Tweetbot, there are a few actions you can take: send a new tweet, tweet the last photo in your camera roll, or check your mentions (or activity). The app can do more than this, but let’s focus on these three actions for now.

If I’m the app’s developer, I can assign “intents” to these actions. This is a code change only, and requires nothing from the user. Once this intent “tag” (for lack of a better word) is applied to the action of composing a tweet with the most recent photo attached, iOS 12 will keep track of how often, when, and where you tend to do this.
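To make that concrete, here’s a minimal sketch of what such a “donation” could look like in Swift. The NSUserActivity prediction API (`isEligibleForPrediction`, `suggestedInvocationPhrase`) is the real iOS 12 mechanism Apple showed, but the activity type, title, and function name are hypothetical stand-ins for whatever Tweetbot actually uses:

```swift
import UIKit
import Intents

// Hypothetical donation for a "tweet my last photo" action.
// Call this every time the user performs the action so iOS 12 can
// learn how often, when, and where it happens.
func donateTweetLastPhotoActivity(to viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.tweetbot.tweet-last-photo")
    activity.title = "Tweet Last Photo"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true // allows lock screen and search suggestions
    activity.suggestedInvocationPhrase = "Tweet my last photo"

    // Attaching the activity to the visible view controller and marking it
    // current is what "donates" it to the system.
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```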

Basic Shortcuts (zero user investment)

If you tend to use the “post last photo” Tweetbot feature around noon every day as you share your lunch on Twitter, iOS will start to show you a passive notification suggesting you do just that. This will appear either on your lock screen or when you pull down on the home screen to search. Here are some basic ones already working in the first iOS 12 beta:

For a lot of people, this will be the only way they interact with Shortcuts. Apps will simply tag their functions as intents, and iOS will try to show you what actions you want to take before you even have to ask. Again, all of this will happen without the user needing to do anything; they will just find they can do the things they do all the time with fewer taps.

Siri Shortcuts (mild user investment)

The next step up in complexity is making Siri trigger these intents inside apps. iOS 12 beta users can already test this a little by going to the Settings app and choosing “Siri & Search.” In here, you can set up Siri to trigger Shortcuts for Apple’s own apps. You take an action (intent) you’ve performed recently, record a phrase¹, and then use that phrase to trigger that action/intent whenever you want.

In our example of Tweetbot tweeting my last photo, I could set something up that lets me say “Hey Siri, tweet my last photo,” and it would open Tweetbot, start a new tweet, and automatically attach the most recent photo from my photo library. Then I don’t need iOS to guess that I want to do this at a specific time; I can just do it whenever I want.
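On the app side, handling that invocation is the same code path as Handoff or Spotlight: when Siri runs the shortcut, the app launches and is handed the donated activity to continue. A sketch, reusing the hypothetical identifiers from the earlier example; the compose helper is made up:

```swift
import UIKit

// In the app delegate: iOS calls this when Siri runs a donated shortcut
// (the same entry point used by Handoff and Spotlight).
func application(_ application: UIApplication,
                 continue userActivity: NSUserActivity,
                 restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
    // Match the hypothetical activity type donated earlier.
    guard userActivity.activityType == "com.example.tweetbot.tweet-last-photo" else {
        return false
    }
    // Hypothetical helper: open the compose screen with the newest
    // photo from the library already attached.
    // composeTweet(attaching: mostRecentLibraryPhoto())
    return true
}
```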

And if digging through Settings isn’t convenient enough, apps can build buttons into their interfaces that let you set up Siri Shortcuts for things as you’re doing them. Here’s a mockup of Tweetbot tweeting the last photo:

It’s not a great mockup, but the added Siri button above the keyboard could let the user set up a Siri Shortcut phrase right from inside the app they’re already using.
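That kind of in-app setup is what Apple’s IntentsUI framework provides. Here’s a rough sketch of presenting the system add-to-Siri sheet, which records the user’s phrase, for the same hypothetical activity; `INUIAddVoiceShortcutViewController` is the real iOS 12 API, and the rest is assumed:

```swift
import UIKit
import IntentsUI

// Hypothetical "Add to Siri" button handler on the compose screen.
func addToSiriTapped(from presenter: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.tweetbot.tweet-last-photo")
    activity.title = "Tweet Last Photo"
    activity.isEligibleForPrediction = true
    activity.suggestedInvocationPhrase = "Tweet my last photo"

    // Wrap the activity in an INShortcut and present the system sheet
    // where the user records their own trigger phrase.
    let shortcut = INShortcut(userActivity: activity)
    let addShortcutVC = INUIAddVoiceShortcutViewController(shortcut: shortcut)
    // addShortcutVC.delegate = self // conform to INUIAddVoiceShortcutViewControllerDelegate
    presenter.present(addShortcutVC, animated: true)
}
```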

Shortcuts App (maximum user investment)

And then there is the Shortcuts app, which is the rebirth of the Workflow app that we nerds know and love. This new app is essentially an interface for creating multi-step Shortcuts of your own. Maybe I like to include some stock text in the tweet Tweetbot sends when I post that photo, so I could add some variables to generate that text and have it included in the tweet automatically. Maybe I also want to text someone, or turn on my desk lamp, or some other madness at the same time… the Shortcuts app lets me combine these elements, each of which could be a Shortcut on its own, into one action that runs them all together.

We saw this on stage when they set up a Shortcut that would check their drive time to get home, text their spouse they were coming home, open navigation, start a podcast, turn on their lights, and set the thermostat to 70° all by saying “Hey Siri, I’m going home.” It was slick, but definitely something that will take some work to get right. Most people will not do this, but many will and they’ll get some serious benefit from it. I personally find it difficult to use non-iOS devices because Workflow has enabled me to do so much, but I know I’m a niche user.

The good news is that the library of automations that we have in Workflow today appears to be living on in the official Shortcuts app, and normal people will be able to use Shortcuts created by other users.

Takeaway

Shortcuts are a big part of iOS 12 and watchOS 5. Instead of opening apps and then doing things, Apple is trying to make it so people can more easily do the things they want to do without having to find an app icon first. Whether it’s seeing a shortcut you didn’t even set up yourself, asking Siri to do something you do all the time, or setting up complex automations, Shortcuts will be a big part of iOS and watchOS starting this fall for everybody.

If iOS is not good at predicting what users want to do, then yeah, Shortcuts will lean more heavily towards the power user who wants to do more with their devices. But if Apple does it right, this is going to be a reasonably large shift in how people use their devices going forward. Changing iOS from an app-centric platform to an action-centric one is a big change, but it could pay dividends.

Side note: consider the rumored home screen changes we’re expecting in iOS 13 next year. I would put a lot of money on Shortcuts being a big piece of that redesign as well.

For more information about Shortcuts directly from Apple, check out the developer sessions from WWDC18.


  1. Or even a single word. 

Siri in iOS 12: Apple’s Big Leap

I think Apple made a big leap with Siri at WWDC this year, and it’s notable for being a different leap than we were expecting. This is being read as disappointing by some and revolutionary by others. Why are some people totally jazzed? Take a look at this quick Siri action in Drafts:

And check out me trying to resize a couple of images by voice in Siri and Google Assistant:

Siri has numerous problems today: voice recognition, limited functionality, and unpredictable results are the big ones I can think of right now. These are all areas where Siri lags behind its only real competitor, Google Assistant. And while Apple could have focused on making voice recognition better or making general queries more robust, they focused instead on raw functionality and user control. And they did this by leveraging their greatest asset: third-party apps and a developer base keen on keeping up with iOS’s latest and greatest tech.

Apple is not going to catch up with Google in terms of general query processing. They’re getting better, but “let me Google that” sorts of questions will simply never be answered as well as Google can unless Apple builds its own search engine…and travels back in time to start building it 10 years ago. This is not a space they can win, and I think they know it, so they didn’t address it up front at WWDC.

Instead, Apple has a rich library of third-party apps that people rely on every day, and they found a way to make Siri work better with those apps than anything possible on Google Assistant. By opening Siri up to perform any action from any app at any time, they have created separation between themselves and the competition for the first time in years. Come this fall, Siri will be able to do literally thousands more things than it can today. With very little work on developers’ part, they will be able to integrate with Siri, and I would expect most of your favorite apps to be updated right as iOS 12 launches. Not only that, but power users like me will have tools like Shortcuts and Drafts to build our own custom actions that we can trigger with Siri as well. If Siri started with something like 6 intents in iOS 10 and expanded to 8 or 9 in iOS 11, it’s shooting up to infinity in iOS 12. The sky appears to be the limit.

Now, this initial implementation is a little less Siri-like than what we’ve seen before. So far Apple’s attitude towards Siri has been “users shouldn’t have to set it up, and they should be able to speak to it with natural language,” but that’s less the case with these new changes. Instead of being able to say what you want and having Siri parse that request for you, you’ll have to say very specific things. So if you set up a Siri action to post to Instagram with the phrase “post this,” you won’t be able to say “post this to Instagram” or “post a photo to Instagram,” because those aren’t your exact phrase. That’s a limitation, but it is precisely what is enabling Siri to grow so tremendously in functionality in just one year. Also, as my friend Dave on Twitter suggests, this could be a first step towards Siri doing more automatic parsing for us in a year or two.

Siri needs to continue to improve in many ways. It should recognize multiple voices on HomePod, it should be able to set more than one timer at a time, it should be much better at parsing the words we say to it, and it should be more consistent across all devices. It must improve in these areas as well, but those are all things Apple needs to do to play catch-up; none of them will separate Siri from the pack. So while I hope to see improvements in those areas soon (don’t make us wait for iOS 13), I’m impressed with Apple for finding a way to make their voice assistant take the lead in some areas, not just play catch-up.

10 Awesome Features Coming in watchOS 5 (podcast episode #96)

Apple unveiled watchOS 5 yesterday, and here are the 10 most exciting features I’ve seen and used in the first beta so far.

  1. Automatic workout detection
  2. Advanced running features (steps-per-minute, rolling pace)
  3. Walkie-Talkie
  4. Podcasts
  5. Audio apps in the background
  6. Grouped notifications
  7. Third-party integration with the Siri watch face
  8. Web views!
  9. Better DND
  10. Volume controls

Subscribe to The BirchTree Podcast here.