Toggl News

“Hey Siri, start Toggl” – Introducing Siri Shortcuts

With the iOS 12 update, Apple made a big improvement to Siri, allowing you to customize it and control a lot of your apps through Siri Shortcuts – and that includes our Toggl app! Here’s what you can expect.

Siri – more than a voice assistant

Every iOS user has gotten used to Siri: you can ask her to set timers, create events on the calendar, add items to lists, get directions and much more. Even if it seems limited compared to what other voice assistants do, it’s already pretty impressive. You can even ask her to tell you a joke or to marry you (good luck with that).

But Siri, as Apple understands it, is not only the voice assistant that tries to answer every single one of our questions. Siri is also the recommendation system under the hood which, for example, works through a visual interface when you use Spotlight search or the lock screen.

For it to work as well as possible, Apple collects information from different sources: some external to the device, like the weather or encyclopedic knowledge, and some internal, like how the owner uses the iPhone itself. Siri knows what apps you use, when you use them and where you are when you do. That way it can suggest specific apps for specific situations.

Until iOS 11 the information Siri would use and the kind of suggestions she could give were somewhat limited. And only apps in a few specific domains could leverage Siri’s voice interface, like messaging, payments, workouts, car sharing and a handful of others. But that is changing dramatically – with iOS 12, Siri is opening up to every app that wants to make use of it.

Power to the developers

With iOS 12 Apple is giving us, the developers, more ways to teach Siri what our app can do. Apps can now inform Siri about specific actions that the user repeats inside the app. Siri uses this information to learn about the user’s routines and, together with other device signals like location, time of day and motion, applies some machine learning to find usage patterns and suggest that action in the future.

Those suggested actions, which Apple calls shortcuts, will be shown on the search and lock screens, and when tapped they will open the app or execute that specific action in the background (if possible).

There are two ways developers can tell Siri which actions inside the app should become a shortcut: by using an NSUserActivity, an API that’s been with us since iOS 8 and is used for things like Handoff or Continuity, or by using the more complex but much more flexible Siri intents.
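As a rough illustration of the NSUserActivity route, here’s a minimal sketch of donating a “stop timer” shortcut; the activity type string, titles and identifiers below are placeholders for this sketch, not Toggl’s actual values.

```swift
import Intents
import UIKit

// Minimal sketch: donating a "stop timer" shortcut through NSUserActivity.
// The activity type, title and identifier are illustrative placeholders.
func donateStopTimerActivity(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.toggl.stopTimer")
    activity.title = "Stop timer"
    activity.isEligibleForSearch = true        // show it in Spotlight
    activity.isEligibleForPrediction = true    // let Siri suggest it (iOS 12+)
    activity.persistentIdentifier = "stop-timer"
    activity.suggestedInvocationPhrase = "Stop my timer"

    // Attaching the activity to the current view controller and making it
    // current is what tells Siri the user just performed this action.
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```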

An intent is an action that can happen in the app together with optional parameters for that action. For example, in a coffee app, an action could be “order coffee”, and the parameter would be the kind of coffee the user wants to order (latte, cappuccino…). In our case the action could be “start timer” and we would define multiple parameters for the entry description, project, tags or workspace.

It’s important to understand that these intents should be actions that the user is likely to repeat in the future. Meaning, it makes sense to create an intent from an action that stops a timer or orders coffee, but it probably doesn’t make sense to create an intent from an action that orders a new TV or a new iPhone, because those are things that the user won’t (probably) be repeating very often.
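For the intents route, the donation could look roughly like this. StartTimerIntent stands in for a class generated from an Intents definition file, and its parameters (entryDescription, projectName) are hypothetical; the key point is that the app fills them in at donation time.

```swift
import Intents

// Sketch: donating a custom "start timer" intent after the user starts an entry.
// StartTimerIntent is a stand-in for a class generated from an .intentdefinition
// file; entryDescription and projectName are hypothetical parameters.
func donateStartTimerIntent(description: String, projectName: String?) {
    let intent = StartTimerIntent()
    intent.entryDescription = description   // parameters are fixed at donation time
    intent.projectName = projectName
    intent.suggestedInvocationPhrase = "Start tracking \(description)"

    let interaction = INInteraction(intent: intent, response: nil)
    interaction.donate { error in
        if let error = error {
            print("Failed to donate intent: \(error)")
        }
    }
}
```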

Voice Triggers

So, we’ve now let Siri know that there are some actions in our app that we want users to be able to repeat in a fast and convenient way through the suggestions Siri provides. But it doesn’t stop there, of course. Users will also be able to record phrases which Siri will understand as triggers for those predefined actions.

Through Siri’s settings (though soon we’ll add access directly from the app) the user can take any of those repeatable actions and create a trigger for it by recording a phrase that Siri will understand. This makes it possible to stop the entry currently running in Toggl just by bringing up Siri and saying “Stop timer”, for example. We will suggest a phrase that the user can set as a trigger, but they can also record their own – the only limit is their imagination.
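For reference, Apple also provides an API, INUIAddVoiceShortcutViewController, that lets an app present the record-a-phrase screen itself – presumably the kind of thing in-app access would build on. A minimal sketch, again using the hypothetical StartTimerIntent:

```swift
import Intents
import IntentsUI
import UIKit

// Sketch: offering the "record a phrase" flow directly from the app using
// INUIAddVoiceShortcutViewController (iOS 12+). StartTimerIntent is again a
// hypothetical generated intent class.
func presentAddVoiceShortcut(from viewController: UIViewController & INUIAddVoiceShortcutViewControllerDelegate) {
    let intent = StartTimerIntent()
    intent.suggestedInvocationPhrase = "Start my timer"

    guard let shortcut = INShortcut(intent: intent) else { return }
    let addVC = INUIAddVoiceShortcutViewController(shortcut: shortcut)
    addVC.delegate = viewController   // delegate receives the done/cancel callbacks
    viewController.present(addVC, animated: true)
}
```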

Actually, we’ve made a nice little landing page which can help you set up Toggl & Siri shortcuts (there’s also a small giveaway involved).

After an action fires, Siri has two options: either she brings up the app to perform whichever intent we triggered, or she takes care of it through what we call an app extension. This is some code that runs outside our app but has access to some shared information. The good thing about this is that it can perform some tasks without running the app and without leaving Siri’s interface. Siri will do her best to respond to our request via voice and a small view where we can show some information.

This also means those background intents can be used not only through the iPhone or iPad, but also through an Apple Watch or HomePod, as we don’t really need to see the app to perform some of these actions.

In our case we have some intents that open the app to show reports (though we might also handle that through an extension in the future), and the rest can work through the extension without opening the app: stop timer, start timer and continue a specific time entry. They just do their thing and then report back with some information about what happened (with both visual and voice feedback).
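To give an idea of what the extension side looks like, here’s a minimal sketch of handling a background “stop timer” intent. StopTimerIntent, StopTimerIntentHandling and StopTimerIntentResponse stand in for code generated from an intent definition file, and TimerStore is a hypothetical helper that reads data shared with the main app (for example via an App Group).

```swift
import Intents

// Sketch: an Intents extension handling a background "stop timer" intent.
// StopTimerIntent / StopTimerIntentHandling / StopTimerIntentResponse stand in
// for code generated from an .intentdefinition file; TimerStore is a
// hypothetical helper that accesses data shared with the main app.
class StopTimerIntentHandler: NSObject, StopTimerIntentHandling {

    func handle(intent: StopTimerIntent,
                completion: @escaping (StopTimerIntentResponse) -> Void) {
        // Stop the running entry using the shared data, then report back so
        // Siri can speak and display the result without opening the app.
        TimerStore.shared.stopRunningEntry { stoppedEntry in
            guard stoppedEntry != nil else {
                completion(StopTimerIntentResponse(code: .failure, userActivity: nil))
                return
            }
            completion(StopTimerIntentResponse(code: .success, userActivity: nil))
        }
    }
}
```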

The Shortcuts App

Apart from these triggers we can create from our apps, Apple also included another novelty in iOS 12: the Shortcuts app. It’s not really new, though; it is, after all, an already existing app called Workflow, which Apple bought in 2017. What Apple did was add the ability to include Siri shortcuts in the app and rename it to Shortcuts, which makes the whole thing a bit confusing, to be honest.

Shortcuts (previously known as Workflow) is an app that allows users to chain different actions together, connecting the steps to create an automation workflow (hence the old name) that will run all the steps one after another when the user executes it.

Before iOS 12, a lot of stuff could be done with Workflow, but now in Shortcuts we are also able to add custom intents from all of our apps that support them. And not only that, but we will also be able to trigger these workflows through a Siri command. This lets you create a shortcut that is triggered when you say “I’m hungry” and then makes a reservation at your favorite restaurant, sends a message to Slack telling your teammates that you are stopping for lunch, stops the Toggl timer and opens Maps with directions to the restaurant.

The possibilities are endless.

Limitations

Everything I’ve said until now sounds great, right? And it is, since all of these features are going to be big for apps like Toggl, where the main features can easily be handled through a voice interface. But there’s a major limitation at this point.

In the current iteration of Siri Shortcuts, the intents we create in our apps can’t take input or produce output parameters at invocation time. Strictly speaking, they do have parameters, and Siri can give different responses depending on the result of the action, but those parameters are set from the app when the shortcut is donated and cannot be changed when invoking it. This means you can create an action to start an entry in Toggl called “Lunch time”, but you can’t create an action to start any entry and expect Siri to understand or ask you for the description.

This also means that you can’t get that information from the previous step in a workflow created with the Shortcuts app. You can do that for some steps, but not for those that come from the new Siri shortcuts.

This will likely change in the next iteration of Siri, expected in September 2019. Not everything is lost just yet, since we might have a workaround for this that will allow you to create new entries and specify the description by voice. Stay tuned for app updates!

Final thoughts

To be honest, I don’t know if this is the future of applications for years to come, but voice interfaces are how science fiction movies always told us interaction with machines would happen. These changes aren’t making Siri “smart” overnight, but they give developers the tools to get on that train and also empower users by letting them specify which domains they are interested in, instead of letting the AI do all the guessing. And in the process, Apple is probably using this information to make Siri smarter as well.

It does seem like a very natural way of telling a computer to do something. Much in the way that, when the first iPad was released, its natural UX produced videos of toddlers (or cats) playing with it as if they’d been doing it forever. I now have videos of my 5-year-old talking to, and even fighting with, Siri over some cartoons she refuses to show her.

Still, for adults, at this moment it might be a little awkward to walk down the street giving voice commands to a machine – people might look at you strangely at first. And I should know: I’ve worked on this feature from a coworking space and from a train (Siri does hear whispering, though). But this will probably change gradually – and if you don’t believe me, try to remember the first time you saw someone speaking through their phone on the street (yeah, what a douche).

September 28, 2018

  1. Nice update. There was one thing I thought about though:

“This also means those background intents can be used not only through the iPhone or iPad, but also through an Apple Watch or HomePod, as we don’t really need to see the app to perform some of these actions.”

Is it really supported in the watch yet? Every time I try a Shortcuts command on the watch it says “I do not recognize that command”.

• Hey, yes it is supported out of the box by Apple. Meaning we as developers don’t have to do anything to add that feature on the Watch. Do all apps fail for you on the Watch or only ours? Also, check that you’ve updated the Watch to the latest OS as well.