Looking at how the voice assistant space evolves, there's one recurring question everyone asks - what's going on with Siri? Is it going to catch up with Alexa and Google Assistant?
One of the products announced during the last WWDC was iOS 13. While the announcement put a lot of emphasis on features like dark mode and improved privacy, Apple managed to sneak in major improvements to how the system enables voice interactions.
For most people, voice interactions mean convenience. We use our smart speakers to multitask and avoid pulling out our phones. We use voice assistants on our smartphones to perform certain actions quicker.
But there's another side to that. Voice is a technology that can empower a lot of people when it comes to accessibility. It lets people interact with technology despite mobility impairments - something that omnipresent touch screens, with their tiny elements requiring precise taps, don't make easy.
Voice Control, a new accessibility feature in iOS 13, lets users perform any interaction with their computers or mobile devices using only their voice.
It's powered by the same technologies that make Siri possible, but it works in a slightly different way.
Most importantly, it doesn't require a wake word. Saying "Hey Siri, open Mail. Hey Siri, click compose" is far from convenient. On iOS, Voice Control can use the front-facing camera (the one used for Face ID on iPhone X and above) to detect whether the user's attention is on the device. This prevents accidental "clicks" made with voice.
Another difference is that unlike Siri, Voice Control understands really simple commands (like "Open <APP>", "Tap <BUTTON>", etc.). Other input is treated as dictation by default.
And, compared to Siri, Voice Control should work with existing apps out of the box. Should, because it relies on accessibility annotations in the UI - the same ones used by VoiceOver, which has been available in iOS for a long time. During our testing, we noticed that some apps don't handle this well and leave some elements inaccessible.
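Since Voice Control resolves commands like "Tap Compose" through the same accessibility annotations VoiceOver reads, making a custom control reachable is usually a matter of setting the standard UIKit accessibility properties. A minimal sketch (the view controller, button, and label names are illustrative):

```swift
import UIKit

final class ComposeViewController: UIViewController {
    // An icon-only custom control. Without an accessibility label,
    // Voice Control has no name to match when the user says "Tap Compose".
    private let composeButton = UIButton(type: .custom)

    override func viewDidLoad() {
        super.viewDidLoad()
        composeButton.setImage(UIImage(systemName: "square.and.pencil"), for: .normal)

        // These annotations serve VoiceOver and Voice Control alike.
        composeButton.isAccessibilityElement = true
        composeButton.accessibilityLabel = "Compose"   // enables "Tap Compose"
        composeButton.accessibilityTraits = .button

        view.addSubview(composeButton)
    }
}
```

Apps that already label their elements for VoiceOver get Voice Control support essentially for free; the inaccessible elements we hit in testing were typically unlabeled custom views like this one before annotation.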
A major problem for Siri's growth has been its closed platform. Unlike with Alexa or Google Assistant, developers aren't able to build fully conversational experiences for Siri. For some time, SiriKit has let developers connect their apps to a few built-in intents, like ordering a taxi or sending a text message. This was, however, strictly controlled by Apple and blocked Siri from supporting more complex use cases.
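In that model, an app doesn't define the conversation - it only implements a handler for one of Apple's predefined intents, and Siri drives the whole dialog. A sketch for the messaging domain (the actual message delivery is the app's own code, omitted here):

```swift
import Intents

// SiriKit's domain approach: the app handles a built-in intent such as
// sending a message. The conversation shape is fixed by Apple - the app
// only fulfills the request once Siri has collected the details.
final class SendMessageHandler: NSObject, INSendMessageIntentHandling {
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand the resolved message off to the app's messaging stack (omitted),
        // then report success back to Siri.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```

Anything outside the handful of supported domains simply had no place to plug in, which is the limitation the shortcuts work addresses.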
What we're seeing in the recent developments of Siri is the growing importance of shortcuts as a way for third-party apps to work inside Siri. Last year, Apple launched an SDK that allows developers to donate Siri Shortcuts for actions within their apps. This brought a lot of new possibilities for third-party Siri interactions (still very simple ones, though).
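The simplest way to donate such a shortcut is through `NSUserActivity`: the app marks an in-app action as eligible for prediction, and the system can then surface it in Siri Suggestions or let the user attach a voice phrase to it. A sketch, with an illustrative activity type and phrasing:

```swift
import Intents
import UIKit

// Donates a shortcut for an in-app action so it can appear in Siri
// Suggestions and be recorded as a Siri Shortcut.
// The activity type string and titles are made up for this example.
func donateOrderCoffeeShortcut(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.coffee.order")
    activity.title = "Order my usual coffee"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true
    activity.suggestedInvocationPhrase = "Coffee time"

    // Attaching the activity to the current view controller makes it
    // the "current" activity, which records the donation.
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```

Note how limited this is: invoking the shortcut essentially replays one action, typically by reopening the app at the right place - there's no back-and-forth with the user.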
iOS 13 takes the idea of shortcuts further - instead of simple interactions that end with opening an app, we'll be able to create what Apple calls "Conversational Shortcuts". Just like with Alexa Skills or Google Actions, developers will be able to create shortcuts that involve simple conversations.
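The conversational part comes from parameter resolution on custom intents: when a parameter is missing, the handler can ask Siri to prompt the user for it. A rough sketch - in a real project `OrderCoffeeIntent` would be generated from an Intents definition file, so it's stubbed here only to keep the example self-contained:

```swift
import Intents

// Stub standing in for a custom intent normally generated by Xcode
// from an Intents definition file.
final class OrderCoffeeIntent: INIntent {
    var size: String?
}

final class OrderCoffeeIntentHandler: NSObject {
    // In iOS 13, resolution methods like this one drive the dialog:
    // Siri can ask a follow-up question to fill in a missing parameter,
    // which is what makes the shortcut "conversational".
    func resolveSize(for intent: OrderCoffeeIntent,
                     with completion: @escaping (INStringResolutionResult) -> Void) {
        if let size = intent.size {
            completion(.success(with: size))
        } else {
            completion(.needsValue())   // Siri prompts the user, e.g. "Which size?"
        }
    }
}
```

It's still far from the free-form conversations Alexa Skills allow, but it's the first time a third-party exchange with Siri can span more than one turn.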
We can still see Apple focusing Siri on iOS rather than on smart speakers. Looking at the bigger picture, that's what we should expect from them. Amazon Echo's and Google Home's popularity is largely driven by sales of low-priced smart speakers, an area where Apple doesn't want to compete. Instead, they seem to treat Siri as added value for their smartphone-centered ecosystem.
What Apple emphasized a lot during WWDC was the new Siri voice. They've put a lot of effort into making it sound more natural.
It's great for users - after all, how we hear voice assistants matters a lot, and a robotic voice can be tiring and unappealing. Personally, I was hoping that Apple would make more progress on running Siri locally on the device (just like Google Assistant will soon) and on the speed of voice recognition. It's still not there, but it will have to happen soon in order to keep up with Google Assistant's UX.
We can see Siri slowly trying to catch up with the competition. There's a lot to work on, especially given the latest Google Assistant developments, but it's great to see progress.
What's most exciting is seeing Siri opening up to developers via Siri Shortcuts. They seem like a good starting point for Apple, especially given that most Siri interactions still happen on mobile.
And of course Voice Control, a small feature that shows that voice is not only about convenience - it can change many lives.