The most fascinating feature unveiled at Google I/O, Google’s annual developer conference, was a Google Assistant feature called ‘Continued Conversations.’
If you’ve used a voice assistant of any sort, be it Alexa, Google Assistant, Siri or even Samsung’s still-in-beta Bixby, you’re aware that the interaction is not fun. Unless you’re issuing simple commands (turn on the flashlight, call mom, etc.), these digital assistants feel more like a voice UI than a natural mode of interaction. It’s as if you’re replacing typing with speech-to-text. It’s faster than typing at times, but it’s not necessarily the most convenient or intuitive option.
Continued Conversations changes all of this. The feature, coming to the new flagship Pixel devices later this year – most likely in October – will allow you to have a conversational interaction with Google Assistant.
It’s hard to describe just how awesome this feature is, so we’ll just let the video do the talking, literally.
From responding contextually to performing complex tasks within apps, the functionality that the new Assistant brings to the table means that we’re looking at the next generation of smartphone UI.
That killer feature that will rid you of your screen habit? You’re looking at it.
Key to this monumental achievement is a tonne of geekery that, in simple terms, has shrunk a complex algorithm that takes 100 GB of data and runs on a server farm down to something small enough and light enough to run on your smartphone. It’s hard to overstate exactly how epic this achievement is.
Running that algorithm on your device means that you no longer need a data connection, that there’s basically zero latency and that your data can be very private indeed. The best part is that the algorithm can learn from your usage and adapt as needed.
Better still, not once do you have to invoke Assistant with something as archaic as ‘Hey Google’.
Assistant will also get more personal. It’ll learn from your actions and provide contextually aware responses to your queries. ‘Book me a cab to Mom’s House’, for example, could refer to the address where your mom lives or to a shopping complex called “Mom’s House”. The new, smarter Assistant will figure out which one you’re referring to by learning from your interactions with the phone.
You could even present a query like “What should I make for dinner?” and Assistant will give you suggestions based on your past preferences. It’ll also have the sense to recommend breakfast recipes in the morning and dinner recipes at night, for example.
Duplex is coming to the web. Yes, Duplex, that feature Google demoed last year where your phone could call up a restaurant and book a dinner reservation for you, sparing you the trouble of having an actual conversation.
In its web avatar, Duplex will do things like autofill forms with a mere tap or voice command. Say you’re booking a rental car: just ask Assistant to ‘Book a car with National for my next trip’ and Assistant will do all the grunt work for you. Everything from navigating to the web page to filling in your personal data, email ID, dates and times will be handled directly by Google.
Assistant will get more inclusive by learning the needs of the differently-abled. Whether you have a speech impairment or are limited to facial gestures owing to a stroke or debilitating disease, Google is working on designing Assistant in such a way that it will learn that person’s speaking style or gestures and perform actions as needed.
A person who is, say, paralysed can train Assistant to say the word ‘Water’ when they open their mouth. A person with a thick accent could also train Assistant to understand their manner of speaking and provide better transcription.
A more convenient driving mode is coming soon. In the new driving mode, Assistant will switch to a much simpler, voice-friendly UI. Features that you’d need while driving, for example, are highlighted. These include navigation, messaging and media.
When entering the mode, you’ll also see intelligent suggestions based on your calendar or routine. A scheduled meeting at a hotel, for example, will prompt Assistant to suggest the hotel as your destination.
This feature should be arriving “this summer” on phones that support Google Assistant.
By far the most exciting feature, for a sleepyhead like me anyway, is that you can just shout ‘Stop!’ when that pesky alarm rings at 6 in the morning. You no longer need to blindly flail about in a half-asleep state as you desperately try to turn off the alarm that disturbed your slumber.
This feature is coming to Google Assistant-supported smart displays and speakers.