Google Assistant is already pretty handy, filling in your payment info on takeout orders, helping get the kids to school on time, and controlling your stereo system’s volume and your home’s smart light schedules. At the I/O 2022 developer conference on Wednesday, company executives showed off some of the new features arriving soon for the AI.
The first of these is “Look and Talk.” Instead of having to repeatedly start your requests to Assistant with “Hey Google,” this new feature relies on computer vision and voice matching to recognize when the user is addressing the device. As Google’s VP of Assistant, Sissie Hsiao, explained on stage, all the user has to do is look at their Nest Hub and state their request. Google is also developing a series of quick phrases that users will be able to say without having to gaze longingly at their tablet screen or say “Hey Google” first, things like “turn on the lights” and “set a ten-minute alarm.”
All of the data captured in that interaction, specifically the face and voice prints used to verify the user, is processed locally on the Hub itself, Hsiao continued, and not shared with Google “or anyone else.” What’s more, you’ll have to specifically opt in to the feature before you can use it.
According to Hsiao, the backend of this process relies on a half dozen machine learning models and 100 signals from the camera and mic (proximity, head orientation and gaze direction among them) to ensure that the machine knows when you’re talking to it versus merely talking in front of it. The company also says it has worked extensively to make sure the system works for people across the full spectrum of human skin tones.
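To give a rough sense of what “fusing signals to infer engagement” means, here is a deliberately simplified sketch. Google has not published its models or thresholds, so every function name, signal, and cutoff below is a hypothetical illustration of the general idea, not the actual system: several per-signal checks are combined, and only when all agree does the device treat speech as directed at it.

```python
def looks_engaged(proximity_m: float,
                  head_yaw_deg: float,
                  gaze_on_screen: bool,
                  voice_match: bool) -> bool:
    """Toy fusion of a few engagement signals.

    All thresholds are made up for illustration; the real system
    reportedly combines ~100 signals via machine learning models.
    """
    return (
        proximity_m < 1.5            # user is close enough to the device
        and abs(head_yaw_deg) < 20   # head is roughly facing the screen
        and gaze_on_screen           # a gaze classifier says they're looking
        and voice_match              # voice print matches an enrolled user
    )

# Someone nearby, facing the screen, recognized by voice: engaged.
print(looks_engaged(1.0, 5.0, True, True))   # True
# Same person talking from across the room: ignored.
print(looks_engaged(3.0, 5.0, True, True))   # False
```

In practice a learned model would weigh soft confidences rather than apply hard boolean cutoffs, but the shape of the problem, many noisy inputs reduced to one talk-to-me decision, is the same.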