Google’s life-changing AR smart glasses demo gave me shivers
When Viva and Yoshiko, one an English speaker and the other a Mandarin Chinese speaker, tried Google’s prototype smart glasses during a sneak peek at Google I/O 2022, I got shivers down my spine. The expressions on their faces, and the sudden ability to meaningfully communicate with someone who until then hadn’t fully understood what was being said, moved me a great deal. It’s an example of the type of technology I truly love: one that can change lives.
‘Subtitles for the world’
You’d be forgiven if you missed this special moment during Google I/O, as it came right at the end of the marathon event and lasted just a few minutes. The smart glasses were never named, were only shown in a demonstration video, and were really only revealed as a concept. Google didn’t even show us the interface itself, or hint that the smart glasses would ever be released as an actual product.
It didn’t have to. Google sold the dream. Worn like a normal pair of glasses, the lens incorporates a small screen that shows a real-time translation of another language in augmented reality (AR), overlaying it on what you see normally. Google product manager Max Spear summed up the functionality perfectly, saying it was like “subtitles for the world.” When you’re seated opposite someone who doesn’t speak the same language as you, the glasses provide a text-based translation of the conversation as it happens.
You may be thinking it’s similar to other translation technology — Google’s Pixel Buds have a translation feature, for example — but there are some distinct advantages here. For a start, seeing text on a screen inside a pair of glasses means you can maintain eye contact, and you can follow along without pressing buttons or sitting through a long, awkward silence while a machine translates what’s being said. Text is less intrusive than hearing another voice, and because no one hears the translation, it doesn’t feel unnatural.
Potential uses
Anyone who has traveled abroad, or spent time in communities where languages differ, will instantly understand how this kind of technology would be of benefit. I got shivers not only because of the joy on the faces of those testing the glasses, but because I immediately thought of how many times my own life would have changed if I’d had access to the same technology.
I remember having dinner with a friend in Japan, and although we both had a basic command of each other’s language, the conversation couldn’t flow. We ended up using Google Translate on a phone, typing rather than speaking into the app because the environment was quite noisy. It worked and was quite fun, but it wasn’t perfect, and at times it was pretty awkward. The smart glasses would have changed that situation completely.
I lived in Greece for many years, and while I understand a fair amount of Greek, I can’t speak it well at all. I wonder how Google’s smart glasses and the translation system would have changed my time there? But as I think about both these situations, and the many others I’ve personally come across where the smart glasses would have been really helpful, I quickly run into the big barrier facing not just Google’s prototype glasses, but any wearable translation tech.
The problem with any wearable device providing visual translations for two people who speak different languages is that all parties need to own and wear one. Conversations are two-way things, and if only one of you understands what’s being said, the glasses are only useful in situations where a response isn’t required. So, to make them work for my scenarios, everyone I know in Japan and Greece would have to wear smart glasses with Google’s translation technology inside. That seems … unlikely.
Where they would work without all parties wearing them is not for translation, but for transcription. This kind of visual transcription and enhancement could clearly be life-changing for Deaf or hard-of-hearing people. My father wears hearing aids, but I know he’d benefit from “seeing” the words, and wouldn’t miss the annoying audio feedback that comes with hearing aids in some situations. It reminds me of how transformational products like the eSight smart glasses are for blind and partially sighted people.
Probably not Google Glass 2
As much as I’d like to think we’re seeing an early version of Google Glass 2, I don’t think we are. Instead, we’ve seen an amazing demonstration of Google’s rapid advancements in the speed and accuracy of its translation and transcription tech.
There were several other examples of Google’s language skills improving during Google I/O 2022. It was announced that another 24 languages have been added to Google Translate, serving more than 300 million additional people around the world and taking the total to 133 supported languages. The expansion was made possible by a new AI technique called Zero-Shot Machine Translation, which learns to translate a new language without ever seeing direct translation examples, even when the available source material is limited.
Google’s AI is also getting better at understanding natural language and the way it’s used in general, as seen with the Look and Talk feature on the Nest Hub Max, also announced during I/O 2022. That’s before anyone with a Google Pixel 6 tries out Google Assistant’s ability to transcribe their voice into message replies, or watches video with live translations. Both are fast and, particularly in the case of the message replies, shockingly accurate.
I use Google Translate across different devices every day, usually translating Japanese, Korean, and Chinese into English. These are challenging languages to translate, and using the results effectively in conversation really requires some knowledge of how each language works, otherwise embarrassing mistakes will be made. Hearing, and now seeing, how Google is innovating and improving its translation tech means my world continues to open up even more, and I think it will slowly make actually learning those languages easier too.
Integrating Google’s enhanced language and translation technology through a pair of smart glasses is enormously powerful. If I can immediately understand the benefits that would bring to me and those close to me, I can only just begin to understand the excitement someone who can’t hear will feel. You can keep the Pixel 7 and Pixel Watch — demonstrations of incredible future technology like this are the reason I sit through more than two hours of Google I/O keynote presentations, and those shivers as I begin to understand how potentially transformational it all could be are my reward.