At Google I/O, the live-translation glasses blew our minds


Forget Google Glass. Google's prototype glasses translate speech in real time.

Image credits: Future

Yes, you read that correctly. At Google I/O 2022, Google finally blew our minds with eyeglasses that translate what someone is saying, right in front of your eyes — if you’re wearing the glasses, of course.

The glasses look completely natural, work without a phone, and have almost made us forget about Google Glass.


In truth, little is known about the glasses beyond a brief video presentation shown at the end of a nearly two-hour event — a Steve Jobs-style “one more thing” moment.

The glasses use augmented reality and artificial intelligence (along with presumably embedded cameras and microphones) to identify who is speaking to you, hear what they are saying, translate it, and display the translation in real time on translucent screens built into the eyeglass frames.

“Language is so crucial to interacting with one another,” Google CEO Sundar Pichai remarked, adding that following someone who speaks a different language “may be a significant challenge.”


According to Pichai, the prototype glasses (thankfully not called “Google Glass 3”) employ Google’s translation and transcription advances to place translated words “in your line of sight.”

In the video, a young woman explains that her mother speaks Mandarin while she speaks English. Her mother understands her but cannot respond in English.

The young woman puts on the black horn-rimmed glasses and sees what the researcher is saying transcribed on screen in yellow English characters. Granted, what we’re seeing in the video is an overlay, not what the woman is actually experiencing.


What is evident is that, unlike with Google Glass, no one is glancing upward at an awkwardly placed, tiny screen. The prototype lets users gaze straight at the speaker, with the words superimposed on top. One segment shows a point-of-view shot of the translation at work, though this too is a Google-produced graphic. Until the prototype leaves the lab, we won’t know how the glasses actually perform.

Google gave no completion or shipping date for this project. The glasses don’t even have a name yet. Still, it’s exciting to see normal-looking augmented reality glasses that could address very real-world problems, such as translating sign language for someone who doesn’t know it or displaying words for the hearing impaired.

“Kind of like subtitles for the world,” one researcher said in the video.

