Smart glasses: How they work and what’s next

For centuries, glasses’ primary purpose has been to improve our vision to 20/20. But now that 2020 is upon us, eyewear manufacturers and internet pioneers are joining forces to make our one-trick glasses smarter too.

What the hell are smart glasses? Simply put, they are an attempt to bring the wireless connectivity and imagery we enjoy on our computers and cell phones into the frames and lenses of our glasses.


Just as we can no longer imagine living without a laptop or mobile phone, we will soon be able to enjoy the same versatility and connectivity from our glasses and even our contact lenses. A real revelation, don’t you think?

Google Glass leads the way
Google was the first to launch this new vision for glasses in 2013 with the introduction of Google Glass Explorer, hoping to capitalize on the popularity of smartwatches and other wireless wearable devices.

Unfortunately, Explorer proved too geeky, clunky and expensive ($1,500) for most people, prompting Google to pull it from the market after 18 months.

How smart glasses work
However, Google Glass turned out to be a valuable archetype of smart glasses that other tech players would soon perfect. Here’s how Google Glass brought intelligence into smart glasses:

Sound: The speaker for wireless audio input and cell phone calls sits at the end of one of the temple arms. Audio is transferred to the ear through bone conduction rather than through the air via the ear canal.

Intelligence: The central processing unit (CPU), the brain of the device, sits inside one of the temple arms.

Microphone: The microphone for hands-free voice search and cell phone conversations is located under a hinge. Today, most smart glasses pair the microphone with a micro speaker so wearers can receive notifications and spoken feedback, as well as listen to music and podcasts.

Projector and prism: Located at the top of the lens, this projection method, called a curved mirror (or curved mirror combiner), displays partially transparent digital imagery without obscuring the real-world view. Some manufacturers now offer an alternative approach called holographic waveguide optics. This digital overlay of text and images in our field of vision is the key to the smart glasses experience.

Camera: While an obvious feature in our age of selfies, the camera lens on Google’s eyeglass temple brought with it a new and unwelcome issue: privacy. Many bystanders weren’t thrilled to be filmed and effectively recorded without their permission, a reaction that may have accelerated Explorer’s departure. While smart glasses manufacturers now make camera lenses small enough to fit discreetly into the frames of their products, some, including North’s Focals and Vue, now offer camera-free models.

Powered by touch, word or thoughts
Perhaps as fascinating as the visual overlays on smart glasses are the various ways you can control them.

Instead of the keyboard and mouse we’re all used to, we can control smart glasses by tapping, pressing, or sliding controls built into the frame, verbalizing our requests as we do with Alexa and Siri, and/or directing their displays through our phones or wearable devices like the Focals by North ring.

Other options available to smart eyewear manufacturers include recognition of head, eye and hand gestures, such as nodding or looking up or down, navigation via eye tracking, and even controlling our glasses with our thoughts (really!).

Smart glasses and better vision
The developers also haven’t overlooked the obvious visual role of all glasses: seeing better.

Several models have incorporated liquid crystal technology to let users adjust the level of brightness that comes through their smart lenses. Controlling the amount of ambient light from their surroundings also helps wearers optimize the visibility of their lenses’ digital overlays.

Filtering out glare on demand is a technological step beyond photochromic (transition) lenses and could make sunglasses superfluous.
