
(Image credit: Tete Escape)
It has been over a decade since Google unveiled its Glass smart glasses in 2013, only to withdraw them soon after, partly because of limited public acceptance. A quieter second version, aimed at workplace use, launched in 2017 and was discontinued in 2023.
In December 2025, Google made a fresh commitment to smart glasses, announcing two new products for launch in 2026. But why have Google's smart glasses struggled where others have succeeded? And can the company get it right on its third attempt?
These are the kinds of accessories that have evolved over centuries and are now accepted as everyday items in society.
Some of the latest academic research takes this approach, embedding sensors into accessories that consumers would actually choose to wear. Researchers have developed a measure of the social acceptability of wearable technology (the WEAR scale, or Wearable Acceptability Range), with items such as: "I think my friends would consider this device acceptable to wear."
Noreen Kelly, of Iowa State University, and colleagues found that the scale essentially measured two factors: that the device helped people achieve a goal (making it worth using), and that it did not raise concerns about privacy and perceived rudeness.
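To make the two-factor structure concrete, here is a rough sketch of how such a Likert-style instrument might be scored. The item names, the subscale grouping, and the 1-5 response range are invented for illustration; they are not the published WEAR scale items.

```python
# Illustrative scoring of a two-factor social-acceptability questionnaire,
# in the spirit of the WEAR scale described above. Item names and the
# subscale grouping are hypothetical, not the published instrument.

LIKERT_MIN, LIKERT_MAX = 1, 5

# Hypothetical items grouped into the two factors the research identified:
# perceived usefulness, and freedom from privacy/rudeness concerns.
SUBSCALES = {
    "usefulness": ["helps_me_achieve_goals", "worth_wearing_daily"],
    "social_comfort": ["friends_find_acceptable", "not_seen_as_rude"],
}

def score_wear(responses):
    """Average each subscale's 1-5 Likert responses; return per-factor means."""
    scores = {}
    for factor, items in SUBSCALES.items():
        values = [responses[item] for item in items]
        if any(not (LIKERT_MIN <= v <= LIKERT_MAX) for v in values):
            raise ValueError("Likert responses must be between 1 and 5")
        scores[factor] = sum(values) / len(values)
    return scores

# A respondent who finds the device useful but socially awkward.
example = {
    "helps_me_achieve_goals": 4,
    "worth_wearing_daily": 5,
    "friends_find_acceptable": 2,
    "not_seen_as_rude": 3,
}
print(score_wear(example))  # usefulness high, social comfort middling
```

A profile like this (high usefulness, low social comfort) is roughly the pattern the "Glasshole" era suggested for Google Glass.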
This second issue was memorably highlighted by the nickname that emerged for Google Glass wearers: Glassholes. While many studies have discussed the potential benefits of smart glasses, from mental health to use in surgery, privacy concerns and other reservations remain about today's smart glasses.
That said, look and feel is consistently cited as the top concern among potential customers. The most successful products have been designed to work as attractive accessories first, with the smart technology second; often, in fact, through well-known designer brands.
A fine spectacle
After Google Glass, Snapchat released camera-equipped smart glasses called Spectacles, which emphasized style and fitted more easily into social norms. The best-known smart glasses today are made by Meta (the parent company of Facebook) in partnership with fashion brands such as Ray-Ban and Oakley. These typically include front-facing cameras and conversational voice assistants powered by Meta AI.
So what should we expect from Google's smart glasses in 2026? Google has promised two products: one that is audio only, and one with "displays" projected onto the lenses (much like Google Glass).

The first version of Google Glass debuted in 2014.
The prevailing assumption (based on the marketing material) is that these will mark a notable change in form factor, moving from the futuristic, if somewhat unsettling and unfamiliar, design of Google Glass toward something resembling ordinary glasses.
Google's announcement also emphasized the inclusion of AI (indeed, it introduced them as "AI Glasses" rather than just smart glasses). Yet the two product types (audio-only AI Glasses and AI Glasses with visual displays) aren't especially novel, even combined with AI.
Meta's Ray-Ban range offers both, enhanced with voice interactions with its own AI. These have proved more successful than the recent Humane AI Pin, for example, which also featured forward-facing cameras, other sensors, and voice interaction with an AI agent. That device most closely resembled Star Trek's famous lapel communicators.
Direction of travel
The main avenues for innovation in this area are likely, first, to reduce the size of smart glasses, which have inevitably needed to be bulky to contain the electronics while retaining a seemingly normal form factor.
"Making glasses that you'll want to wear" is Google's approach, suggesting we may see innovation from the company focused simply on making smart glasses more attractive. It is also working with well-known brand partners. Google also announced tethered XR (extended reality) glasses, with a form factor dramatically smaller than the virtual reality headsets currently available.
Second, we can expect deeper integration with other Google products and services; Google has a larger portfolio of widely used products than Meta, notably Google Search, Google Maps, and Gmail. Its marketing materials show examples of viewing Google Maps information through the AI Glasses while navigating on foot.
Finally, and perhaps the most promising avenue for innovation, is integrating additional sensors, perhaps connecting with Google's other wearable health products, given the many projects it has under way, including the launch of its own smart ring products.
A large body of research has focused on what can be sensed from typical contact points on the head, including heart rate, temperature, and galvanic skin response (skin moisture, which changes with, for example, stress), and potentially even brain activity using EEG techniques. With current advances in consumer neurotechnology, we could plausibly see smart glasses using EEG to track brain data within the next several years.
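As a toy illustration of the signal processing such sensing implies, the sketch below smooths a synthetic galvanic skin response trace and counts abrupt rises as candidate stress responses. The trace values, window size, and rise threshold are all invented for the example and do not come from any real device.

```python
# Hypothetical sketch: spotting skin-conductance responses (sudden rises)
# in a galvanic skin response signal, the kind of head-contact sensing
# the article mentions. All numbers here are illustrative.

def moving_average(signal, window):
    """Smooth a signal with a simple centered moving average."""
    half = window // 2
    smoothed = []
    for i in range(len(signal)):
        lo = max(0, i - half)
        hi = min(len(signal), i + half + 1)
        smoothed.append(sum(signal[lo:hi]) / (hi - lo))
    return smoothed

def count_scr_events(signal, rise_threshold=0.05):
    """Count distinct runs of sudden rises (candidate stress responses)."""
    events = 0
    rising = False
    for prev, curr in zip(signal, signal[1:]):
        if curr - prev > rise_threshold:
            if not rising:  # start of a new rise
                events += 1
                rising = True
        else:
            rising = False
    return events

# Synthetic microsiemens trace: flat baseline with two abrupt rises.
trace = ([2.0] * 10
         + [2.0 + 0.1 * i for i in range(1, 6)]
         + [2.5] * 10
         + [2.5 + 0.1 * i for i in range(1, 6)]
         + [3.0] * 10)

smoothed = moving_average(trace, window=3)
print(count_scr_events(smoothed))  # two candidate stress responses
```

Real electrodermal pipelines are considerably more involved (tonic/phasic decomposition, artifact rejection), but the smooth-then-threshold shape of the problem is the same.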
This edited article is republished from The Conversation under a Creative Commons license. Read the original article.

Max L. Wilson, Associate Professor of Human-Computer Interaction, University of Nottingham
Max L. Wilson is an Associate Professor in Human-Computer Interaction in the Mixed Reality Lab, and Director of Student Experience in the School of Computer Science. His research, with EPSRC, European, and Google funding, focuses on using fNIRS brain data as a measure of mental workload and other cognitive activity, effectively a kind of personal measure for evaluating technology and tasks.
This work grew out of his earlier research into evaluating user interfaces for information interaction. Max sits on the steering committees of the ACM CHI and ACM CHIIR conferences, is a member of the SIGCHI Conferences Working Group, and is a Deputy Editor of the International Journal of Human-Computer Studies.