The Evolving Terminology of Smart Glasses: AI Glasses or XR Devices?
I recently attended Google’s Project Aura demo, where I posed a simple question: what do we call these innovative face-worn devices? To my surprise, this sparked a lively debate about the taxonomy of glasses-shaped computers. It seems that the term “smart glasses” is fading out, while “AI glasses” is gaining traction. However, there’s no consensus on this terminology.
The conversation began earlier this year when a Meta communications representative asked me to refer to Ray-Ban Meta glasses as “AI glasses.” Meta CEO Mark Zuckerberg and CTO Andrew Bosworth apparently see these glasses as an ideal platform for AI integration. Branding them as “AI devices” distinguishes them from earlier iterations like Google Glass and makes artificial intelligence, rather than augmented reality (AR), the headline selling point.
Different companies brand these products in varied ways; Xreal, for instance, calls its devices AR glasses. Google’s own definition of “AI glasses” also seems inconsistent. When I spoke with Juston Payne, Google’s director of product management for XR, he characterized AI glasses as stylish, lightweight devices that may or may not feature displays, with AI playing a crucial role in the user experience.
Interestingly, during the demo it became clear that Project Aura does not fit neatly into this classification. Google regards it as more akin to a headset, since it operates through a wired connection to a battery/trackpad puck. The company’s press release referred to it as “wired XR glasses,” underscoring the device’s broader capabilities.
There’s a certain logic to this classification. Project Aura is a collaboration with Xreal, which also makes products positioned between AI glasses and headsets. When I spoke with Xreal CEO Chi Xu, he humorously referred to the company’s lineup as AR glasses, at least keeping the nomenclature consistent on Xreal’s end.
The delineation between virtual reality (VR) and augmented reality (AR) has historically been clear-cut: VR immerses users in a fully virtual environment, while AR overlays digital information onto real-world surroundings. However, the introduction of mixed reality (MR) and extended reality (XR) has muddied these waters. MR refers to the blend of virtual and real environments, while XR serves as an umbrella term covering all of these technologies.
Traditionally, the categorization of these devices aligned with their form factors: VR required a headset, while AR lent itself better to glasses. That is changing. The latest headsets add mixed reality capabilities, blurring the old naming conventions. Project Aura, for instance, may meet the criteria for AR, yet it sits firmly within the MR category.
The evolving terminology suggests that we are headed toward a future where gadgets will be classified based on their usage scenarios instead of rigid definitions. AI glasses may become everyday wearables, ideal for short interactions, potentially replacing smartphones in the long run. Conversely, headsets are better suited for episodic use, tied to specific tasks or entertainment purposes.
Despite the shifting landscape, I remain skeptical that the term “AI glasses” will stick, as it may not fully capture everything these devices can do. Perhaps more precise terminology will emerge that highlights individual features, such as whether a pair of smart eyewear has a display, camera, AI functionality, or Bluetooth capabilities. As smart eyewear continues to evolve, the question remains: what should we collectively call this new generation of AI-infused face computers?
For more insights on this topic, you can read the original article at The Verge.
Image Credit: www.theverge.com