What You Need to Know
A recent examination of Android Auto’s code has led to intriguing revelations regarding potential integration with smart glasses. The code includes references to a “Glasses” option and suggests that users may soon have the capability to initiate navigation through their glasses. Notably, this discovery comes just after Google’s presentation at TED 2025, where the company highlighted its Android XR glasses and their advanced “memory” features.
This glimpse at the possible evolution of Google’s driving interface comes from an APK teardown reported by Android Authority. Within the new version of Android Auto, two noteworthy strings referencing “GLASSES” were identified. The first simply denotes a “Glasses Option.”
While this is a preliminary finding without detailed information, it seems to indicate that Google is developing a feature that may be included in forthcoming settings or functionalities.
The second string suggests that users could “Start navigation to launch Glasses.” This implies a potential link between Android Auto’s interface and a user’s AR smart glasses, offering real-time guidance while on the road, though exactly how that navigational information would be presented remains unclear.
This code was detected in version 14.2.151544 of Android Auto, suggesting that the integration is still in its infancy, and more developments are needed before it becomes available to consumers.
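APK teardowns like this one generally work by decoding an app’s compiled resources and scanning them for newly added string identifiers. As a rough illustration of that string-scanning step, here is a minimal Python sketch that pulls printable-ASCII runs out of a byte blob, much like the Unix `strings` tool. The byte blob and the identifier `GLASSES_OPTION_TITLE` are synthetic stand-ins for demonstration, not the actual contents of the Android Auto APK.

```python
import re

def extract_ascii_strings(blob: bytes, min_len: int = 4) -> list[str]:
    """Return printable-ASCII runs of at least min_len bytes, like `strings`."""
    return [m.group().decode("ascii")
            for m in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, blob)]

# Synthetic stand-in for decoded APK resource data (NOT the real file).
fake_resources = (
    b"\x00\x01GLASSES_OPTION_TITLE\x00"
    b"\x07Start navigation to launch Glasses\x02\x00"
    b"\xffunrelated_setting\x00"
)

hits = [s for s in extract_ascii_strings(fake_resources)
        if "glasses" in s.lower()]
print(hits)  # both "Glasses" strings surface, as in a real teardown
```

In practice, teardown reports use tools such as `aapt` or dedicated decompilers rather than raw byte scanning, since Android string resources are stored in a structured binary table; the principle of hunting for telltale new identifiers is the same.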
The Future of Google’s Android XR Glasses
The discovery of these code snippets dovetails perfectly with Google’s recent TED 2025 showcase, where the spotlight was on its innovative Android XR glasses. One of the standout features demonstrated involved the glasses’ augmented reality capabilities, particularly their ability to assist users with memory tasks. Product manager Nishtha Bhatia showcased this by querying the AI, Gemini, regarding the location of her hotel room key while wearing the glasses. The AI accurately directed her to its location, demonstrating the potential of blending AI with augmented reality.
During the presentation, Google emphasized that these glasses operate in conjunction with smartphones, enabling seamless access to applications while keeping the device lightweight. This suggests that the Android Auto integration may similarly prioritize efficiency, potentially accessing essential data directly from the vehicle’s onboard systems rather than requiring extensive processing within the glasses themselves.
Google’s previous demonstrations have also included a range of designs for these glasses, from models featuring a single display lens to those with dual screens, indicating that there are several options in development. This diversity suggests a commitment to catering to different user needs and preferences as the company continues to refine its vision for augmented reality and automotive technology.
Source
www.androidcentral.com