At the latest edition of The Android Show: XR Edition, Google unveiled a development that promises to reshape the landscape for app developers. The company announced a unified platform for developers of both Android and Android XR, paving the way for seamless app deployment across upcoming wearable devices. The initiative coincides with progress on Samsung's smart glasses, a broader rollout of new SDK capabilities, and expanded game-engine support for the XR platform.
Android developers gain a unified pathway into the expanding XR ecosystem
One of the most noteworthy announcements from the show was the unification of app development across Android and Android XR: developers can now build an application once and deploy it across the entire Android XR device family. Although Google confirmed Samsung's involvement in developing smart glasses, specific details about the launch timeline and hardware specifications remain undisclosed.
Developers get early access to start building applications and services tailored for the upcoming AI glasses. Tools such as Jetpack Compose Glimmer enable lightweight overlays, while Jetpack Projected lets existing Android apps extend their functionality directly onto the smart glasses. Thanks to Gemini integration, users can look forward to features such as real-time translation, visual search, and contextual assistance, enhancing the overall experience.
Android XR expands compatibility to a wider range of third-party devices
The Android XR platform is also breaking new ground by expanding compatibility to a wider array of third-party devices. This includes wired XR glasses like Project Aura from XREAL, which is set to launch next year. This move not only broadens the range of supported display types but also encourages developers to craft experiences that adapt fluidly across varied display formats.
The developer preview of the Android XR SDK introduces headset API improvements that fully open up AI glasses development. The update also brings ARCore upgrades to Jetpack XR, which now offers geospatial capabilities for accurate wayfinding and location-based content. Additionally, the new Field of View API lets applications adapt their layout to different headsets and wired glasses, and developers can leverage Unreal Engine's native Android and OpenXR support to create new projects.
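To make the Field of View idea concrete, here is a minimal sketch of the kind of adaptive decision an app might make. Note that Google has not published the actual API surface: the `chooseLayout` function, the `LayoutMode` buckets, and the degree thresholds below are all illustrative assumptions, not part of the real Android XR SDK.

```java
// Illustrative sketch only: the real Field of View API shape is unannounced.
// Assumes the device reports a horizontal field of view in degrees; the
// thresholds and layout buckets are hypothetical, chosen for illustration.
public class FovLayout {

    // Hypothetical layout buckets an app might switch between.
    public enum LayoutMode { COMPACT, STANDARD, EXPANDED }

    // Map a reported horizontal FOV to a layout bucket.
    public static LayoutMode chooseLayout(double horizontalFovDegrees) {
        if (horizontalFovDegrees < 45) {
            return LayoutMode.COMPACT;   // narrow wired glasses: single column
        }
        if (horizontalFovDegrees < 90) {
            return LayoutMode.STANDARD;  // typical glasses: two panes
        }
        return LayoutMode.EXPANDED;      // wide headsets: full spatial layout
    }

    public static void main(String[] args) {
        System.out.println(chooseLayout(40));   // prints COMPACT
        System.out.println(chooseLayout(110));  // prints EXPANDED
    }
}
```

The point of the sketch is the design pattern the announcement implies: rather than hard-coding for one device, an app queries the display's field of view at runtime and picks a layout that degrades gracefully from wide headsets down to narrow wired glasses.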
With these advancements, Google is clearly setting the stage for a vibrant ecosystem where developers can thrive. As the Android and XR platforms continue to evolve, they promise to unlock a world of opportunities for users and developers alike.
Image Credit: www.androidheadlines.com