Google Launches New Tools and Libraries for Building AI Glasses Applications

2025-12-16

Google has introduced two new libraries, Jetpack Projected and Jetpack Compose Glimmer, with the release of the Android XR SDK Developer Preview 3, aimed at empowering developers to build immersive experiences for AI-powered smart glasses. Additionally, ARCore for Jetpack XR has been enhanced to support advanced capabilities such as motion tracking and geospatial positioning tailored for wearable eyewear.

The latest update to the Android XR SDK allows existing mobile applications to extend their functionality to AI glasses by leveraging the glasses' built-in hardware, including speakers, cameras, and microphones, while also displaying contextual information directly on the glasses' display when one is available.

In many use cases, an app needs to interact with the hardware of AI glasses. For instance, a video conferencing application could implement a UI element that lets users switch the video feed from the smartphone camera to the glasses-mounted camera, offering a first-person view during calls.
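The camera-switching scenario can be sketched as a small state machine: the app tracks the active video source and only switches to the glasses camera when one is actually present. All names here are illustrative stand-ins, not the preview SDK's actual API.

```kotlin
// Illustrative model of switching a call's video feed between the phone
// camera and a glasses-mounted camera. Names are hypothetical.
enum class VideoSource { PHONE_CAMERA, GLASSES_CAMERA }

class CallSession(private val glassesHasCamera: Boolean) {
    var source: VideoSource = VideoSource.PHONE_CAMERA
        private set

    // Switch to the glasses camera only if one is actually present,
    // falling back to the phone camera otherwise.
    fun toggleSource(): VideoSource {
        source = if (source == VideoSource.PHONE_CAMERA && glassesHasCamera) {
            VideoSource.GLASSES_CAMERA
        } else {
            VideoSource.PHONE_CAMERA
        }
        return source
    }
}
```

A real implementation would additionally re-bind the camera stream when the source changes; the sketch only captures the selection logic.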

The first library, Jetpack Projected, enables a primary device—such as an Android phone—to project XR content onto AI glasses using audio, video, or both. It provides APIs to detect whether the target glasses have a display and wait for its availability. Before accessing any hardware features, apps must request runtime permissions in line with the standard Android permission model.
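The gating logic described above — check whether the glasses have a display, wait for it to become ready, and require runtime permissions before touching hardware — might look like the following. These types are illustrative, not the actual Jetpack Projected classes.

```kotlin
// Hypothetical stand-in for a projected glasses device.
interface GlassesDevice {
    val hasDisplay: Boolean
    val displayReady: Boolean
}

// Simple sample implementation used for demonstration.
data class SimpleGlasses(
    override val hasDisplay: Boolean,
    override val displayReady: Boolean,
) : GlassesDevice

fun canProject(
    device: GlassesDevice,
    grantedPermissions: Set<String>,
    required: Set<String> = setOf("android.permission.CAMERA"),
): Boolean {
    // No display, or display not yet available: cannot project visual content.
    if (!device.hasDisplay || !device.displayReady) return false
    // Standard Android model: every hardware permission must be granted at runtime.
    return required.all { it in grantedPermissions }
}
```

In a real app the "wait for availability" step would be asynchronous (a callback or coroutine), rather than the synchronous check shown here.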

Access to AI glasses hardware is possible from either a dedicated glasses activity or a regular Android activity, provided a valid projection context is established. Audio integration is straightforward, since the glasses' audio system appears as a standard Bluetooth audio peripheral.

Camera access, however, involves more complexity. Capturing photos or videos requires initializing multiple classes to verify hardware support, configure settings, and bind the camera lifecycle to the host activity so it automatically starts and stops based on the app’s state.
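The lifecycle-binding pattern — the camera starts and stops automatically with the host activity's state — can be modeled in a few lines. This mirrors the pattern only; the actual classes in the preview SDK differ.

```kotlin
// Minimal model of lifecycle-aware camera binding: capture runs only while
// the bound activity is started and the glasses camera is supported.
enum class LifecycleState { CREATED, STARTED, STOPPED }

class GlassesCameraController(private val cameraSupported: Boolean) {
    var isCapturing = false
        private set

    // Called whenever the bound activity changes state.
    fun onLifecycleEvent(state: LifecycleState) {
        isCapturing = cameraSupported && state == LifecycleState.STARTED
    }
}
```

The benefit of this design is that the app never has to remember to release the camera manually: stopping the activity stops capture as a side effect.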

Jetpack Compose Glimmer, on the other hand, offers a suite of UI components and a cohesive visual language designed specifically for AI glasses with displays. This new design system uses optical see-through rendering to seamlessly blend digital overlays with the real-world environment, emphasizing clarity, readability, and minimal distraction. Supported UI elements include text labels, icons, header chips, cards, lists, and interactive buttons. All components are built upon the foundational concept of a surface, which developers can also extend to build custom, non-standard interfaces.

Glimmer components can be customized using modifiers to control layout, styling, and interactivity. They support z-axis stacking with shadow effects to create a sense of depth and spatial hierarchy. To streamline development, Google has also launched an AI glasses emulator within Android Studio, allowing developers to preview UIs and simulate user interactions such as touchpad gestures and voice commands.
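Glimmer's surface-plus-z-stacking idea can be illustrated with a small model: each component sits on a surface at a z-level, and higher levels render above lower ones. The names are hypothetical and do not reflect Glimmer's real composables.

```kotlin
// Illustrative model of surfaces stacked along the z-axis, as Glimmer's
// depth hierarchy describes. Hypothetical types, not the actual API.
data class Surface(val label: String, val zLevel: Int = 0)

// Lower z-levels draw first, so higher-level surfaces appear on top
// (where shadow effects would convey the sense of depth).
fun renderOrder(surfaces: List<Surface>): List<String> =
    surfaces.sortedBy { it.zLevel }.map { it.label }
```

In the real design system, depth would be expressed through modifiers on composables rather than an explicit sort, but the ordering principle is the same.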

Lastly, Google has expanded ARCore for Jetpack XR—a collection of APIs for building augmented reality experiences that includes plane detection and spatial anchoring. The updated version now supports motion tracking, enabling content to respond dynamically to head movements, and geospatial pose estimation, which lets developers anchor virtual content to real-world locations using positioning data derived from Google Street View.
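Conceptually, a geospatial anchor pairs content with a latitude/longitude/altitude pose, and the runtime decides whether it is close enough to the user to render. The sketch below uses a crude planar distance approximation; real geospatial APIs are far more sophisticated, and all field names are illustrative.

```kotlin
// Illustrative model of anchoring virtual content to a real-world location.
data class GeoPose(val latitude: Double, val longitude: Double, val altitudeMeters: Double)

class GeospatialAnchor(val pose: GeoPose, val content: String)

// Rough planar distance in meters between two poses, valid only over short
// ranges; used here to decide whether an anchor is nearby.
fun approxDistanceMeters(a: GeoPose, b: GeoPose): Double {
    val mPerDegLat = 111_320.0
    val mPerDegLon = 111_320.0 * Math.cos(Math.toRadians(a.latitude))
    val dLat = (b.latitude - a.latitude) * mPerDegLat
    val dLon = (b.longitude - a.longitude) * mPerDegLon
    return Math.sqrt(dLat * dLat + dLon * dLon)
}

fun shouldRender(anchor: GeospatialAnchor, user: GeoPose, rangeMeters: Double = 200.0): Boolean =
    approxDistanceMeters(anchor.pose, user) <= rangeMeters
```

The actual SDK resolves such anchors against a visual positioning model rather than a raw distance check, but the anchor-plus-pose data shape is the core idea.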

The Android XR SDK Developer Preview 3 is accessible in Android Studio Canary after upgrading to emulator version 36.4.3 Canary or later.