
"With the release of the Android XR SDK Developer Preview 3, Google has introduced two new libraries to help developers create AI Glasses experiences, Jetpack Projected and Jetpack Compose Glimmer. ARCore for Jetpack XR has also been expanded to work with AI Glasses, adding motion tracking and geospatial capabilities. The new libraries introduced with Android XR SDK Developer Preview 3 allow developers to extend existing mobile apps to interact with AI Glasses by leveraging their built-in speakers, camera, and microphone,"
"The first library, Jetpack Projected, enables a host device, such as an Android phone, to project an app's XR experience to AI Glasses using audio and/or video. The library allows apps to check whether the target device has a display and wait for it to become available for use. Before an app can access device hardware, it must request permission at runtime in accordance with the standard Android permission model."
Android XR SDK Developer Preview 3 provides Jetpack Projected and Jetpack Compose Glimmer to enable AI Glasses experiences and extends ARCore for Jetpack XR with motion tracking and geospatial capabilities. Jetpack Projected lets a host device project audio and video XR experiences to AI Glasses and detect display availability. Apps can access glasses hardware from both AI Glasses activities and standard apps with a projected context. Audio behaves like a standard Bluetooth audio device. Camera capture requires class instantiation, hardware checks, setup, and lifecycle binding so the camera opens and closes with the activity. Jetpack Compose Glimmer supplies UI components and a visual language for augmented wearable displays.
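The camera-capture sequence summarized above (instantiate, check hardware, set up, bind to the lifecycle so the camera opens and closes with the activity) resembles the lifecycle-aware pattern familiar from CameraX. Since the Projected camera API itself is not shown in the article, the sketch below models that pattern with a minimal stand-in lifecycle; all names are illustrative assumptions.

```kotlin
// Minimal stand-in for an Android lifecycle owner: observers receive
// lifecycle events such as ON_START and ON_STOP.
class LifecycleOwnerModel {
    private val observers = mutableListOf<(String) -> Unit>()
    fun observe(cb: (String) -> Unit) { observers += cb }
    fun emit(event: String) = observers.forEach { it(event) }
}

// Hypothetical projected camera: hardware is checked up front, and the
// open/close state is driven entirely by the bound lifecycle.
class ProjectedCamera(private val hardwarePresent: Boolean) {
    var isOpen = false
        private set

    fun bindTo(owner: LifecycleOwnerModel) {
        require(hardwarePresent) { "glasses report no camera" }
        owner.observe { event ->
            when (event) {
                "ON_START" -> isOpen = true   // camera opens with the activity
                "ON_STOP"  -> isOpen = false  // and closes with it
            }
        }
    }
}

fun main() {
    val activity = LifecycleOwnerModel()
    val camera = ProjectedCamera(hardwarePresent = true)
    camera.bindTo(activity)
    activity.emit("ON_START")
    println(camera.isOpen)  // true
    activity.emit("ON_STOP")
    println(camera.isOpen)  // false
}
```

Binding to the lifecycle rather than opening the camera manually means the app cannot leak the glasses' camera when the activity is stopped, which is the same rationale CameraX uses with `bindToLifecycle` on phones.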
Read at InfoQ