
"Last week, within the confines of Google's Hudson River office, I put on a pair of Android XR glasses and began to converse with Gemini as I walked around the room. These weren't the Warby Parker or Gentle Monster models that had been teased at Google I/O in May, but rather a developer kit that will soon be in the hands (and on the faces) of Android developers worldwide. The demos, ranging from visual assistance to gyroscopic navigation, progressed swiftly and, to my surprise, with high rationale. At one point, I asked Gemini to give me a fruit salad recipe with the pasta on the shelf, only for it to recommend a more traditional tomato sauce dish instead."
"Google's plan for AI glasses comes in two forms: one that's audio and camera only, similar to Meta's Ray-Bans, and another that integrates a display for visual cues and floating interfaces, like Meta's Ray-Ban Display. Clearly, there's some competition in the space. However, Google has one key advantage before it even launches: a well-established software ecosystem, with Developer Preview 3 of the Android X"
Google announced three advances in Android XR, led by display-capable AI glasses initially targeted at developers. A developer kit enables conversational multimodal interactions with Gemini, demonstrating visual assistance and gyroscopic navigation. Demonstrations showed contextual reasoning and pragmatic suggestions during real-world queries. Samsung's Galaxy XR headset and Xreal's Project Aura received updates to improve immersive experiences and enable seamless transitions between wearables. Most devices will leverage Android phones and smartwatches for added functionality. Google plans two approaches to AI glasses: audio-and-camera-only models and display-integrated models with floating interfaces, supported by Developer Preview 3 of the Android XR platform.
Read at ZDNET