Google is rolling out Gemini's real-time AI video features
Briefly

Google has rolled out new AI features for Gemini Live that enable real-time interaction by letting the assistant 'see' a user's screen or camera feed. According to spokesperson Alex Joseph, the capabilities, first demonstrated through Project Astra, include a screen-reading feature and a live video function that interprets the camera feed to help with tasks such as picking a color for pottery. The rollout puts Google ahead in the AI assistant race, as competitors like Amazon and Apple are still working on their upgraded assistants.
Google has started rolling out new AI features to Gemini Live that let it "see" your screen or your smartphone's camera feed and answer questions about either in real time.
The screen-reading feature is one of two Project Astra capabilities; the other rolling out now is live video, which lets Gemini interpret the feed from your smartphone camera in real time and answer questions about it.
Read at The Verge