Your Meta Ray-Ban smart glasses just got a massive AI overhaul
Briefly

Meta's glasses already include AI features, but the latest update adds video support, letting the assistant hold natural conversations by 'seeing' what users see.
During a live AI session, users can seamlessly ask contextual questions about things like recipes or local landmarks, getting real-time assistance as they navigate their surroundings.
The new live translation feature translates speech in real time between English and Spanish, French, or Italian, delivering the result through the glasses as audio or as a transcript.
Users will no longer need to start every question with 'Hey Meta,' allowing a more fluid dialogue, and future updates aim to anticipate questions even before they're asked.
Read at ZDNET