Meta's smart glasses can now describe what you're seeing in more detail
Briefly

Meta has introduced new features for its Ray-Ban smart glasses aimed at improving accessibility for blind and low-vision users, coinciding with Global Accessibility Awareness Day. Meta AI can now be customized to give more detailed environmental descriptions, helping users better understand their surroundings. Additionally, the 'Call a Volunteer' feature connects users with a network of over 8 million sighted volunteers for real-time assistance with everyday tasks. The features are currently rolling out in the US and Canada, and Meta plans to expand them to other markets in the future.
Meta AI's new features for Ray-Ban smart glasses enhance accessibility, providing detailed environmental descriptions and connecting blind users to a large volunteer network.
The new customizable AI responses will help users understand their environment better, an important development for blind and low-vision communities.
By linking users to over 8 million sighted volunteers, the glasses can aid in various tasks, demonstrating Meta's commitment to inclusive technology.
The Call a Volunteer feature, rolling out to all Meta AI-supported countries, lets users receive immediate help with everyday challenges.
Read at The Verge