"Standing in a blue, skateshop-themed room in New York City-one of a few Meta pop-up stores across the country-I stared helplessly at the employee beside me whose instructions I could no longer hear. The glasses I tried on are the tech giant's latest attempt at "smart" eyewear, a subcategory of the internet-enabled wearable devices that entered the mainstream more than a decade ago. Powered by AI, they are operated with a second accessory called the Neural Band, a kind of fabric controller"
"But the real selling point was AI-embedded in the physical device is a more personalized version of Meta's proprietary chatbot. Theoretically, wearers can point to objects in their field of vision and ask the glasses for live context (although that feature seemed to be hindered by spotty Wi-Fi when I tried it). When I asked aloud how long I could reasonably keep a package of raw chicken in my refrigerator, an answer appeared on the lens' display: 1-2 days."
Meta's new AI-powered smart glasses pair with a Neural Band wrist controller to offer visual menus, rudimentary mapping, photo capture, live transcription, and an on-device chatbot. The Neural Band senses hand motions to navigate floating digital menus and trigger functions. Mapping and photo capture worked intermittently during demonstrations, while live-caption transcription proved finicky and slow. The on-device AI can identify objects and provide contextual answers, but those features depended on network quality and performed inconsistently. Early public demonstrations also revealed command-recognition failures, underscoring reliability and connectivity challenges for everyday use.
Read at The Atlantic