These Halo smart glasses just got a major memory boost, thanks to Liquid AI
Briefly

"MIT-born Liquid AI is a foundation model company that has developed Liquid Foundation Models (LFMs). The LFM2-VL series can take text and images of various resolutions and transform them into what the company says are "detailed, accurate, and creative description of the scenes provided by a camera sensor with millisecond latency." Through the agreement, Brilliant Labs will license both current and future multimodal Liquid foundation models (LFMs) to optimize how its AI glasses understand scenes fed to them."
"Smart glasses are often regarded as the best form factor for AI, as they can feed AI everything you see at every moment for the best assistance. However, this is only possible if the glasses can accurately interpret the visual content they are fed. The ability to accurately interpret the world around them is especially necessary with the Halo AI glasses, which feature a long-term agentic memory that can create a personalized knowledge base for the user and analyze life context for future questions."
Brilliant Labs will integrate Liquid AI's vision-language foundation models into its Halo AI smart glasses. Liquid AI's LFM2-VL series accepts text and images at various resolutions and returns detailed, accurate, and creative descriptions of camera-captured scenes with millisecond latency. Brilliant Labs will license both current and future multimodal LFMs to optimize scene interpretation in its products. The Halo AI glasses include a long-term agentic memory that builds a personalized knowledge base and analyzes life context for future queries, so accurate visual interpretation is essential for delivering meaningful agentic experiences and effective real-time assistance.
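For a rough sense of what scene description with a small vision-language model looks like, here is a minimal sketch. It assumes an LFM2-VL checkpoint is published on Hugging Face and works with the transformers image-text-to-text interface; the model ID "LiquidAI/LFM2-VL-450M", the prompt, the file name, and the generation settings are all illustrative assumptions, not Brilliant Labs' actual Halo pipeline.

```python
# Minimal sketch: ask a small vision-language model to describe a single camera frame.
# Assumptions: the "LiquidAI/LFM2-VL-450M" checkpoint name and its support in
# transformers' image-text-to-text interface are assumed; prompt and settings are illustrative.
from transformers import AutoModelForImageTextToText, AutoProcessor
from PIL import Image

model_id = "LiquidAI/LFM2-VL-450M"  # assumed checkpoint name
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForImageTextToText.from_pretrained(model_id)

frame = Image.open("camera_frame.jpg")  # one frame captured by the glasses' camera

conversation = [
    {
        "role": "user",
        "content": [
            {"type": "image", "image": frame},
            {"type": "text", "text": "Describe this scene in one or two sentences."},
        ],
    }
]

# Build model inputs from the chat template, then generate a short description.
inputs = processor.apply_chat_template(
    conversation,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
)
output_ids = model.generate(**inputs, max_new_tokens=64)
print(processor.batch_decode(output_ids, skip_special_tokens=True)[0])
```

In a glasses-style deployment, a loop over incoming frames would feed each description into the device's memory store; the single-frame call above only illustrates the input-to-description step the article refers to.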
Read at ZDNET