
"Zuckerberg: What do you think? Maybe let's make, I don't know, what should we make? Maybe like a steak sauce, maybe Korean-inspired type thing? You know, just to show what the LiveAI is like. EMPLOYEE: Yeah, let's try it. It's not something I've made before, so I could definitely use the help. Hey, Meta, start live AI. [four-second pause] LiveAI: Starting LiveAI. I love the setup you have here with soy sauce and other ingredients. How can I help?"
"LiveAI: You can make a Korean-inspired steak sauce using soy sauce, sesame oil EMPLOYEE: What do I do first? [five-second pause] EMPLOYEE: What do I do first? LiveAI: You've already combined the base ingredients, so now grate a pear to add to the sauce. [audience laughter] LiveAI: You've already combined the base ingredients, so now grate the pear and gently combine it with the base sauce. EMPLOYEE: All right, I think the wifi might be messed up. Sorry, back to you, Mark."
A stage demo of Meta's AI-powered smart glasses was meant to show LiveAI assisting with a Korean-inspired steak sauce. A remote employee in a kitchen activated LiveAI, which opened with a compliment on the ingredient setup. It then began listing base ingredients before repeatedly instructing the employee to grate a pear, as though earlier steps had already been completed, with long pauses between prompts. The employee and the audience reacted to the lag and repetition, and Wi-Fi problems were cited as the likely cause. A later attempt to answer an incoming video call through the glasses also failed. The demo underscored the connectivity and reliability challenges facing live assistive AI features in consumer wearables.
Read at www.mediaite.com