
""Please do not use Google AI to find out our specials. Please go on our Facebook page or our website," the restaurant wrote in a weary Facebook post. "Google AI is not accurate and is telling people specials that do not exist which is causing angry customers yelling at our employees." "We cannot control what Google posts or says," the post added, "and we will not honor the Google AI specials.""
"AI chatbots and other large language models remain incredibly prone to hallucinating, which is the industry euphemism for generating plausible-sounding misinformation. Google's AI Overviews have been mocked for being especially wonky, including for its recommendation that you should put glue on pizza. But evidently, many remain unaware of the inherent unreliability of these tools. Eva Gannon, whose family owns the restaurant, told First Alert that Google's AI kept telling customers about deals that weren't real and even made up entire menu items."
Google's AI Overviews presented fabricated specials for a Montana restaurant, prompting the owners to ask customers to consult the restaurant's Facebook page or website instead. The inaccurate AI summaries claimed nonexistent deals and invented menu items, provoking angry customers and confrontations with staff. The restaurant publicly stated that it cannot control Google AI content and will not honor AI-generated specials. Large language models commonly hallucinate, generating plausible-sounding but false information; Google's AI Overviews have been mocked for egregious errors, including advising glue on pizza. Many users remain unaware of AI unreliability, and the owners reported one example in which the AI claimed a large pizza was priced as a small.
Read at Futurism