Google already knows how to make AI in Search helpful, but it's not with AI Overviews
Briefly

Generative AI in Search can spread misinformation because it confidently presents incorrect facts. Examples include advising glue on pizza, recommending cooking chicken to 102 degrees F, mentioning 'blinker fluid,' suggesting oil to put out a fire, and claiming no African country starts with 'K.'
Using AI to quickly summarize information in Search seems sensible amid the rise of SEO spam. However, AI struggles to tell truth from falsehood, and determining factuality is precisely where it falls short.
Read at 9to5Google