
"Only 9% of Americans are using AI chatbots like ChatGPT or Gemini as a news source, with 2% using AI to get news often, 7% sometimes, 16% rarely, and 75% never, Pew found. Even those who do use it for news are having trouble trusting it. A third of those who use AI as a news source say it's difficult to distinguish what is true from false. The largest share of respondents, 42%, is not sure whether it's determinable."
"The report calls into question AI's role in areas it has yet to take over -- and why. Certain forms of data, especially when properly structured (or organized), are easier for AI to engage with and keep accurate, but chatbots still tend to hallucinate, especially with text-based data like news. Unlike commonly understood facts that appear often in text data -- like a famous person's birthday, or the capital of New York -- news can contain fast-developing stories, differing opinions presented as contradictory facts, and varying article structures that make organizing data hard to standardize for a chatbot ingesting it."
Most Americans do not use AI chatbots as a news source; only 9% rely on them at least sometimes (2% often, 7% sometimes, 16% rarely, 75% never). Users who consult AI for news often distrust its accuracy: one-third find it difficult to discern true from false, 42% are unsure, and half report encountering news they believe inaccurate. Younger people use AI more but also more often detect inaccuracies. AI handles structured data better than evolving text-based news, yet it still hallucinates and struggles with fast-developing stories, conflicting opinions, and varied article structures.
Read at ZDNET