
"The report, based on an anonymized analysis of ChatGPT interactions and a user survey, also sheds light on some of the specific ways people are using AI to navigate the sometimes complex intricacies of healthcare. Some are prompting ChatGPT with queries regarding insurance denial appeals and possible overcharges, for example, while others are describing their symptoms, hoping to receive a diagnosis or treatment advice."
"Last spring, an analysis conducted by Harvard Business Review found that psychological therapy was the most common use of generative AI. What's most jarring about the report is the sheer scale at which users are turning to ChatGPT for medical advice. It also underscores some urgent questions about the safety of this type of AI use at a time when many millions of Americans are suddenly facing new and major healthcare-related challenges."
More than 40 million people worldwide rely on ChatGPT for daily medical advice, and five percent of ChatGPT messages globally concern healthcare. Users describe symptoms, seek diagnoses and treatment advice, and request help with insurance denial appeals and overcharge disputes. Generative chatbots such as ChatGPT, Google's Gemini, and Microsoft's Copilot increasingly serve as confidants for sensitive personal matters, with psychological therapy emerging as a common use of generative AI. The sheer scale of medical queries raises urgent safety concerns, because chatbots can provide dangerously inaccurate information. Legal disputes over AI training data have also arisen.
Read at ZDNET