
"I recently had a patient come in, and when I recommended a medication, they had a dialogue printed out from ChatGPT that said this medication has a 45% chance of pulmonary embolism,"
"I think it's great," Dr. Bari said. "It is something that's already happening, so formalizing it so as to protect patient information and put some safeguards around it [...] is going to make it all the more powerful for patients to use."
"All of a sudden there's medical data transferring from HIPAA compliant organizations to non-HIPAA compliant vendors," Itai Schwartz, co-founder of data loss prevention firm MIND, told TechCrunch. "So I'm curious to see how the regulators would approach this."
Dr. Sina Bari encountered a patient presenting a ChatGPT printout claiming a recommended medication carried a 45% chance of pulmonary embolism. That statistic originated from a study of a niche tuberculosis subgroup and did not apply to the patient. OpenAI plans ChatGPT Health, a private chat mode that will not use user messages for model training and will let users upload medical records and sync Apple Health and MyFitnessPal. Security experts warn that health data may flow from HIPAA-compliant organizations to non-HIPAA-compliant vendors, raising regulatory and data-loss concerns. Over 230 million people reportedly consult ChatGPT about health each week.
Read at TechCrunch