AI detects cancer but it's also reading who you are
Briefly

"A pathologist studies an extremely thin slice of human tissue under a microscope, searching for visual signs that reveal whether cancer is present and, if so, what type and stage it has reached. To a trained specialist, examining a pink, swirling tissue sample dotted with purple cells is like grading a test without a name on it -- the slide contains vital information about the disease, but it offers no clues about who the patient is."
"That assumption does not fully apply to artificial intelligence systems now entering pathology labs. A new study led by researchers at Harvard Medical School shows that pathology AI models can infer demographic details directly from tissue slides. This unexpected ability can introduce bias into cancer diagnosis across different patient groups. After evaluating several widely used AI models designed to identify cancer, the researchers found that these systems did not perform equally for all patients. Diagnostic accuracy varied based on patients' self-reported race, gender, and age."
Pathology relies on microscopic examination of thin tissue slices to determine cancer presence, type, and stage. To a trained specialist, a slide conveys disease information but not patient identity. Artificial intelligence models applied to pathology slides, however, can infer demographic details directly from tissue images, creating hidden biases. Several widely used diagnostic models showed unequal accuracy across patients' self-reported race, gender, and age. The researchers identified three key drivers of these disparities and showed that a smarter training approach substantially narrowed the performance gaps. They conclude that routine bias evaluation of medical AI is needed to ensure fair, reliable cancer care for all patients.
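The digest itself contains no code, but the kind of subgroup bias audit it describes can be sketched in a few lines of Python. Everything below is illustrative: the accuracy_by_group helper and the toy prediction, label, and group arrays are hypothetical stand-ins, not the study's actual models, data, or metrics.

```python
# A minimal sketch of per-group bias evaluation for a diagnostic model.
# All names here (preds, labels, groups) are hypothetical examples;
# the study's actual models and datasets are not reproduced.
from collections import defaultdict

def accuracy_by_group(predictions, labels, groups):
    """Compute diagnostic accuracy separately for each demographic group."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for pred, label, group in zip(predictions, labels, groups):
        total[group] += 1
        correct[group] += int(pred == label)
    return {g: correct[g] / total[g] for g in total}

# Toy example: three patients per group, with one error in group "B".
preds  = [1, 0, 1, 1, 1, 0]
labels = [1, 0, 1, 1, 0, 0]
groups = ["A", "A", "A", "B", "B", "B"]

per_group = accuracy_by_group(preds, labels, groups)
gap = max(per_group.values()) - min(per_group.values())
print(per_group)                   # {'A': 1.0, 'B': 0.666...}
print(f"accuracy gap: {gap:.2f}")  # 0.33: a disparity worth auditing
```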
Read at ScienceDaily