
"One example of a biased device is the pulse oximeter. This device measures blood oxygen by using light. You have probably had one clipped on your finger during a visit to your doctor. Or you might even own one. The bias in this device is that it is three times more likely to not reveal low oxygen levels in dark skinned patients than light skinned patients."
"In most cases, these can be addressed by improving the sensors or developing alternative devices. The problem is, to exaggerate a bit, is that most medical technology is made by white men for white men. This is not to claim such biased devices are all cases of intentional racism and misogyny. There is not, one assumes, a conspiracy against women and people of color in this area but there is a bias problem."
"Many medical devices use software, and it is often used in medical diagnosis. People are often inclined to think software is unbiased, perhaps because of science fiction tropes about objective and unfeeling machines. While it is true that our current software does not feel or think, bias can make its way into the code. For example, software used to analyze chest x-rays would work less well on women than men if the software was "trained" only on X-rays of men."
Medical devices can be biased in accuracy and effectiveness, producing worse outcomes for certain groups. Hardware can have sensor biases, as with pulse oximeters that miss low oxygen readings in darker-skinned patients about three times more often than in lighter-skinned patients. Many devices were developed without diverse populations in mind, contributing to unequal performance across sex and race without implying intentional malice. Software in diagnostic devices can inherit bias from nonrepresentative training data, causing poorer results for underrepresented groups. Practical remedies include improving sensor design, creating alternative devices, and using diverse training datasets to reduce algorithmic and sensor-driven disparities.
Read at A Philosopher's Blog