Over the past few months, the Center for Global Women’s Health Technologies (GWHT) has been hosting bi-weekly conversations surrounding diversity and inclusion in STEM, at Duke, and within our lab.
In preparation for our latest discussion, we read two articles. The first, “How a Popular Medical Device Encodes Racial Bias,” discusses inconsistencies in pulse oximeter readings across patients of different races.
“With COVID-19 death tolls already over 160,000 in the United States alone and rising daily, the pulse ox is a vital tool for survival. It should not work least accurately for those whose health is most in danger.”
The pulse oximeter (pulse ox) is used to read blood oxygen levels. (Image source: Quartz)

The second article we read, “Racial bias skews algorithms widely used to guide care from heart surgery to birth, study finds,” discusses how widely used healthcare algorithms commonly result in Black patients not receiving the same medical attention as their white peers.
“When patients come to an emergency department with pain in the back or side, doctors use an algorithm with a 13-point scale to assess whether the cause is kidney stones. A higher score means less likelihood of that. Being Black automatically adds three points to the score. An assessment of the algorithm by independent researchers found no scientific support for the assumption that Black people’s pain is less likely to indicate kidney stones.”
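The quoted scoring rule can be made concrete in a short sketch. Only the 13-point scale and the 3-point race adjustment come from the article; the function name, the symptom input, and everything else below are hypothetical, included purely to show how a hard-coded race term produces different assessments for patients with identical symptoms.

```python
def kidney_stone_score(symptom_points: int, is_black: bool) -> int:
    """Hypothetical illustration of the 13-point scale described in the
    article, where a higher score is read as a lower likelihood of
    kidney stones."""
    score = symptom_points
    if is_black:
        score += 3  # the automatic race adjustment the article criticizes
    return min(score, 13)

# Two patients presenting with identical symptoms:
score_a = kidney_stone_score(6, is_black=False)  # 6
score_b = kidney_stone_score(6, is_black=True)   # 9 -> read as "less likely"
```

The point of the sketch is that the bias is not hidden in the data or in a model's training: it is an explicit term in the formula, applied to every Black patient regardless of presentation, which is exactly what the independent assessment found no scientific support for.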
These articles inspired us to ask ourselves some tough questions about our own technologies. We have included some quotes from our discussion surrounding these articles. It is our hope that these questions can be helpful to you and your lab as well.
Where do you think racial biases in healthcare algorithms/devices come from? “I think the medical field and scientists feel like they already know a lot, and that can make them less receptive to these facts. The more we learn, the more we need to be sure that we are exercising humility. Also, I think that if we don’t put the money and time into even what may seem like the “little things” with racial disparities, how will we ever tackle the larger issues?”
Can you think of anywhere in our lab’s research (i.e., devices, algorithms, etc.) where there might be opportunity for racial bias to exist? “The Callascope’s functionality is sensitive to body types and sizes. Its psychological effects are sensitive to gender expression. The cervix algorithm’s images are collected from specific populations and interpreted by doctors who may be used to viewing cervixes from only certain populations. I think we should continue to analyze the results with this in mind in order to catch biases and also continue to do our best in diversifying data sources.”
How do you think we could begin going about avoiding these biases in our work? “I think the starting place is having these conversations. Being aware of the mistakes others have made is the first step in not repeating them. In the lab, we talk a lot about human centered design, and I think that it’s natural to incorporate racial equity into that type of thinking.
“We begin by asking ourselves if the data we collect is racially representative, if the questions we are asking are culturally sensitive, and if our designs make assumptions that might be harmful to a group of people (e.g., the pulse ox). After that, we focus on interpreting and communicating that data in a way that is racially equitable. One suggestion to keep racial bias out of our data is consulting with race experts in the humanities.”
Our previous discussions mostly addressed racism in STEM in a broad context. This week, we focused the conversation on our own research and device/algorithm development, and the steps we can take to avoid furthering racial disparities in our work. We want to hear about how your labs are addressing racial biases in technology. Leave a comment or email us at email@example.com.