Study reveals why AI models that analyze medical images can be biased

AI models play a growing role in analyzing medical images such as X-rays for diagnosis. However, these models often perform worse for certain demographic groups, including women and people of color. MIT researchers have shown that the models most accurate at predicting a patient's demographics from an image also exhibit the largest fairness gaps, suggesting they rely on demographic "shortcuts" that lead to incorrect diagnoses for underrepresented groups. While debiasing techniques can narrow these gaps on the data sets the models were trained on, the improvements often fail to transfer to new data sets from different hospitals.
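The "fairness gap" referenced above is, in general terms, the difference in a performance metric across demographic groups. As an illustration only (the study's exact metric may differ), here is a minimal sketch that measures the gap as the spread in per-group accuracy, with all data and group labels hypothetical:

```python
import numpy as np

def fairness_gap(y_true, y_pred, group):
    """Gap between the best- and worst-served groups,
    measured as max minus min per-group accuracy."""
    accuracies = []
    for g in np.unique(group):
        mask = group == g
        accuracies.append((y_pred[mask] == y_true[mask]).mean())
    return max(accuracies) - min(accuracies)

# Toy example: group A gets 3/4 correct, group B only 2/4.
y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])
y_pred = np.array([1, 0, 1, 0, 0, 0, 1, 0])
group  = np.array(["A", "A", "A", "A", "B", "B", "B", "B"])
print(fairness_gap(y_true, y_pred, group))  # 0.25
```

A model can score well on overall accuracy while this gap stays large, which is why evaluating per-group metrics on held-out data from other hospitals matters.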

Visit Original Article →