Linked: Of course technology perpetuates racism. It was designed that way.
Charlton makes some interesting observations and draws on history to make this point, one we all need to keep in mind when we think about how we might be relying on AI to help us solve societal issues:
We often call on technology to help solve problems. But when society defines, frames, and represents people of color as “the problem,” those solutions often do more harm than good. We’ve designed facial recognition technologies that target criminal suspects on the basis of skin color. We’ve trained automated risk profiling systems that disproportionately identify Latinx people as illegal immigrants. We’ve devised credit scoring algorithms that disproportionately identify black people as risks and prevent them from buying homes, getting loans, or finding jobs.
It’s hard to think about AI being racist, or even the people building the tools as being racist. They don’t need to be. There’s enough bias already out there that it finds its way into the algorithms very easily. Think again about credit, like housing loans for example. The AI that helps predict who will pay back a loan versus who will default is going to look at a lot of information: things like the neighborhood, family history of home ownership, criminal records, etc. Well, guess which neighborhoods are going to have more poor people living in them, and thus people with less money to pay off their houses? Yup, neighborhoods with a lot of minorities, because historically, they’ve been forced into those neighborhoods. Does that mean that any individual family isn’t a good loan candidate? Of course not, but the algorithms say what the algorithms say.
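To make that concrete, here’s a toy sketch (all names and numbers are made up, and real credit models are far more complex) of how a “neutral” model trained on historical data ends up scoring two financially identical applicants differently, just because the data encodes which neighborhood was historically redlined:

```python
# Hypothetical historical loan outcomes, recorded per neighborhood.
# Neighborhood "A" was historically redlined, so its residents had fewer
# resources and a higher recorded default rate. The data captures the
# discrimination, not any individual family's creditworthiness.
historical = {
    "A": {"defaults": 30, "loans": 100},  # historically redlined
    "B": {"defaults": 5,  "loans": 100},
}

def neighborhood_risk(neighborhood):
    """Naive 'model': risk score = the neighborhood's historical default rate."""
    record = historical[neighborhood]
    return record["defaults"] / record["loans"]

# Two applicants with identical finances, differing only in address:
applicant_a = {"income": 60000, "neighborhood": "A"}
applicant_b = {"income": 60000, "neighborhood": "B"}

print(neighborhood_risk(applicant_a["neighborhood"]))  # 0.3
print(neighborhood_risk(applicant_b["neighborhood"]))  # 0.05
```

No one wrote a racist rule here; the model just faithfully reproduces the history it was fed, and the applicant from neighborhood A looks six times riskier for reasons that have nothing to do with their own finances.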
The question is: can we develop a predictive model around data that doesn’t already include all the ways in which we’ve discriminated against certain groups? If we aren’t careful, we’ll develop tools that take the individual racism out of our decisions and just literally bake it into the cake. We don’t want that. No AI would be better than that.