Nkonde said these language programs — through the data they consume — are reflecting society as it has been, with all its racism, biases and stereotypes.
“The way many of these systems are developed is they’re only looking at pre-existing data. They’re not looking at who we want to be … our best selves,” she said.
But can we program AI to be something we aren't currently? The only way to do that would be to decide what to feed it and what to withhold — a choice that, by its very nature, carries the biases of the people making it.
AI is not without bias, and perhaps the best we can do is know that going in, rather than assume technology will solve the problem on its own.