BKC Responsible AI Fellow Rumman Chowdhury comments on how AI systems in banking are biased against marginalized communities.
“There would be a giant map on the wall of all the districts in Chicago, and they would draw red lines through all of the districts that were primarily African American, and not give them loans…Fast forward a few decades later, and you are developing algorithms to determine the riskiness of different districts and individuals. And while you may not include the data point of someone’s race, it is implicitly picked up.”
Read more in CNBC.
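
Chowdhury's point that race is "implicitly picked up" is the classic proxy-variable problem: a model trained on historically biased lending outcomes can reproduce racial disparities through correlated features such as district or ZIP code, even when race itself is never a feature. Below is a minimal illustrative sketch using synthetic data; all names and numbers are hypothetical and do not come from the CNBC piece or any real banking system.

```python
# Illustrative sketch of proxy discrimination with synthetic data.
# Assumption: "district" is strongly correlated with race (a legacy of
# redlining-era segregation), and historical approvals were biased by district.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

race = rng.integers(0, 2, n)  # protected attribute, never shown to the model
# District mirrors race 90% of the time, acting as a proxy.
district = np.where(rng.random(n) < 0.9, race, rng.integers(0, 2, n))
income = rng.normal(55, 15, n)  # same income distribution for both groups

# Historical decisions penalized district 1 regardless of income.
approved = ((income - 20 * district + rng.normal(0, 5, n)) > 40).astype(int)

# Train only on "race-blind" features: district and income.
X = np.column_stack([district, income])
model = LogisticRegression().fit(X, approved)
scores = model.predict_proba(X)[:, 1]

# The model never saw race, yet its approval scores differ sharply by race,
# because district carries the racial signal.
print("mean approval score, race=0:", round(scores[race == 0].mean(), 3))
print("mean approval score, race=1:", round(scores[race == 1].mean(), 3))
```

Running the sketch shows the two groups receiving very different average approval scores even though race was excluded from the feature set, which is the dynamic Chowdhury describes.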