
Black college students lead movement to eliminate bias in tech

From self-driving cars that can’t detect folks with darker skin well enough to avoid hitting them, to digital assistants like Siri that struggle to understand non-White accents, technology is biased, and it is hurting Black folks.

“A lot of people will look toward technology as the end-all, be-all solution to a lot of social issues, but often social issues are not solved by technology, and technology often exacerbates these social issues,” said Cierra Robson, associate director of the Ida B. Wells Just Data Lab, which brings students, educators, and activists together to develop creative approaches to data conception, production, and circulation.

Founded in 2018 and led by Ruha Benjamin, a sociologist and professor in the Department of African American Studies at Princeton University, the lab focuses on finding ways to “rethink and retool the relationship between stories and statistics, power and technology, data and justice.”

“Civics of technology derives from a lot of related concepts, but it’s about how we can use technology to further civic engagement, the democratic process, and social justice — especially anything that will galvanize a group of individuals to create social good,” Robson explained.

In her role at the lab, Robson works closely with Princeton students on a variety of projects that examine how bias in technology feeds bias in all areas of our lives, from healthcare to labor to education.

Robson first became passionate about finding solutions to biased technology after learning about how the issue leads to violent over-policing.

“When I was an undergrad at Princeton, I had access to this entire wealth of resources that was kind of stuck in the university,” Robson said. “One of the biggest things that I wanted to do when the lab started in the summer of 2020 was figure out a way to get those resources from Princeton into the community, to people who needed them.”

And people do need this information, desperately, because biased technology is killing Black and Brown folks and contributing to higher rates of incarceration and injustice.

“Predictive policing technologies — there’s a whole bunch of them — but one of the ones I focus on a lot predicts where crime is likely to happen in a given city, and that prompts police to be deployed in those areas so that they can catch whatever crime might happen there,” Robson said. “Those predictions are based on an algorithm that uses data on historic police interactions, but no one really stops to think that those historic police interactions are colored by all sorts of discriminatory processes.”

