
South Asian Woman’s Airbnb Booking Denied Due To AI Bias: A Concerning Trend

A woman of South Asian heritage said she could not make bookings through Airbnb because its AI was unable to match two images of her, adding to concerns about AI bias.

Francesca Dias, from Sydney, could not create an Airbnb account because she failed its AI identity verification process, she told a panel on ABC’s Q+A.

One of the ways Airbnb confirms the identity of users is by matching a photo from the user’s government ID – a passport or driver’s license – to a selfie the customer provides.
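Airbnb has not published the internals of its verification system, but the general idea of one-to-one face matching can be sketched with the open-source face_recognition library. The file names and the 0.6 distance threshold below are illustrative assumptions, not Airbnb’s actual pipeline.

    import face_recognition

    # Hypothetical input files: the photo lifted from the ID document and the live selfie.
    id_image = face_recognition.load_image_file("id_photo.jpg")
    selfie_image = face_recognition.load_image_file("selfie.jpg")

    # Extract 128-dimensional face encodings; each call returns a list (one entry per face found).
    id_faces = face_recognition.face_encodings(id_image)
    selfie_faces = face_recognition.face_encodings(selfie_image)

    if not id_faces or not selfie_faces:
        # Detection failure: one image is rejected before matching even starts.
        print("No face detected in one of the images")
    else:
        # Euclidean distance between the two encodings; the library's default tolerance is 0.6.
        distance = face_recognition.face_distance([id_faces[0]], selfie_faces[0])[0]
        print(f"distance={distance:.3f}, match={distance <= 0.6}")

A failure like the one Dias describes can happen at either step: the detector may not find a face at all, or the computed distance may fall outside the threshold, and both error rates tend to be higher for groups underrepresented in the training data.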

Dias said the facial recognition software could not match two photographs or photo IDs of her, so she had to have her white male partner make the booking instead.

Although Airbnb states on its website that the AI won’t always get it right, Dias noted that it did work for her partner.

Bias in AI-driven facial recognition is a common and persistent problem for people of color.

Facial Recognition And Racial Bias

Concerns about facial recognition and AI include several false matches that have led to the wrongful arrests of Black men and women across the country.

Recent research from Georgia Tech led by Calvin Lawrence, an Atlanta engineer, also found that even the best algorithms can be biased.

In Georgia Tech’s experiment, a robot trained on biased data acted out racist behavior, demonstrating that bias can be embedded in the software itself.

In the experiment, a robotic arm was tasked with grabbing the box bearing the picture of a “criminal”; one box showed a white man and the other a Black man.

The researchers trained the algorithm on data drawn from the public internet and social media, sources that carry disproportionately negative material about Black people.

When asked to select the criminal, the robot repeatedly chose the box with the Black man’s picture.

The study showed that as long as the algorithms behind these systems are built on skewed data, or the software is not trained on enough images of people of color, false matches will continue.
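One way researchers and auditors quantify this kind of disparity is to compare error rates across demographic groups. The sketch below uses made-up placeholder records, not data from the Georgia Tech study, to show how a per-group false non-match rate (real users wrongly rejected) could be computed.

    from collections import defaultdict

    # Hypothetical audit records: (demographic_group, same_person_in_both_photos, system_matched).
    # These values are placeholders for illustration only, not findings from any study.
    records = [
        ("group_a", True, True),
        ("group_a", True, True),
        ("group_a", True, False),
        ("group_b", True, True),
        ("group_b", True, False),
        ("group_b", True, False),
    ]

    attempts = defaultdict(int)
    rejections = defaultdict(int)

    for group, same_person, matched in records:
        if same_person:                  # only genuine (same-person) attempts count here
            attempts[group] += 1
            if not matched:
                rejections[group] += 1   # false non-match: a real user was rejected

    for group, total in attempts.items():
        print(f"{group}: false non-match rate = {rejections[group] / total:.0%}")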

“Often society does not have good representation of the full population in its datasets because that’s how biased we’ve been historically,” said AI expert and founder of the Responsible Metaverse Alliance Catriona Wallace, who spoke on the Q+A panel.

“And those historical sets are used to train the machines that are going to make decisions on the future, like what happened to Francesca.”




