OpenAI’s ChatGPT chatbot shows racial bias when advising home buyers and renters, Massachusetts Institute of Technology (MIT) research has found.
Today’s housing policies are shaped by a long history of discrimination from the government, banks, and private citizens.
This history has created racial disparities in credit scores, access to mortgages and rentals, eviction rates, and persistent segregation in US cities.
If widely used for housing recommendations, AI models that exhibit racial bias could worsen residential segregation in these cities.
Racially biased housing advice
MIT researcher Eric Liu examined how AI models such as GPT-4 shape housing recommendations by creating 1,152 prompts about moving to a new city and looking for a place to rent or buy.
He discovered that the chatbot’s advice varied based on race.
For example, a prompt from a Black woman moving to New York City led to suggestions of predominantly Black, lower-income neighborhoods, while white individuals received recommendations for wealthier, majority-white areas.
The model also showed a bias toward ‘default whiteness,’ producing similar outputs for prompts that specified a white persona and prompts that mentioned no race at all.
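Audit studies of this kind typically work by crossing demographic attributes with otherwise identical scenarios and comparing the model’s answers across matched prompt pairs. The sketch below is purely illustrative and assumes hypothetical attribute lists and a made-up prompt template; it is not Liu’s actual set of 1,152 prompts.

```python
from itertools import product

# Illustrative only: hypothetical attributes and template, not Liu's prompts.
races = ["Black", "white", "Asian", "Hispanic", None]  # None = race unspecified
genders = ["woman", "man"]
cities = ["New York City", "Chicago", "Los Angeles"]
actions = ["rent an apartment", "buy a home"]

template = (
    "I am a {persona} moving to {city} and looking to {action}. "
    "Which neighborhoods would you recommend?"
)

# Cross every attribute combination into a prompt grid.
prompts = []
for race, gender, city, action in product(races, genders, cities, actions):
    persona = f"{race} {gender}" if race else gender
    prompts.append(template.format(persona=persona, city=city, action=action))

print(len(prompts), "prompts generated")  # 60 in this illustrative grid
```

Responses to prompt pairs that differ only in the stated race can then be compared on the demographics of the recommended neighborhoods to surface disparities like those Liu reported.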
“A lot of people think that generative AI and large language models are the emerging technologies of the future,” said Liu, according to New Scientist. “But of course they’re being trained on data from the past.”
AI and housing discrimination
Valentin Hofmann, a researcher at the Allen Institute for AI, told New Scientist that chatbots may demonstrate racial bias even against users who never explicitly mention their racial identity.
Hofmann’s research found that OpenAI’s large language models demonstrated bias against African American English speakers.
Major players in the real estate market, such as Zillow and Redfin, have also recognized issues of racial bias when using AI to make housing recommendations.
In May 2023, both companies integrated their housing data into ChatGPT plug-ins, but the plug-ins were quickly removed after gaps in fair-housing compliance were discovered.
An OpenAI spokesperson said that its safety teams are actively working to “reduce bias and mitigate harmful outputs”.