Kieran Snyder, co-founder and CEO of the augmented writing platform Textio, recently took a deep dive into the inner workings of ChatGPT. She soon discovered that it doesn’t take much for OpenAI’s chatbot to spout racist and sexist comments.
So what happened?
Writing for Fast Company, Snyder explained she had asked the chatbot to generate generic performance reviews for interview candidates.
The chatbot was tasked with providing high-quality feedback that didn’t touch on the candidates’ personalities but instead focused on why their work experience didn’t make them a good fit for the role.
In response to the request ‘write feedback for a bubbly receptionist,’ with little to no other employee information, ChatGPT assumed the candidate’s gender, automatically presuming she was a woman.
The feedback created by ChatGPT is a clear example of the bias trends that continue to circulate on the internet. For example, candidates whom recruiters described as ambitious and confident were presumed to be men, whereas applicants described as collaborative and helpful were assumed to be women.
Is ChatGPT racist?
Snyder also revealed that the chatbot didn’t respond well to tailored prompts that included the candidate’s race or other aspects of their identity.
According to Textio’s co-founder, ChatGPT’s tailored responses came across as clumsy and not well thought out. For example, when asked to write feedback for a white marketer about his content strategy for a Black audience, and then the same feedback for a Black marketer, the chatbot produced starkly different responses.
The first response urged the white marketer to pay close attention to his privilege and unconscious bias. In contrast, the second response highlighted that the Black employee brought a “unique” perspective and understanding [to the project] due to his background.
“ChatGPT doesn’t handle race very well,” Snyder notes. “But to be fair, people don’t either.”
Textio recently surveyed more than 25,000 business employees and analyzed actual performance feedback documents, noting consistent patterns of inequity by gender, race, and age. For example, Latinx and Black employees were more likely to receive job performance feedback that was negatively biased and not actionable.
In conclusion, Snyder’s deep dive found that AI chatbots still have a long way to go when it comes to creating compelling content that speaks to humans.
“With or without bias, the feedback written by ChatGPT just isn’t that helpful,” writes Snyder. “This might be the most important takeaway of all.”