Google’s latest AI tool, Gemini, which was designed to generate diverse images, has been temporarily paused after sparking controversy over its depictions of historical figures.
AI Tool Gemini
Gemini was designed to generate realistic images from users’ text descriptions, much like OpenAI’s ChatGPT.
Like other models, it is trained not to respond to dangerous or hateful prompts and to introduce diversity into its outputs.
However, users reported that Gemini, in its attempt to avoid racial bias, often generated images with historically inaccurate representations of gender and ethnicity.
For example, prompts for America’s Founding Fathers produced images that included women and people of color, drawing criticism that the model was over-correcting to avoid appearing racist. The image generator also depicted the Pope as a person of color, and a query for “an American woman” predominantly returned AI-generated women of color.
Users’ Reactions To Gemini
Google’s attempt to showcase diversity with Gemini was met with mixed reactions.
Some criticized it for being overly “woke,” with some users alleging that the AI struggled to acknowledge the existence of white people.
Google has acknowledged these concerns and confirmed that it is pausing the image generation feature.
“Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here,” said Jack Krawczyk, senior director for Gemini Experiences. “We’re working to improve these kinds of depictions immediately.”
The Wider AI Issues
This issue isn’t new to the realm of AI, as Google previously faced backlash for its photo app mistakenly labeling a photo of a Black couple as “gorillas.”
The PGA Tour also faced backlash after players of color appeared against “rugged,” dilapidated backdrops while white players appeared against neutral gray backdrops.
These incidents highlight the ongoing challenges in AI development regarding diversity and representation.
The criticism of Gemini reflects broader issues in generative AI, where models often amplify stereotypes present in their large training datasets.