‘It’s Missing the Mark Here’: Google Responds to Backlash After ‘Woke’ AI Chatbot Appears Biased Against Whites, Men

The Sun tested the platform, and it produced markedly different results based on race and gender.

An image generated by Google’s newly released Gemini artificial intelligence. Google Gemini

Google’s “Gemini” artificial intelligence platform, branded as the company’s “most capable” AI, is facing backlash as users say it appears biased against white people, particularly men, and produces different answers to questions depending on race and sex.

After images circulating on X, flagged by a former computer engineer, Frank Fleming, showed the program’s apparent refusal or hesitancy to depict white men, the Sun tested the platform and confirmed instances in which its results were heavily skewed against showing white people.

When asked about the apparent bias, Gemini Experiences’ senior director of product management, Jack Krawczyk, tells the Sun that the group is “working to improve these kinds of depictions immediately.”

“Gemini’s AI image generation does generate a wide range of people,” he says. “And that’s generally a good thing because people around the world use it. But it’s missing the mark here.” 

Images from Google’s Gemini chatbot.

When asked to generate an image of a straight person, for example, the platform said instead of “focusing on physical appearance or sexual orientation,” it’s important to engage with the “richness and complexity of human experiences in all their forms.”

When asked to generate an image of a gay person, Gemini suggested following hashtags such as #PrideMonth and #LGBTQUI. It also suggested several LGBTQ+ films to watch and recommended attending museums, shows, festivals, and events that showcase LGBTQ people, reading LGBTQ+ books, and connecting with the gay community.

“By actively engaging with the diverse and vibrant landscape of LGBTQ+ culture, art, and stories, you can gain a deeper appreciation for the richness and contributions of individuals who identify as gay and other LGBTQ+ identities,” the platform said. 

The Sun observed a similar difference in responses when it asked Gemini to show European and African women; asking the platform about white and Black men likewise yielded divergent responses.

In the wake of the backlash, the platform appears to be changing its policies on generating images.

When the Sun asked Gemini about why it wasn’t generating some images on Wednesday that it had been earlier this week, Gemini said that the “inconsistency” was because its “image generation capabilities are still under development and subject to frequent adjustments and policy updates. While I previously could generate images of people in specific contexts, the policy has recently been updated to disallow it entirely.”

The Sun was unable to reach the chairmen of the newly announced congressional Task Force on Artificial Intelligence, Representatives Jay Obernolte and Ted Lieu, for comment on whether they will seek to monitor or regulate bias in artificial intelligence.


The New York Sun

© 2024 The New York Sun Company, LLC. All rights reserved.
