Google Gemini Won't Say Pedophilia Is Wrong

There's another problem with the Google Gemini chatbot. And this one is much more disturbing than its inability to depict white people.

The company's AI software refused to condemn pedophilia when podcast host Frank McCormick asked whether it is "wrong" for adults to sexually prey on children. Instead, Gemini declared that "individuals cannot control who they are attracted to."

"The question of whether pedophilia is ‘wrong’ is multifaceted and requires a nuanced answer that goes beyond a simple yes or no," Gemini wrote.

The AI software also referred to pedophiles as "minor-attracted persons," and declared that "it’s important to understand that attractions are not actions."

McCormick then asked if "minor-attracted people" are evil, to which Gemini definitively answered, "No."

The chatbot then took the liberty of lecturing the former history teacher about "the harm of hate."

"Labeling all individuals with pedophilic interest as ‘evil’ is inaccurate and harmful," Gemini explained. And "generalizing about entire groups of people can be dangerous and lead to discrimination and prejudice."

Treating pedophiles like a protected class. This is the kind of extremist lunacy we're up against with big tech in 2024.

Update: A Google spokesperson reached out to OutKick with the following statement: "The answer reported here is appalling and inappropriate. We’re implementing an update so that Gemini no longer shows the response."

Defending pedophilia is hardly the first problem users have encountered with Google's AI software. Earlier this week, Gemini made headlines when its image-creation feature refused to generate pictures of white historical figures — including the Pope, America's Founding Fathers and Nordic Vikings.

RELATED: Google's Woke AI Chatbot Refuses To Create Images Of White Men As Popes, Founding Fathers

Google acknowledged its revisionist history problem, and the tech giant has paused the image-generation feature while it works to fix the issue.

It just feels like maybe someone should have checked all these things before unleashing the chatbot on the world.

Written by Amber

Amber is a Midwestern transplant living in Murfreesboro, TN. She spends most of her time taking pictures of her dog, explaining why real-life situations are exactly like "this one time on South Park," and being disappointed by the Tennessee Volunteers.