
Google’s AI Overview confidently presents fake ‘Kyloren syndrome’ as real medical condition

Summary

A neuroscience writer caught Google’s AI search presenting a completely fabricated medical condition as scientific fact. The incident raises questions about AI systems spreading misinformation while sounding authoritative.

The writer, who publishes under the pseudonym “Neuroskeptic,” discovered Google’s AI describing “Kyloren syndrome” as a real medical condition. The twist? Neuroskeptic had invented this fake syndrome seven years ago as a joke to expose flaws in scientific publishing.

The AI didn’t just mention the condition – it provided detailed medical information, describing how this non-existent syndrome passes from mothers to children through mitochondrial DNA mutations. All of this information was completely made up.

Screenshot of a tweet and Google search result for the fictitious “Kyloren Syndrome” disease, presented as an AI-generated fact.
A fictional disease invented as a joke is considered a real medical condition by Google’s AI Overview. | Image: Neuroskeptic via Bluesky

“I’d honestly have thought twice about doing the hoax if I’d known I might be contaminating AI databases, but this was 2017. I thought it would just be a fun way to highlight a problem,” Neuroskeptic said.

AI Overview cites sources it hasn’t read

While “Kyloren syndrome” is an unusual search term, this case reveals a concerning pattern with AI search tools: they often present incorrect information with complete confidence. A regular Google search immediately shows that the paper was satirical, yet the AI missed this obvious red flag, discarding context that any human reader would have caught at a glance.

Academic paper title page showing template format with placeholder authors and Mountainview University affiliation.
Image: Screenshot THE DECODER

But Google’s AI model Gemini, which creates these search overviews, completely missed this crucial context – despite citing the very paper that would have exposed the joke. The AI referenced the source without actually understanding what it contained.

Google search result showing PDF preview of NMJS paper mentioning Kyloren syndrome alongside real medical conditions.
Image: Screenshot THE DECODER
Dark mode screenshot of social media comment section discussing mitochondrial DNA with like/dislike buttons visible. The NMJS paper is visible as the source on the right side of the screen.
Image: Screenshot THE DECODER

In fairness, not all AI search tools fell for the fake condition. Perplexity avoided citing the bogus paper entirely, though it did veer off into a discussion about Star Wars character Kylo Ren’s potential psychological issues.

ChatGPT’s search proved more discerning, noting that “Kyloren syndrome” appears “in a satirical context within a parody article titled ‘Mitochondria: Structure, Function and Clinical Relevance.'”

AI search companies stay quiet about error rates

The incident adds to broader concerns about AI search services making things up while sounding authoritative. When asked about concrete error rates in their AI search results, Google, Perplexity, OpenAI, and Microsoft have all stayed silent. None of them would even confirm whether they systematically track these errors, even though doing so would help users understand the technology’s limitations.

This lack of transparency creates a real problem. Users won’t spend time fact-checking every AI response – that would defeat the purpose. If people have to double-check everything, they might as well use regular search, which is often more reliable and faster. But this reality doesn’t square with some people’s claims that AI-powered search is the future of the Web.

The incident also raises questions about who is responsible when AI systems spread misinformation that could harm people, as happened recently when Microsoft Copilot made false claims about court reporter Martin Bernklau. So far, the companies running these AI systems haven’t addressed these concerns.
