A neuroscientist who writes under the pseudonym "Neuroskeptic" discovered that Google's AI search was presenting a completely fabricated medical condition, "Kyloren Syndrome," as scientific fact. The finding immediately raised public concern about how authoritatively AI systems can present misinformation.
Seven years earlier, Neuroskeptic had invented the fictitious "Kyloren Syndrome" as a joke, planting it in a spoof paper to expose flaws in scientific publishing. Remarkably, Google's AI not only referenced the non-existent condition but supplied detailed medical specifics, explaining how the imaginary syndrome could be transmitted from mother to child through mutations in mitochondrial DNA. All of it was fabricated.
Neuroskeptic commented, "To be honest, had I known this would contaminate AI databases, I might have reconsidered pulling the prank. But that was back in 2017, and it seemed like an amusing way to highlight the problem."
AI Summaries Cited Sources Without Genuine Comprehension
Although "Kyloren Syndrome" is hardly a common search term, the incident exposes a troubling pattern in AI search tools: they frequently present inaccurate information with unwavering confidence. A standard Google search instantly reveals the article's satirical nature, yet the AI missed this obvious warning sign, underscoring how much AI comprehension depends on context.
Notably, Gemini, the Google AI model that generates these search summaries, ignored this essential context entirely, even while citing the very article that would have exposed the prank. The AI cited its sources without genuinely understanding them.
Not every AI search tool was fooled by the fake condition, however. Perplexity avoided the spoof article altogether, opting instead to discuss the possible psychological problems of the "Star Wars" character Kylo Ren. ChatGPT showed more discernment, noting that "Kyloren Syndrome" appeared in a satirical paper titled "Mitochondria: Structure, Function and Clinical Relevance."
AI Search Companies Remain Silent on Error Rates
The Google incident has heightened public concern that AI search services can spread misinformation. When asked about the error rates of their AI search results, Google, Perplexity, OpenAI, and Microsoft all declined to comment. None would even confirm whether they systematically track such errors, though doing so could help users better understand the technology's limitations.