An SEO crafting a newsletter with AI spotted a hallucination about a March 2026 Google Core Update and decided to publish it as an experiment to see how misinformation spreads. While search marketing industry publications ignored the fake news, some independent SEOs picked it up and ran with it without first checking its factual accuracy.

Mistake Leads To A Double Take

The person behind the experiment, Jon Goodey (LinkedIn profile), published a LinkedIn article that purposely contained an AI hallucination about a non-existent March 2026 Google Core Update. He explained, in a subsequent LinkedIn post, that his AI workflow includes human quality control to catch AI mistakes, and when he spotted this one he decided to publish it anyway to see if anyone would dispute or challenge the false information.

Google Ranks Misinformation

Goodey explained that Google itself fueled the misinformation about the fake core algorithm update: his LinkedIn newsletter ranked for the phrase "Google March Update 2026." The fake news ranked in Google's classic search results and in AI Overviews.

He explained:

“My LinkedIn article began ranking on the first page of Google for “Google March update 2026.” Not buried on page three. Right there, visible to anyone searching for information about recent Google algorithm changes.

…Google’s own AI Overview feature picked up the fabricated information and presented it as fact.”

Google’s fact-checking in the search results is basically non-existent, so it’s not surprising that Google’s search engine would rank the fake information, especially for anything related to SEO. Using Google for SEO queries is like playing a slot machine: you have no idea whether the information will be accurate or a total fabrication.

Searching for information about a dubious black hat tactic (like Google stacking) may cause Google to actually validate it, potentially misleading an honest business person who wouldn’t know better.

Screenshot Of Google Recommending A Black Hat SEO Tactic

This is a longstanding blind spot in Google’s search results, which is why it’s not surprising to see Google spew out misinformation about a fake Google update.

Websites Echo Misinformation

The result is that SEO websites began repeating the false update information; Google core updates are, of course, a traffic magnet and a way some SEOs attract potential clients. There’s a long history in the SEO community of stirring up noise about non-existent updates, so again, it’s not surprising to see SEO agencies pick up this ball and run with it.

Goodey shared:

“Multiple websites published detailed, authoritative-sounding articles about the “March 2026 Core Update,” treating it as confirmed fact. These weren’t throwaway blog posts. They were detailed pieces with specific claims about Gemini 4.0 Semantic Filters, Information Gain metrics, and recovery strategies.”

Most News Sites Ignored The Fake Update

SEJ and our competitors ignored the fake March update news, but a technology site apparently did not, and Goodey called it out.

He wrote:

“Another site, TechBytes, went even further with a piece by Dillip Chowdary headlined “Google March 2026 Core Update: Cracking Down on ‘Agentic Slop’.” (Oh, the irony…).

This article invented specific technical details including claims about a “Gemini 4.0 Semantic Filter,” a “Zero Information Gain” classification system, and a “Discover 2.0 Engine” prioritising long-form technical narratives.”

Google Has A Policy About Fact Checking

I recall Google’s Danny Sullivan saying that Google doesn’t do fact checking, but I couldn’t find his tweet or statement. There is, however, a news report published by Axios in which a Google spokesperson affirms that Google will not abide by an EU law that requires fact checking.

According to the news article:

“In a letter written to Renate Nikolay, the deputy director general under the content and technology arm at the European Commission, Google’s global affairs president Kent Walker said the fact-checking integration required by the Commission’s new Disinformation Code of Practice “simply isn’t appropriate or effective for our services” and said Google won’t commit to it.

The code would require Google to incorporate fact-check results alongside Google’s search results and YouTube videos. It would also force Google to build fact-checking into its ranking systems and algorithms.

Walker said Google’s current approach to content moderation works and pointed to successful content moderation during last year’s “unprecedented cycle of global elections” as proof.
He said a new feature added to YouTube last year that enables some users to add contextual notes to videos “has significant potential.” (That program is similar to X’s Community Notes feature, as well as a new program announced by Meta last week.)”

Takeaways

Jon Goodey had multiple takeaways, the most important being that people should fact-check what they read online.

Other takeaways are:

  • AI workflows should have validations built into them.
  • Most readers don’t fact check (only a few commenters disputed the false claims).
  • AI overviews and search amplify misinformation.
  • A single article can echo across the internet, with other sites repeating and embellishing the original false information.

Featured Image by Shutterstock/Rawpixel.com


