Google responded to a small publisher whose article offered a step-by-step walkthrough of how big corporate publishers are manipulating Google's Reviews System algorithm and getting away with it, demonstrating what appears to be a bias toward big brands that negatively impacts small independent publishers.
HouseFresh Google Algorithm Exposé
The story begins with a post titled How Google is killing independent sites like ours, published on the HouseFresh website. It presented what HouseFresh asserted was evidence that several corporate review sites gamed Google's algorithm by creating the perception of hands-on reviews for products that, HouseFresh maintains, were never actually reviewed.
For example, it noted that many of the publishers recommended an expensive air purifier that HouseFresh (and Consumer Reports) reviewed and found to perform worse than less expensive alternatives, use more energy, and require spending $199.98 per year on replacement filters. Yet the big brand sites gave the product glowing reviews, presumably because the higher price results in higher affiliate earnings.
Remarkably, HouseFresh showed that the product photos used by different big brand publishers were sourced from the same photographer in what appears to be the exact same location, strongly implying that the individual publishers did not each review the product themselves.
HouseFresh offered a detailed takedown of what it insists are instances of Google showing preference to fake reviews.
This is a partial list of sites that HouseFresh alleges successfully ranked low quality reviews:
- Better Homes & Gardens
- Real Simple
- Dotdash Meredith
- BuzzFeed
- Reddit (via a spam link dropped by a user with a suspended account)
- Popular Science
HouseFresh published a lucid and rational account of how Google's Reviews System algorithms allegedly give big brands a pass while small independent websites publishing honest reviews steadily lose traffic with each successive wave of Google's algorithm updates.
Google Responds
Google’s SearchLiaison offered a response on X (formerly Twitter) that took the accusations seriously.
Notable in the response are the following facts:
- Google does not do manual checks on claims made on webpages (except as part of a reconsideration request after a manual action).
- Google's algorithms do not use phrases designed to imply a hands-on review as a ranking signal.
SearchLiaison tweeted:
“Thank you. I appreciated the thoughtfulness of the post, and the concerns and the detail in it.
I’ve passed it along to our Search team along with my thoughts that I’d like to see us do more to ensure we’re showing a better diversity of results that does include both small and large publications.
One note to an otherwise excellent write-up. The article suggests we do some type of “manual check” on claims made by pages. We do not. That reference and link is about manual reviews we do if a page has a manual *spam* action against it, and files a reconsideration request. That’s entirely different from how our automated ranking systems look to reward content.
Somewhat related, just making a claim and talking about a “rigorous testing process” and following an “E-E-A-T checklist” doesn’t guarantee a top ranking or somehow automatically cause a page to do better.
We talk about E-E-A-T because it’s a concept that aligns with how we try to rank good content. But our automated systems don’t look at a page and see a claim like “I tested this!” and think it’s better just because of that. Rather, the things we talk about with E-E-A-T are related to what people find useful in content. Doing things generally for people is what our automated systems seek to reward, using different signals.
More here: developers.google.com/search/docs/fundamentals/creating-helpful-content#eat
Thank you again for the post. I hope we’ll be doing better in the future for these types of issues.”
Does Google Show Preference To Big Brands?
I’ve been working hands-on in SEO for 25 years, and there was a time in the early 2000s when Google showed bias toward big corporate brands based on the amount of PageRank a webpage contained. Google subsequently reduced the influence of PageRank scores, which in turn reduced the number of irrelevant big brand sites cluttering the search engine results pages (SERPs).
That wasn’t an instance of Google preferring big brands as trustworthy. It was an instance of Google’s algorithms not working the way they were intended to.
It may very well be that there are signals in Google’s algorithm that inadvertently favor big brands.
If I were to guess what kinds of signals are responsible, I would point to signals related to user preferences. The recent Navboost testimony in the Google antitrust lawsuit made clear that user interactions are an important ranking-related signal.
My speculation is that Google’s trust in user signals is having an inadvertent outcome, something I’ve been pointing out for years now (read Google’s Froot Loops Algorithm).
Read the discussion on Twitter:
What do BuzzFeed, Rolling Stone, Forbes, PopSci and Real Simple have in common?
Read the HouseFresh Article:
How Google is killing independent sites like ours
Featured Image by Shutterstock/19 STUDIO
FAQ
Does presenting a rigorous testing process in content influence Google’s ranking?
While describing a rigorous testing process and making claims of thoroughness in content may be beneficial for user perception, such claims alone do not influence Google’s rankings. Google’s response clarifies this:
- The algorithms focus on factors related to content usefulness as perceived by users, beyond just claims of in-depth testing.
- Claims of a “rigorous testing process” are not ranking signals in and of themselves.
- Content creators should focus on genuinely serving their audience’s needs and providing value, as this aligns with Google’s ranking principles.
What measures does Google take to check the accuracy of web page claims?
Google does not perform manual checks on the factual accuracy of claims made by webpages. Its algorithms evaluate content quality and relevance through automated ranking systems. Google’s E-E-A-T concept is designed to align with how it ranks useful content, but it does not involve any manual review unless a page with a manual spam action files a reconsideration request. That keeps factual scrutiny separate from the automated ranking mechanisms.