This is an excerpt from SEJ’s Ranking Factors 2023 ebook with changes and updates to bring it up to date. SEO changes quickly!
Ranking factors are getting more difficult to fully categorize.
Today, Google uses the terms “systems” and “signals” more than “ranking factors.”
Google says, about how it ranks results:
“Google uses automated ranking systems that look at many factors and signals about hundreds of billions of web pages and other content in our Search index to present the most relevant, useful results, all in a fraction of a second.”
There are multiple ranking systems, and they all make use of different combinations of signals.
Google is (and has been for some time) shifting away from a model where a collection of quantitative factors determines ranking.
Instead, Google is building collections of qualitative signals that come together to approximate bigger – human – questions and decisions.
Many SEO professionals are numbers people. Researchers. Data divers. Google releases a little bit of information about its algorithms, and we cling like limpets.
For many years, some have even attempted to interpret clues from patents to decipher the algorithmic impact of everything from social media to co-citation.
But Google patents aren’t the Constitution.
No ultimate document holds the secrets to the ranking algorithms – though I’d love to see a heist movie about stealing it from Google HQ. (We all know Nicolas Cage would take part.)
Interpreting patents is a good skill and can provide important insights.
But you should weigh the business impact of obsessing over individual elements against leaning into understanding your audience.
As algorithms get more complex and AI becomes more advanced, it’s only going to become more difficult to pinpoint the exact sources of data they use to make decisions.
Ranking factors aren’t going away; they’re evolving.
The cornerstones of ranking will always be there, but the more complexity gets added to the systems, the less it benefits us to interrogate every potential signal.
What The Heck Happened With “Page Experience” & What’s A Ranking System?
In April 2023, Google moved several entries from its “ranking systems” documentation and placed them elsewhere:
- Page experience.
- Mobile-friendliness.
- Page speed.
- Security and HTTPS.
Several SEO pros lost their collective cool over this change.
Google’s Search Liaison account on X (formerly Twitter) shared this statement:
“Our guidance on page experience is here, as we shared last week along with our blog post:
https://developers.google.com/search/docs/appearance/page-experience
It does *not* say page experience is somehow ‘retired’ or that people should ignore things like Core Web Vitals or being mobile-friendly. The opposite. It says if you want to be successful with the core ranking systems of Google Search, consider these and other aspects of page experience.
We also made an update to our page on ranking systems last week. Ranking *systems* are different than ranking *signals* (systems typically make use of signals). We had some things listed on that page relating to page experience as “systems” that were actually signals. They shouldn’t have been on the page about systems.
Taking them off didn’t mean we no longer consider aspects of page experience. It just meant these weren’t ranking *systems* but instead signals used by other systems.
…
The big takeaway? As our guidance on page experience says in the first sentence:
‘Google’s core ranking systems look to reward content that provides a good page experience.’ … ”
This seems to mean that the changes were a matter of organization and not any functional algorithm adjustment.
A ranking system is a broad application of signals that go toward a specific goal or evaluation.
Ranking systems can use ranking signals, but not necessarily all the time or for every query.
“Page experience” is not a ranking system.
However, it is a collection of ranking signals that multiple ranking systems can and do use to evaluate and reward pages with good user experience.
Click Data – The Antitrust Lawsuit & CTR As A Ranking Factor
A software engineer who left Google in November 2022 was called to give testimony during the antitrust suit against Google.
I started seeing chatter all over social media about his smoking gun statement on click data in ranking.
His testimony called attention to the probability that Google uses clicks and other SERP interaction data in its ranking algorithms, and that Google is evasive about this to keep SEO professionals from influencing the rankings.
This data may not be used for much longer. As Law360 reported, the former Googler testified that the “situation is changing rapidly” and that Google now has systems that can be trained just as well without user data.
“Great,” I said to myself, “How many conclusions do I need to reassess?”
Thankfully, none so far. My first thought was CTR, but we’re still dubious about CTR as a ranking factor, even with the new information.
There’s a difference between live ranking signals and data used for analysis.
Ex-Google Search Quality team member Pedro Dias has a great take on this, saying in a LinkedIn post,
“There’s a difference between:
- directly using a signal in rankings;
- looking at the data and assess which parts could be useful for rankings”
Using data to analyze results and train algorithms is much, much different from using it live in result delivery. These signals are more likely used for training and evaluation purposes than live results ordering.
Instead of focusing on click metrics as a direct ranking signal, consider them a measure of how users interact with your page – because that is what matters. Either way, they can be considered important.
If you’re focusing on what matters – content, authority, user experience – then whether CTR and other user behavior is a ranking factor shouldn’t change your overall strategy.
You don’t have control over click data; you can only use it for measurement.
While there is increasing reason to believe that “click data” is used in search as a feedback mechanism, it’s not helpful for you to focus on it as a needle to move. Use it the way Google does: as an assessment tool.
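If you do treat click data as an assessment tool, a small script can turn an export into per-query diagnostics. Here is a minimal sketch that computes CTR from a performance report in CSV form; the column names (`query`, `clicks`, `impressions`) are assumptions and should be adjusted to match your actual export:

```python
import csv
from io import StringIO

def ctr_by_query(csv_text):
    """Compute click-through rate per query from a CSV performance export.

    Assumes columns named 'query', 'clicks', and 'impressions' --
    hypothetical names; adjust to whatever your export actually uses.
    """
    rates = {}
    for row in csv.DictReader(StringIO(csv_text)):
        impressions = int(row["impressions"])
        if impressions > 0:  # skip queries with no impressions
            rates[row["query"]] = int(row["clicks"]) / impressions
    return rates

# Toy data standing in for a real export.
sample = """query,clicks,impressions
ranking factors,50,1000
page experience,5,500
"""
print(ctr_by_query(sample))
```

The point of a script like this is diagnosis, not optimization: a low CTR on a high-impression query flags a title or snippet worth revisiting, rather than a number to chase.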
User Signals In Search
With each new revelation and event, the question of how Google uses user data seems to become more open to speculation.
When it comes to Google ending its contract with Appen, its long-time quality-rating vendor, I can see arguments in both directions. It could be that Google plans to rely on automated algorithms and aggregate user data instead of human quality ratings.
Or this could simply speak to a cost-cutting decision in the midst of layoffs and unfavorable legal judgments.
As for the declining quality of search results, in my opinion, that’s an argument against the idea that user behavior data is a ranking factor.
People are unsatisfied with search results, and in quite large numbers.
This being the case, an algorithm that accounts for user behavior should see this and adjust, right? This presents four alternative situations in my mind:
- The algorithms are, to use a technical term, completely borked.
- User behavior and click data are not direct ranking signals.
- Both of the above.
- The fourth situation requires reading into a recent Google announcement about the upcoming Gemini AI model and speculating about its meaning. At the end of this post, we find this:
“We’re already starting to experiment with Gemini in Search, where it’s making our Search Generative Experience (SGE) faster for users, with a 40% reduction in latency in English in the U.S., alongside improvements in quality.”
There are two things going on here:
- “We’re already starting to experiment with Gemini in Search …”
- “… making our Search Generative Experience (SGE) faster …”
Gemini is at least in Labs. Are some elements of it in live Search too?
Will a Gemini release herald an SGE release?
This is happening fast. Google could well have decided that the current algorithms aren’t capable of solving the current issues and is, instead, moving ahead as quickly as possible with Gemini. This could change what we know about ranking signals and systems.
Will Google Use Click / Behavior Data As Ranking Signals In The Future?
There is still an argument that Google uses, or at least would like to use, behavioral data to rank content.
In fact, it’s objectively true that it already does this in YouTube search.
Engagement is one of the three pillars of YouTube search. On YouTube, user engagement signals, in aggregate, directly impact a video’s ranking on the platform.
In explaining how the YouTube search algorithm works, the documentation says:
“At YouTube Search, we prioritize three main elements to provide the best search results: relevance, engagement and quality. These three elements are given differing importance based on the type of search.
To estimate relevance we look into many factors, such as how well the title, tags, description, and video content match your search query.
Engagement signals are a valuable way to determine relevance. We incorporate aggregate engagement signals from users, i.e. we may look at the watch time of a particular video for a particular query to determine if the video is considered relevant to the query by other users.
Finally, for quality, our systems are designed to identify signals that can help determine which channels demonstrate expertise, authoritativeness, and trustworthiness on a given topic.”
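The “aggregate engagement signals” YouTube describes – e.g., watch time for a particular video on a particular query – amount to pooling many users’ interactions into one relevance estimate. As a toy illustration of that idea (the event structure and names here are my own assumptions, not anything from YouTube’s systems):

```python
from collections import defaultdict

def avg_watch_time(events):
    """Average watch time per (query, video) pair.

    'events' is a list of (query, video_id, seconds_watched) tuples --
    a hypothetical stand-in for aggregate engagement logs. Higher average
    watch time for a query suggests the video satisfies that query.
    """
    totals = defaultdict(lambda: [0.0, 0])  # (query, video) -> [sum, count]
    for query, video, seconds in events:
        acc = totals[(query, video)]
        acc[0] += seconds
        acc[1] += 1
    return {key: total / count for key, (total, count) in totals.items()}

# Toy data: two users watched vid1 for this query, one bounced off vid2.
events = [
    ("seo basics", "vid1", 120.0),
    ("seo basics", "vid1", 60.0),
    ("seo basics", "vid2", 10.0),
]
print(avg_watch_time(events))
```

The aggregation is the key design point: no single viewer’s behavior decides anything, but the pooled signal separates videos that hold attention from ones that don’t.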
In its documentation for creators about how to grow a channel, YouTube says this:
“Insider tip: Our algorithm doesn’t pay attention to videos, it pays attention to viewers.
So, rather than trying to make videos that’ll make an algorithm happy, focus on making videos that make your viewers happy.”
This is a pretty good indication that Google would absolutely use behavior and click signals in search if it could do so reliably.
Therein lies the problem. On YouTube, all the data it needs is right there, contained inside the platform.
This isn’t the case for Google Search because not all websites use Google Analytics, and not all users use Chrome.
In addition, it’s much easier to interpret positive and negative engagement behaviors with video than it is with text.
I believe these two things to be true:
- Google knows that direct user feedback is the best way to determine whether content is “good” and would implement this into live results ordering in Search if it could.
- Currently, and previously, this was not achievable algorithmically.
Who knows, maybe further development of AI will present new solutions.
This is a very roundabout way of saying:
User behavior data is probably used in Search to fine-tune and evaluate results, but probably not to make in-the-moment delivery decisions. Even if it were used this way, it shouldn’t matter much to you, because you can only control engagement by making better content – which should be your goal anyway.
The more interesting question right now is how the heck do we, as SEO professionals, advise people to stand by content best practices while the search results seem to reward spam?
Still working on that one.
Featured Image: Paulo Bobita/Search Engine Journal