Google’s Martin Splitt addressed a question about website trustworthiness and whether competitors can negatively impact it. He explained how Google assesses trustworthiness and why third-party links and traffic can’t damage Google’s perception of a site.

Trustworthiness

Googlers, research papers, and patents mention the trustworthiness of websites, but there is no actual trust metric in use at Google. Google confirmed long ago that multiple signals together indicate whether a site can be trusted, but that’s not a trust algorithm; they are just signals.

When Googlers talk about whether a site is trustworthy, it’s best not to overthink it: they mean it in the plain, everyday sense, not as a reference to a specific metric.

Can A Competitor Create Negative Trustworthiness Signals?

The person asking the question was worried about a competitor sending bot traffic to their site in what they believed was an attempt to make the site appear untrustworthy to Google’s algorithms.

That concern may stem from the SEO idea that Google uses click metrics to rank web pages. Most research papers about clicks, however, describe using them to validate search results rather than to rank pages; it’s generally a quality-assurance measure.

This is the question that was asked:

“Do I have to be concerned about bad actors trying to make our site appear untrustworthy by sending spam or fake traffic to my site? Since site trustworthiness is binary.”

Binary means one of two states. In this case, the person asking the question likely means that a site is either trustworthy or untrustworthy, with no gray area in between.

Martin Splitt downplayed the idea that trustworthiness is binary and flatly denied that traffic can influence how Google sees a site.

He answered:

“It’s not really binary and just by sending traffic from questionable sources to a site, that won’t be ‘tainted’.”

“Spam or fake traffic” is not something that can negatively influence trust.

Martin explained that if a site itself is spammy, it will be seen as spammy. He then confirmed that what other sites do, whether linking to a site or sending it traffic, has no effect on whether that site looks spammy.

He continued:

“If a site itself does shady things, such as spam, malware, sure, that’s a problem, but nobody gets to choose or control where traffic or links are coming from, so that’s not something Google Search will look at to judge a website’s trustworthiness.”

Bot Traffic Doesn’t Affect How Google Sees A Site

Pretty much every website receives a steady stream of hacker bots probing for vulnerabilities, and some bots repeatedly request pages that don’t exist. That’s simply the state of the web; no site is exempt.

So Martin’s point that third parties can’t make another site appear untrustworthy makes sense, especially given that every site accumulates low-quality inbound links and low-quality bot traffic.
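If you’re curious about the level of this background noise on your own site, server access logs make it easy to measure. Below is a minimal sketch, not an official tool, that assumes a standard combined-format access log; the path access.log is a placeholder for your server’s actual log file. It tallies which client IPs generate the most 404 responses, since a single IP racking up large numbers of misses is the typical signature of a bot probing for non-existent pages:

import re
from collections import Counter

# Matches the start of a combined-format access log line:
# client IP, then "METHOD /path HTTP/x.x" and the status code.
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:\S+) (\S+) [^"]*" (\d{3})')

def top_404_clients(log_path, limit=10):
    """Return the client IPs with the most 404 responses, with a sample path each."""
    counts = Counter()
    sample_paths = {}
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LINE_RE.match(line)
            if not match:
                continue  # skip lines that aren't in combined format
            ip, path, status = match.groups()
            if status == "404":
                counts[ip] += 1
                sample_paths.setdefault(ip, path)  # remember first missing path seen
    return [(ip, n, sample_paths[ip]) for ip, n in counts.most_common(limit)]

if __name__ == "__main__":
    # "access.log" is a placeholder; point this at your server's real log.
    for ip, hits, path in top_404_clients("access.log"):
        print(f"{ip}: {hits} misses (e.g. {path})")

Running this against almost any production log will surface a handful of IPs requesting paths like old admin panels or plugin files that were never installed, which illustrates why Google can’t treat such traffic as a trust signal.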

Watch the SEO Office Hours podcast at the 18:48 minute mark.

Featured Image by Shutterstock/Krakenimages.com


