Google updated their spam policies for web search and their guide to ranking systems to clarify how Google handles sites that accumulate a high volume of removal requests for non-consensual explicit imagery.
The changes are to the policy that specifically mentions sites that charge for removal of negative information, but the guidance also states that Google will demote content on other sites that follow the same pattern of behavior.
Thus, a report about one site can trigger demotions of content on other sites with similar exploitative removal practices.
Background Of Google’s Policies On Non-Consensual Explicit Imagery
The kind of imagery Google is demoting sites for is intimate images shared publicly on a website without the consent of the person depicted.
Google has been removing “revenge porn” from their search index since 2015. These changes continue Google’s evolution of their spam and ranking policies.
Changes to Google’s Spam Policies
These are the changes to Google’s spam and demotion guidance:
Original wording:
“If we process a high volume of personal information removals involving a site with exploitative removal practices, we demote other content from the site in our results.
We also look to see if the same pattern of behavior is happening with other sites and, if so, apply demotions to content on those sites.
We may apply similar demotion practices for sites that receive a high volume of doxxing content removals.
Furthermore, we have automatic protections designed to prevent non-consensual explicit personal images from ranking highly in response to queries involving names.”
This was added:
“removals or non-consensual explicit imagery removals.”
The new version now reads, with the added words at the end:
“If we process a high volume of personal information removals involving a site with exploitative removal practices, we demote other content from the site in our results.
We also look to see if the same pattern of behavior is happening with other sites and, if so, apply demotions to content on those sites.
We may apply similar demotion practices for sites that receive a high volume of doxxing content removals or non-consensual explicit imagery removals.”
Perhaps of interest to some is the removal of a reference to automatic protections designed to keep this kind of content from ranking highly.
This is what was removed:
“Furthermore, we have automatic protections designed to prevent non-consensual explicit personal images from ranking highly in response to queries involving names.”
Why did Google remove that passage?
Is it because it said too much or because the system no longer exists? Or was it removed because it was redundant with the part that already mentions demotions?
I think it’s the latter, that it was removed because it was redundant.
Search Ranking Systems Guidance Updated
A similar edit was made to Google’s Guide to Search Ranking Systems, where the same sentence about the “automatic protections” was entirely removed, possibly because it was redundant.
But new wording was added to the last sentence, which details what triggers demotions in Google’s search results.
The additional reason for demotion is when a site experiences a high volume of “non-consensual explicit imagery removals.”
The updated passage, with the sentence about the “automatic protections” removed, now reads like this:
“Personal information removals: If we process a high volume of personal information removals involving a site with exploitative removal practices, we demote other content from the site in our results. We also look to see if the same pattern of behavior is happening with other sites and, if so, apply demotions to content on those sites. We may apply similar demotion practices for sites that receive a high volume of doxxing content removals or non-consensual explicit imagery removals.”