Google’s John Mueller answers four rapid-fire questions about common technical SEO issues that almost everyone runs into at one point or another.

Mueller addresses reader-submitted questions about:

  • Blocking CSS files
  • Updating sitemaps
  • Re-uploading a site to the web
  • Googlebot’s crawl budget

These questions are answered in the latest installment of the Ask Googlebot video series on YouTube.

Traditionally, those videos focus on answering one specific question with as much detail as Google is able to provide.


However, not every question about SEO takes a whole video to answer. Some can be answered in one or two sentences.

Here are some quick answers to questions that are often asked by people just getting started in SEO.

Can Blocking CSS Files In Robots.txt Affect Rankings?

Yes, blocking CSS can cause issues, and Mueller says you should avoid doing that.

When CSS is blocked in robots.txt, Googlebot is not able to render a page as visitors would see it.

Being able to see a page completely helps Google understand it better and confirm that it’s mobile-friendly.


That all contributes to a webpage’s ranking in search results.
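For illustration, here is a sketch of the kind of robots.txt rule Mueller warns against, along with an Allow directive that re-opens stylesheets (the directory paths are hypothetical examples, not rules from any real site):

```text
User-agent: *
# Avoid rules like this — they stop Googlebot from fetching stylesheets:
Disallow: /wp-content/themes/

# If a broad Disallow is unavoidable, CSS files can be explicitly re-allowed:
Allow: /*.css$
```

Google’s robots.txt parser supports the `*` wildcard and the `$` end-of-URL anchor, so the Allow line matches any URL ending in `.css` even inside the disallowed directory.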

How Should I Update The Sitemap For My Website?

There’s no single simple solution for updating sitemaps that works across all websites, Mueller says.

However, most website setups have built-in solutions of their own.

Consult your site’s help guides for a sitemap setting, or for a compatible plugin that creates sitemap files.

It’s usually just a matter of turning a setting on, and you’re all set.
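For sites without a built-in option, a sitemap is simply an XML file listing the site’s URLs. A minimal sketch, with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-09-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```

The file can then be pointed out to crawlers with a `Sitemap: https://www.example.com/sitemap.xml` line in robots.txt, or submitted through Google Search Console.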

What Is The Correct Way To Reintroduce A Site To Google?

It’s not possible to reset indexing for a website by deleting its files and re-uploading them.

Google will automatically focus on the newest version of a site and drop the old version over time.


You can move this process along faster by using redirects from any old URLs to the new ones.
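As a sketch of what such a redirect looks like, assuming an Apache server with an `.htaccess` file (the old and new paths are placeholders):

```apache
# .htaccess — permanently redirect an old URL to its new location
Redirect 301 /old-page/ https://www.example.com/new-page/
```

A 301 (permanent) redirect tells Google the old URL has moved for good, so signals are consolidated onto the new URL; other servers, such as nginx, have equivalent directives.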

Would Deleting RSS Feeds Improve Googlebot Crawling?

A person writes in to Mueller saying 25% of Googlebot’s crawl budget is going to the RSS feed URLs that are in the head of every page.


They ask if deleting the RSS feeds would improve crawling.

Mueller says the RSS feeds are not problematic, and Google’s systems balance crawling across a website automatically.

Sometimes that results in Google crawling certain pages more often, but pages will only be re-crawled after Googlebot has seen all the important pages at least once.
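For context, the feed references in question are typically declared in the head of each page like this (the title and URL are placeholders); per Mueller, they can stay in place and do not need to be removed:

```html
<link rel="alternate" type="application/rss+xml"
      title="Example Site Feed" href="https://www.example.com/feed/">
```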



Featured Image: Screenshot from YouTube.com/GoogleSearchCentral






By Rose Milev

