Google Search Advocate Daniel Waisberg recently presented an in-depth video on bulk data exports, a feature that allows you to export, store, and analyze Search Console data.

This new solution goes beyond the limits of the existing export methods and makes managing enormous data volumes far easier.

Here’s how.

An Overview of Current Data-Exporting Solutions

Before introducing the bulk data export feature, Waisberg recapped the existing methods to export Search Console data.

The most accessible way is through the user interface. You can directly export up to 1,000 rows of data with a simple click on the export button.

Looker Studio and the API provide solutions for people requiring larger data volumes. Both channels allow you to retrieve performance data, URL inspection data, sitemaps, and site data, with an export limit of up to 50,000 rows.

Introducing Bulk Data Export

The final and most advanced method to export data from Search Console is bulk data export.

This unique feature lets you extract vast amounts of data via Google BigQuery without row limits. This is beneficial for large websites with numerous pages or extensive traffic.

Waisberg states, “A bulk data export is a scheduled daily export of your Search Console performance data. It includes all the data used by Search Console to generate performance reports. Data is exported to Google BigQuery, where you can run SQL queries for advanced data analysis or even export it to another system.”
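As a sketch of the kind of SQL analysis this enables, the snippet below builds a query that aggregates clicks per search query from the exported performance data. The table name `searchdata_url_impression`, the default dataset name `searchconsole`, and the columns (`data_date`, `query`, `clicks`, `impressions`) follow Google's documented export schema but are not stated in this article, so verify them against your own dataset:

```python
def top_queries_sql(project_id: str, dataset: str = "searchconsole",
                    days: int = 28, limit: int = 25) -> str:
    """Build a BigQuery query for the top search queries by clicks.

    Assumes the exported table searchdata_url_impression with columns
    data_date, query, clicks, impressions (the documented export schema).
    """
    return f"""
    SELECT query, SUM(clicks) AS clicks, SUM(impressions) AS impressions
    FROM `{project_id}.{dataset}.searchdata_url_impression`
    WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL {days} DAY)
    GROUP BY query
    ORDER BY clicks DESC
    LIMIT {limit}
    """

# Paste the resulting SQL into the BigQuery console, or run it with the
# google-cloud-bigquery client library.
print(top_queries_sql("my-gcp-project"))
```

Because there are no row limits, a query like this can aggregate across every page and query the property has ever recorded since the export began.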

Setting Up Bulk Data Export

Given its complexity and power, bulk data export requires existing knowledge of the Google Cloud Platform, BigQuery, and Search Console.

Be aware that leveraging this tool might incur costs, so it’s crucial to consider the potential charges before setting up a new export.

Setting up a bulk data export involves steps in both Google Cloud and Search Console.

Step One: Google Cloud

First, switch to the relevant project in Google Cloud and ensure the BigQuery API is enabled.

  1. Open your Google Cloud Console and switch to the project you’re exporting data to.
  2. Navigate to APIs & Services > Enabled APIs & Services, and enable the BigQuery API if it isn’t already.
  3. Navigate to IAM & Admin, click + GRANT ACCESS, and paste search-console-data-export@system.gserviceaccount.com into New Principals.
  4. Grant two roles to this account: BigQuery Job User and BigQuery Data Editor, then Save.

Step Two: Search Console

In Search Console, complete the following steps:

  1. Navigate to Settings > Bulk data export.
  2. Input your Google Cloud project ID into the Cloud project ID field.
  3. Choose a dataset name. The default is ‘searchconsole’.
  4. Select a location for your dataset. This can’t be easily changed later.
  5. Click Continue to start the exports. The first export will happen up to 48 hours after successful configuration.
  6. Once tables are created, set a partition expiration if needed, but avoid altering the schema.
  7. For historical data preceding the initial setup, use Search Console API or reports.
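For that backfill, here is a minimal sketch of a request to the Search Console API's `searchanalytics.query` endpoint; the helper name and the chosen dimensions are illustrative, not from the article:

```python
from datetime import date

def backfill_request(start: date, end: date) -> dict:
    """Build a request body for the Search Console searchanalytics.query
    endpoint, used here to pull performance data that predates the export."""
    return {
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["date", "query", "page"],
        "rowLimit": 25000,  # the API caps rows per request; paginate with startRow
    }

body = backfill_request(date(2023, 1, 1), date(2023, 3, 31))
# Pass `body` to searchanalytics().query(siteUrl=..., body=body) using the
# google-api-python-client library, then load the rows into BigQuery.
```

Looping such requests over date ranges, then loading the results into the same dataset, lets historical data sit alongside the daily exports.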

Monitoring & Managing Data Exports

The new data export system includes a built-in way to monitor exports from within BigQuery: for example, each export run is recorded in an export log table.
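A sketch of such a monitoring query, assuming the export log table is named `ExportLog` with the fields Google documents for it (`agenda`, `namespace`, `data_date`, `epoch_version`, `publish_time`); check the actual table in your dataset:

```python
def export_log_sql(project_id: str, dataset: str = "searchconsole") -> str:
    """Build a BigQuery query listing the most recent export events.

    Assumes the ExportLog table that the bulk export is documented to
    create alongside the performance data tables.
    """
    return f"""
    SELECT agenda, namespace, data_date, epoch_version, publish_time
    FROM `{project_id}.{dataset}.ExportLog`
    ORDER BY publish_time DESC
    LIMIT 10
    """

print(export_log_sql("my-gcp-project"))
```

Checking this log is the quickest way to confirm that the daily export is still running and to see which day's data landed most recently.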

Note that data will accumulate indefinitely unless you set an expiration time. The export process continues until you manually deactivate it or Search Console encounters an issue.
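One way to cap that growth is a partition expiration on the exported tables. As a sketch, the snippet below builds the BigQuery `ALTER TABLE ... SET OPTIONS` DDL; the table name and the 480-day window are illustrative assumptions, not values from the article:

```python
def partition_expiration_sql(project_id: str, table: str, days: int,
                             dataset: str = "searchconsole") -> str:
    """Build BigQuery DDL that drops partitions older than `days` days.

    The table name is an assumption; apply this to each exported table
    whose history you want to cap.
    """
    return (
        f"ALTER TABLE `{project_id}.{dataset}.{table}` "
        f"SET OPTIONS (partition_expiration_days = {days})"
    )

print(partition_expiration_sql("my-gcp-project", "searchdata_url_impression", 480))
```

Expired partitions are deleted automatically, so storage costs stay flat once the window is reached.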

In case of any errors, Search Console will notify all property owners.

In Summary

The bulk data export feature can transform how you manage large amounts of Search Console data.

Stay tuned for upcoming content from Google that will delve deeper into processing data after setting up export and best practices for extracting data from BigQuery.


Source: YouTube

Featured image generated by the author using Midjourney.






By Rose Milev