How to monitor and manage indexing?

In Yandex Webmaster, you can:

  • Learn when the indexing bot crawled your site and what the result was.
  • Find out which pages of your site are included in the search, which were excluded, and for what reasons.
  • View page statistics in the search by site section.
  • See the status in the search for a specific page.
  • Understand the status in the search of important site pages.

In Yandex Webmaster, you can also control the rate at which Yandex bots crawl your site and check whether indexing rules are configured correctly in the robots.txt file.


Crawl statistics

To view the Yandex bot crawl statistics for your site, go to Indexing → Crawl statistics.

You can see the crawl history in the chart. The table shows the HTTP response codes that pages returned when the robot attempted to access them.
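
If you want to compare these codes with what your pages currently return, you can check a few URLs from your side. Below is a minimal sketch (it uses only the Python standard library and is not part of Yandex Webmaster); the example.com URLs are placeholders for pages of your own site.

# Minimal sketch: print the HTTP status code returned by a few pages,
# so you can compare it with the codes shown in the crawl statistics table.
# The URLs below are placeholders for your own pages.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

urls = [
    "https://example.com/",
    "https://example.com/catalog/",
    "https://example.com/old-page/",
]

for url in urls:
    request = Request(url, method="HEAD")  # HEAD is enough to read the status code
    try:
        with urlopen(request, timeout=10) as response:
            print(url, response.status)
    except HTTPError as error:             # 4xx and 5xx responses raise HTTPError
        print(url, error.code)
    except URLError as error:              # DNS failures, timeouts, and so on
        print(url, "request failed:", error.reason)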


Page check

To check the status of a specific page in the search, go to Indexing → Check page. Select Desktop or Mobile and enter the URL of the page you want to check. When the check is complete, click Learn more.


Searchable pages

To find out which pages of your site are included in the search, which were excluded, and for what reasons, go to Indexing → Searchable pages. Excluded pages, along with the reasons for their exclusion, are listed in the table on the Excluded pages tab.


Site structure

You can view statistics by site section on the Indexing → Site structure page. To see the details, click a number in a column cell: this takes you to the Crawl statistics or Searchable pages page.

For example, if you want to find out why not all crawled pages within a section are included in the search:

  1. Go to the Site structure page and click the number next to the section you need in the In search column.

  2. On the Searchable pages page, select a section from the drop-down list under the site domain and open the Excluded pages tab in the table.

    The reasons why pages were excluded from the search are shown in the Status column of the table. For more details, click the three dots.


Important page monitoring

To monitor the status of important pages, add them to the dedicated Yandex Webmaster tool: Indexing → Important page monitoring. You can add important pages manually, select them from the list of suggestions, or specify them on the Indexing → Reindex pages page.

To stay up to date on the latest changes, subscribe to the Important pages notifications.


Crawl rate

On the Indexing → Crawl rate page, you can control the number of requests per second that the indexing bot sends to your site.

By default, the Optimize automatically option is selected. This means that the crawl rate is calculated using algorithms so that the indexing bot can load the maximum number of pages without overloading the server.

Alternatively, you can set the crawl rate manually. For example, you can increase it if you've recently added many new sections and pages to your site. After these changes are indexed, you can reduce the crawl rate or select Optimize automatically.
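
When deciding whether to change the rate manually, it helps to see how often the bot actually requests pages from your server. The rough sketch below counts Yandex bot requests per second in a web server access log; the log path, the common/combined log format, and the "YandexBot" substring in the User-Agent are assumptions about a typical setup, not something Yandex Webmaster provides.

# Rough sketch: count requests per second made by the Yandex indexing bot,
# based on an access log in the common/combined format. The log path and
# the "YandexBot" check are assumptions about a typical server setup.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"   # adjust to your server
# Matches the "[10/Oct/2023:13:55:36 +0000]" timestamp used in common/combined logs.
TIMESTAMP = re.compile(r"\[(\d{2}/\w{3}/\d{4}:\d{2}:\d{2}:\d{2})")

per_second = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "YandexBot" not in line:      # keep only requests from the indexing bot
            continue
        match = TIMESTAMP.search(line)
        if match:
            per_second[match.group(1)] += 1

if per_second:
    peak_time, peak_hits = per_second.most_common(1)[0]
    average = sum(per_second.values()) / len(per_second)
    print(f"Peak: {peak_hits} requests at {peak_time}")
    print(f"Average: {average:.2f} requests per active second")
else:
    print("No YandexBot requests found in the log")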


The robots.txt file

The robots.txt file contains instructions for indexing bots on how to index your site. You can use this file to allow indexing of specific sections or individual pages of your site, or to prevent them from being indexed. This is useful when you need to reduce the load on your site, block pages with personal user information from indexing, or exclude duplicate pages from search results (for example, based on UTM tags).

In Yandex Webmaster, you can check if your robots.txt file is filled in correctly. To do this, go to the Tools → Robots.txt analysis page.

On the same page, you can check whether indexing is allowed or prohibited for specific site pages by using the Check if URLs are allowed field.
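
If you want to run a similar check locally, the sketch below uses Python's standard urllib.robotparser to test whether specific URLs are allowed by a set of rules. The rules and URLs are illustrative; the standard library parser only does simple prefix matching and ignores Yandex-specific directives, so the Robots.txt analysis page remains the authoritative check.

# Illustrative sketch: a local "is this URL allowed" check with the Python
# standard library. The rules and URLs are made up; urllib.robotparser does
# simple prefix matching only and ignores Yandex-specific extensions, so use
# the Robots.txt analysis tool for the authoritative answer.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /search/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for url in (
    "https://example.com/catalog/shoes/",
    "https://example.com/admin/settings",
    "https://example.com/search/?q=shoes",
):
    allowed = parser.can_fetch("YandexBot", url)
    print(url, "-> allowed" if allowed else "-> disallowed")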
