Disallow and Allow directives
- Disallow
- Allow
- Combining directives
- Allow and Disallow directives without parameters
- Using the special characters * and $
- Examples of how directives are interpreted
Disallow
Use this directive to prohibit crawling site sections or individual pages. For example:
- Pages that contain confidential data.
- Pages with site search results.
- Site traffic statistics.
- Duplicate pages.
- Various logs.
- Database service pages.
Examples:
User-agent: Yandex
Disallow: / # prohibits crawling the entire site
User-agent: Yandex
Disallow: /catalogue # prohibits crawling the pages that start with /catalogue
User-agent: Yandex
Disallow: /page? # prohibits crawling the pages with URLs that contain parameters
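As the examples show, a plain Disallow value is matched as a URL-path prefix. A minimal Python sketch of that check (an illustration of the rule described here, not the robot's actual code):

def is_disallowed(path, rule):
    # A plain rule (no special characters) blocks every URL
    # whose path starts with the rule's value.
    return path.startswith(rule)

assert is_disallowed("/catalogue/item1", "/catalogue")
assert is_disallowed("/page?id=5", "/page?")
assert not is_disallowed("/about", "/catalogue")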
Allow
This directive allows indexing site sections or individual pages.
Examples:
User-agent: Yandex
Allow: /cgi-bin
Disallow: /
# prohibits downloading everything except pages
# that start with '/cgi-bin'
User-agent: Yandex
Allow: /file.xml
# allows downloading the file.xml file
Note. Empty line breaks aren't allowed between the User-agent, Disallow, and Allow directives.
Combining directives
The Allow
and Disallow
directives from the corresponding User-agent
block are sorted according to URL prefix length (from shortest to longest) and applied in order. If several directives match a particular site page, the robot selects the last one in the sorted list. This way the order of directives in the robots.txt file doesn't affect the way they are used by the robot.
Allow
directive takes precedence.# Source robots.txt:
User-agent: Yandex
Allow: /
Allow: /catalog/auto
Disallow: /catalog
# Sorted robots.txt:
User-agent: Yandex
Allow: /
Disallow: /catalog
Allow: /catalog/auto
# prohibits downloading pages that start with '/catalog',
# but allows downloading pages that start with '/catalog/auto'.
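The selection procedure described above can be expressed as a short Python sketch (an illustration of the stated rules only, not Yandex's implementation; wildcards are ignored here):

def is_allowed(path, rules):
    # rules: list of (directive, prefix) pairs from one User-agent block.
    # Sort by prefix length; on equal length, Allow sorts after Disallow,
    # so it takes precedence. The last matching rule wins.
    verdict = True  # if no rule matches, crawling is allowed
    for directive, prefix in sorted(rules, key=lambda r: (len(r[1]), r[0] == "Allow")):
        if path.startswith(prefix):
            verdict = (directive == "Allow")
    return verdict

rules = [("Allow", "/"), ("Allow", "/catalog/auto"), ("Disallow", "/catalog")]
assert is_allowed("/catalog/auto/bmw", rules)   # longest matching rule is Allow
assert not is_allowed("/catalog/vans", rules)   # longest matching rule is Disallow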
Common example:
User-agent: Yandex
Allow: /archive
Disallow: /
# allows everything that starts with '/archive'; the rest is prohibited
User-agent: Yandex
Allow: /obsolete/private/*.html$ # allows HTML files
# located at '/obsolete/private/...'
Disallow: /*.php$ # prohibits all '*.php' on this site
Disallow: /*/private/ # prohibits all sub-paths that contain
# '/private/', but the Allow above cancels
# part of the prohibition
Disallow: /*/old/*.zip$ # prohibits all '*.zip' files that contain
# '/old/' in the path
User-agent: Yandex
Disallow: /add.php?*user=
# prohibits all 'add.php?' scripts with the 'user' parameter
Allow and Disallow directives without parameters
If directives don't contain parameters, the robot handles the data as follows:
User-agent: Yandex
Disallow: # same as Allow: /
User-agent: Yandex
Allow: # isn't taken into account by the robot
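In a parser sketch like the one above, both cases come down to dropping the rule: an empty Disallow imposes no restriction, and an empty Allow is ignored, so neither may be treated as an empty prefix (an empty prefix would match every path). An illustrative helper under that assumption:

def normalize(directive, value):
    # Per the rules above: 'Disallow:' with no value is the same as
    # 'Allow: /', and 'Allow:' with no value is ignored. Either way,
    # the rule adds no restriction and can be dropped.
    if value == "":
        return None
    return (directive, value)

assert normalize("Disallow", "") is None
assert normalize("Allow", "") is None
assert normalize("Disallow", "/catalog") == ("Disallow", "/catalog")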
Using the special characters * and $
You can use the special characters * and $ to set regular expressions when specifying paths for the Allow and Disallow directives.
The * character indicates any sequence of characters (or none). Examples:
User-agent: Yandex
Disallow: /cgi-bin/*.aspx # prohibits '/cgi-bin/example.aspx'
# and '/cgi-bin/private/test.aspx'
Disallow: /*private # prohibits not only '/private',
# but also '/cgi-bin/private'
By default, the * character is appended to the end of every rule described in the robots.txt file. Example:
User-agent: Yandex
Disallow: /cgi-bin* # blocks access to pages
# that start with '/cgi-bin'
Disallow: /cgi-bin # the same
To cancel * at the end of the rule, use the $ character, for example:
User-agent: Yandex
Disallow: /example$ # prohibits '/example',
# but doesn't prohibit '/example.html'
User-agent: Yandex
Disallow: /example # prohibits both '/example',
# and '/example.html'
The $ character doesn't cancel an explicit * at the end of the rule, that is:
User-agent: Yandex
Disallow: /example$ # prohibits only '/example'
Disallow: /example*$ # same as 'Disallow: /example'
# prohibits both /example.html and /example
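These matching rules can be sketched in Python by translating a rule into a regular expression (an illustration only, not the robot's actual matcher):

import re

def rule_matches(rule, path):
    # '$' at the end anchors the match; otherwise a trailing '*' is implied.
    anchored = rule.endswith("$")
    body = rule[:-1] if anchored else rule
    # Escape everything except '*', which stands for any character sequence.
    pattern = ".*".join(re.escape(part) for part in body.split("*"))
    if anchored:
        pattern += "$"
    return re.match(pattern, path) is not None  # rules match from the path start

assert rule_matches("/example$", "/example")
assert not rule_matches("/example$", "/example.html")
assert rule_matches("/example", "/example.html")          # implicit trailing '*'
assert rule_matches("/cgi-bin/*.aspx", "/cgi-bin/private/test.aspx")
assert rule_matches("/*private", "/cgi-bin/private")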
Examples of how directives are interpreted
User-agent: Yandex
Allow: /
Disallow: /
# everything is allowed
User-agent: Yandex
Allow: /$
Disallow: /
# prohibits everything except the main page
User-agent: Yandex
Disallow: /private*html
# prohibits '/private*html',
# '/private/test.html', '/private/html/test.aspx', and so on.
User-agent: Yandex
Disallow: /private$
# prohibits only '/private'
User-agent: *
Disallow: /
User-agent: Yandex
Allow: /
# Since the Yandex robot selects the block
# whose 'User-agent:' line contains its name,
# the result is that everything is allowed
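The last example relies on how the robot chooses its User-agent block, which can be sketched as follows (an illustration of the selection rule, not the robot's code): a block addressed to 'Yandex' by name is preferred over the generic 'User-agent: *' block.

def pick_block(blocks, robot):
    # blocks: mapping from a User-agent value to its list of rules.
    # A block that names the robot wins over 'User-agent: *'.
    return blocks.get(robot, blocks.get("*", []))

blocks = {"*": [("Disallow", "/")], "Yandex": [("Allow", "/")]}
print(pick_block(blocks, "Yandex"))     # [('Allow', '/')] -> everything allowed
print(pick_block(blocks, "Googlebot"))  # [('Disallow', '/')] -> everything blocked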