Using WordPress ‘do_robotstxt’ PHP action

The do_robotstxt WordPress PHP action fires when WordPress serves its virtual robots.txt file. It runs before the default rules are generated, so anything you echo from a hooked callback appears at the top of the file. Note that the hook only fires when no physical robots.txt file exists in the site root; a physical file is served directly by the web server and bypasses WordPress entirely.

Usage

add_action('do_robotstxt', 'your_custom_function');
function your_custom_function() {
    // Your custom code here; anything echoed is written into robots.txt
}
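
If you later need to detach the callback (from a child theme or another plugin, for instance), the standard remove_action() call works as usual:

// Remove the callback; the arguments must match the original add_action() call
remove_action('do_robotstxt', 'your_custom_function');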

Parameters

  • None

More information

See WordPress Developer Resources: do_robotstxt (https://developer.wordpress.org/reference/hooks/do_robotstxt/)
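
Because the action fires before WordPress builds its default rules, it can only prepend output. If you need to modify or replace the rules WordPress generates itself, the related robots_txt filter receives the full default output instead. A minimal sketch (the path is a placeholder):

add_filter('robots_txt', 'my_modify_robots_output', 10, 2);
function my_modify_robots_output($output, $public) {
    // $public is '1' when the site allows search engines to index it
    $output .= "Disallow: /staging-area/\n"; // placeholder path
    return $output;
}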

Examples

Add a custom rule to robots.txt

Add a rule to disallow crawling of a specific directory. A Disallow line must belong to a User-agent group, so declare one first:

add_action('do_robotstxt', 'my_custom_robotstxt');
function my_custom_robotstxt() {
    echo "User-agent: *\n"; // a Disallow rule is only valid inside a User-agent group
    echo "Disallow: /my-private-directory/\n";
    echo "\n"; // blank line separates this group from the defaults WordPress appends
}
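
Because the hook fires first, this group is prepended to the rules WordPress generates itself. On a default, public install the resulting file looks roughly like this (the exact defaults vary by WordPress version):

User-agent: *
Disallow: /my-private-directory/

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/wp-sitemap.xml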

Allow crawling for all user agents

Allow all user agents to crawl the entire site:

add_action('do_robotstxt', 'allow_all_user_agents');
function allow_all_user_agents() {
    echo "User-agent: *\n"; // applies to every crawler
    echo "Allow: /\n";      // explicitly permit the whole site
}

Disallow crawling for a specific user agent

Disallow crawling for a specific user agent (e.g. Googlebot):

add_action('do_robotstxt', 'disallow_googlebot');
function disallow_googlebot() {
    echo "User-agent: Googlebot\n"; // rule applies only to Google's crawler
    echo "Disallow: /\n";           // block the entire site
}

Add a custom sitemap URL

Include a custom sitemap URL in the robots.txt file. Sitemap lines stand alone and don't need to belong to a User-agent group; note that WordPress 5.5+ already adds a Sitemap line for its built-in sitemap, so this is mainly useful for custom sitemap files:

add_action('do_robotstxt', 'add_custom_sitemap');
function add_custom_sitemap() {
    echo "Sitemap: https://example.com/my-custom-sitemap.xml\n";
}

Add crawl delay for all user agents

Add a crawl delay of 10 seconds for all user agents. Crawl-delay is a non-standard directive and some major crawlers, including Googlebot, ignore it:

add_action('do_robotstxt', 'add_crawl_delay');
function add_crawl_delay() {
    echo "User-agent: *\n";
    echo "Crawl-delay: 10\n";
}
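
The snippets above can be combined in a single callback. A sketch, using a placeholder bot name and the example paths from earlier; blank lines keep the groups separated:

add_action('do_robotstxt', 'my_combined_robots_rules');
function my_combined_robots_rules() {
    echo "User-agent: ExampleBot\n"; // placeholder bot name
    echo "Disallow: /\n";
    echo "\n";
    echo "User-agent: *\n";
    echo "Disallow: /my-private-directory/\n";
    echo "\n";
    echo "Sitemap: https://example.com/my-custom-sitemap.xml\n";
}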