Using the WordPress ‘do_robots()’ PHP function

The do_robots() WordPress PHP function displays the default robots.txt file content.

Usage

Here’s a general example of how you might use the do_robots() function. WordPress fires the do_robots action when a visitor requests /robots.txt and no physical robots.txt file exists in the site root, and core hooks do_robots() to that action by default. If you replace the default handler with your own callback, you can still call do_robots() from it to print the standard content:

// Swap the default robots.txt handler for our own callback.
remove_action( 'do_robots', 'do_robots' );
add_action( 'do_robots', 'custom_robots' );

function custom_robots() {
    do_robots(); // Outputs the default robots.txt content.
}

In this example, custom_robots() runs when the do_robots action fires and simply delegates to do_robots(). Avoid hooking a callback that calls do_robots() to the do_robotstxt action, since do_robots() fires that action itself and the call would recurse.

Parameters

This function has no parameters.

More Information

See WordPress Developer Resources: do_robots()

Examples

Basic Use

To output the robots.txt content, you can call the function like this:

do_robots();

This will output the default robots.txt file content.
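
On a typical public, single-site install running a recent WordPress version, the default content looks roughly like this (the Sitemap line only appears when core's built-in sitemaps are enabled, a site marked as private gets a blanket Disallow: / instead, and example.com stands in for your own domain):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/wp-sitemap.xml

Note that do_robots() sends a text/plain Content-Type header and echoes the content directly, so it is intended to run while serving the robots.txt request, not inside a normal page template.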

Disallow Specific File Types

To add disallow rules for specific file types in the robots.txt file, you could use the robots_txt filter:

add_filter( 'robots_txt', 'disallow_filetypes', 10, 2 );

function disallow_filetypes( $output, $public ) {
    // File extensions to block from crawling.
    $disallowed_filetypes = array( 'jpeg', 'jpg', 'gif', 'png', 'mp4', 'webm', 'woff', 'woff2', 'ttf', 'eot' );

    // Append one Disallow rule per extension to the generated robots.txt.
    foreach ( $disallowed_filetypes as $ext ) {
        $output .= "\nDisallow: /*.{$ext}$";
    }

    return $output;
}

This code appends a Disallow rule for each of the listed file types (.jpeg, .jpg, .gif, and so on).
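
With this filter in place, the generated robots.txt gains one rule per extension, for example:

Disallow: /*.jpeg$
Disallow: /*.jpg$
Disallow: /*.gif$

The trailing $ anchors each pattern to the end of the URL; the * and $ wildcards are conventions supported by major crawlers such as Googlebot and Bingbot rather than part of the original robots.txt specification, so other bots may ignore them.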

Disallow WP Login Page

To disallow access to the wp-login.php page, we can again use the robots_txt filter:

add_filter( 'robots_txt', 'disallow_login_page', 10, 2 );

function disallow_login_page( $output, $public ) {
    // Include the subdirectory path for installs that don't live at the domain root.
    $site_url = parse_url( site_url() );
    $path     = ( ! empty( $site_url['path'] ) ) ? $site_url['path'] : '';

    $output .= "\nDisallow: $path/wp-login.php";

    return $output;
}

This code adds a line to the robots.txt file to disallow access to the wp-login.php page.
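
For a WordPress site installed at the domain root, the appended line is simply:

Disallow: /wp-login.php

For a site installed in a subdirectory, the path prefix from site_url() is included, so an install at /blog (an illustrative path) would produce Disallow: /blog/wp-login.php.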

Remove Allow Rule for AJAX Interface

To remove the line that allows robots to access the AJAX interface, use the robots_txt filter:

add_filter( 'robots_txt', 'remove_ajax_access', 10, 2 );

function remove_ajax_access( $output, $public ) {
    // Strip the default "Allow: /wp-admin/admin-ajax.php" directive from the output.
    $output = preg_replace( '/Allow: [^\0\s]*\/wp-admin\/admin-ajax\.php/', '', $output );

    return $output;
}

This code removes the line from robots.txt that allows robots to access the admin AJAX interface.
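
Assuming the default output shown earlier, the directive text is stripped so that the block becomes roughly:

User-agent: *
Disallow: /wp-admin/

Because the regular expression matches only the directive itself and not its trailing newline, an empty line may remain where the Allow rule used to be; crawlers simply ignore it.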

Add Sitemap Link

To add a link to your sitemap in the robots.txt file, you can use the robots_txt filter:

add_filter( 'robots_txt', 'add_sitemap_link', 10, 2 );

function add_sitemap_link( $output, $public ) {
    // Build an absolute sitemap URL from the site's scheme and host.
    $site_url = parse_url( site_url() );

    $output .= "\nSitemap: {$site_url['scheme']}://{$site_url['host']}/sitemap_index.xml";

    return $output;
}

This code adds a link to your sitemap in the robots.txt file.
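
With this filter active, the robots.txt output ends with a line such as:

Sitemap: https://example.com/sitemap_index.xml

The sitemap_index.xml filename used here is an assumption that matches the convention of some SEO plugins (Yoast SEO, for example). WordPress core's own sitemap, available since version 5.5, lives at /wp-sitemap.xml and is already announced in robots.txt when it is enabled, so adjust the filename to whatever actually generates your sitemap.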