
do_robots() WP 2.1.0

Displays the content of the robots.txt file and sets the appropriate HTTP headers. Used to generate a dynamic robots.txt file.

It's better to use this function for robots.txt creation, because then plugins can modify its content.

The function fires the do_robotstxt action at the beginning and applies the robots_txt filter (which filters the robots.txt content) at the end.

You can read more about robots.txt in my article.

The function sends HTTP headers, so calling it after the HTTP headers have already been sent will cause an error.
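If you call do_robots() yourself (for example, from a custom endpoint), a minimal sketch of a guarded call might look like this. Note that can_send_robots_txt() is a hypothetical helper name of my own, and the function_exists() check only keeps the snippet self-contained outside of WordPress:

```php
<?php
// Hypothetical helper (name is my own): decide whether do_robots()
// may still be called safely in the current request.
function can_send_robots_txt() {
	return ! headers_sent();
}

// do_robots() calls header(), so it must run before any output.
if ( function_exists( 'do_robots' ) && can_send_robots_txt() ) {
	do_robots(); // sets the Content-Type header and prints the rules
	exit;
}
```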

Hooks from the function: do_robotstxt, robots_txt.

Return

Nothing (null).


#1 Create dynamic robots.txt

Add this code to functions.php (in this case we don't need to create a physical robots.txt file):

add_action( 'do_robotstxt', 'my_robotstxt' );

function my_robotstxt() {
	echo 'User-agent: *' . PHP_EOL;
	echo 'Disallow: /wp-admin/' . PHP_EOL;
	echo 'Disallow: /wp-includes/' . PHP_EOL;

	die; // stop further PHP execution so core output is not appended
}

Now, when going to http://example.com/robots.txt, the following text will be displayed:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

#2 Modify robots.txt content using robots_txt filter

Let's add "Disallow: */comments" rule.

add_filter( 'robots_txt', 'add_robotstxt' );

function add_robotstxt( $text ) {
	$text .= "Disallow: */comments\n";
	return $text;
}

So now http://example.com/robots.txt contains the following text (matching the WP 5.2 output shown in the source code below):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: */comments
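Since WP 3.0.0, the robots_txt filter also receives a second argument, $public (the blog_public option). A hedged sketch that uses it to append a Sitemap line only on public sites; the callback name and sitemap URL are my own illustrative assumptions:

```php
<?php
// Sketch: append a Sitemap line only when the site is public.
// The callback name and sitemap URL are illustrative assumptions.
function my_robots_add_sitemap( $output, $public ) {
	if ( '0' != $public ) {
		$output .= "Sitemap: https://example.com/sitemap.xml\n";
	}
	return $output;
}

// Guarded so the snippet also parses outside of WordPress.
if ( function_exists( 'add_filter' ) ) {
	add_filter( 'robots_txt', 'my_robots_add_sitemap', 10, 2 );
}
```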

Code of do_robots(): wp-includes/functions.php WP 5.2.1

function do_robots() {
	header( 'Content-Type: text/plain; charset=utf-8' );

	/**
	 * Fires when displaying the robots.txt file.
	 *
	 * @since 2.1.0
	 */
	do_action( 'do_robotstxt' );

	$output = "User-agent: *\n";
	$public = get_option( 'blog_public' );
	if ( '0' == $public ) {
		$output .= "Disallow: /\n";
	} else {
		$site_url = parse_url( site_url() );
		$path     = ( ! empty( $site_url['path'] ) ) ? $site_url['path'] : '';
		$output  .= "Disallow: $path/wp-admin/\n";
		$output  .= "Allow: $path/wp-admin/admin-ajax.php\n";
	}

	/**
	 * Filters the robots.txt output.
	 *
	 * @since 3.0.0
	 *
	 * @param string $output Robots.txt output.
	 * @param bool   $public Whether the site is considered "public".
	 */
	echo apply_filters( 'robots_txt', $output, $public );
}
