Recently, one of our readers asked us for tips on how to optimize the robots.txt file to improve SEO. The robots.txt file tells search engines how to crawl your website, which makes it an incredibly powerful SEO tool. In this article, we will show you how to create a perfect robots.txt file for SEO.

Using WordPress robots.txt file to improve SEO

What is a robots.txt file?

Robots.txt is a text file that website owners can create to tell search engine bots how to crawl and index pages on their website.

It is typically stored in the root directory, also known as the main folder, of your website. The basic format for a robots.txt file looks like this:

User-agent: [user-agent name]
Disallow: [URL string not to be crawled]

User-agent: [user-agent name]
Allow: [URL string to be crawled]

Sitemap: [URL of your XML Sitemap]

You can have multiple lines of instructions to allow or disallow specific URLs and add multiple sitemaps. If you do not disallow a URL, then search engine bots assume that they are allowed to crawl it.

Here is what a robots.txt example file can look like:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml

In the above robots.txt example, we have allowed search engines to crawl and index files in our WordPress uploads folder.

After that, we have disallowed search bots from crawling and indexing the plugins and WordPress admin folders.

Lastly, we have provided the URL of our XML sitemap.

Do You Need a Robots.txt File for Your WordPress Website?

If you don't have a robots.txt file, then search engines will still crawl and index your website. However, you will not be able to tell search engines which pages or folders they should not crawl.

This will not have much of an impact when you are first starting a blog and do not have a lot of content.

However, as your website grows and you have a lot of content, you will likely want better control over how your website is crawled and indexed.

Here is why.

Search bots have a crawl quota for each website.

This means that they crawl a certain number of pages during a crawl session. If they don't finish crawling all the pages on your site, then they will come back and resume crawling in the next session.

This can slow down your website's indexing rate.

You can fix this by disallowing search bots from attempting to crawl unnecessary pages like your WordPress admin pages, plugin files, and themes folder.

By disallowing unnecessary pages, you save your crawl quota. This helps search engines crawl even more pages on your site and index them as quickly as possible.
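
For instance, a minimal sketch of that kind of rule set could look like the lines below. The Allow line for admin-ajax.php is an extra assumption on our part, not part of the recommended file later in this article; many themes and plugins request that file from the front end, so it is usually safer to keep it crawlable.

User-agent: *
# Block the admin area and plugin files to save crawl quota
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
# Optional (our assumption): keep admin-ajax.php crawlable since front-end features may rely on it
Allow: /wp-admin/admin-ajax.php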

Another good reason to use a robots.txt file is when you want to stop search engines from indexing a post or page on your website.

It is not the safest way to hide content from the general public, but it will help you prevent those pages from appearing in search results.
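
As a quick illustration, a rule like the one below asks bots not to crawl a single page. The /private-page/ path is just a hypothetical placeholder; replace it with the path of the post or page you want to keep out of search.

User-agent: *
# Hypothetical example path; substitute your own post or page URL
Disallow: /private-page/

Keep in mind that a disallowed URL can still show up in search results if other sites link to it, so for truly sensitive content use password protection or a noindex tag instead.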

What Should an Ideal Robots.txt File Look Like?

Many popular blogs use a very simple robots.txt file. Its content may vary, depending on the needs of the specific site:

User-agent: *
Disallow:

Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml

This robots.txt file allows all bots to index all content and gives them a link to the website's XML sitemaps.

For WordPress sites, we recommend the following rules in the robots.txt file:

User-Agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: http://www.example.com/post-sitemap.xml
Sitemap: http://www.example.com/page-sitemap.xml

This tells search bots to index all WordPress images and files. It disallows search bots from indexing WordPress plugin files, the WordPress admin area, the WordPress readme file, and affiliate links.

By adding sitemaps to the robots.txt file, you make it easy for Google bots to find all the pages on your website.

Now that you know what an ideal robots.txt file looks like, let's take a look at how you can create a robots.txt file in WordPress.

How to Create a Robots.txt File in WordPress?

There are two ways to create a robots.txt file in WordPress. You can choose the method that works best for you.

Method 1: Editing the Robots.txt File Using Yoast SEO

If you are using the Yoast SEO plugin, then it comes with a robots.txt file generator.

You can use it to create and edit a robots.txt file directly from your WordPress admin area.

Simply go to the SEO » Tools page in your WordPress admin and click on the File Editor link.

File editor tool in Yoast SEO

On the next page, Yoast SEO will show your existing robots.txt file.

If you don't have a robots.txt file, then Yoast SEO will generate one for you.

Create robots.txt file using Yoast SEO

By default, Yoast SEO's robots.txt file generator will add the following rules to your robots.txt file:

User-agent: *
Disallow: /

It is important that you delete this text because it blocks all search engines from crawling your website.
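
The difference comes down to a single slash. Compare these two sketches:

User-agent: *
# A slash after Disallow blocks bots from the entire site
Disallow: /

User-agent: *
# An empty Disallow rule blocks nothing, so bots can crawl everything
Disallow: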

After deleting the default text, you can go ahead and add your own robots.txt rules. We recommend using the ideal robots.txt format we shared above.

Once you're done, don't forget to click on the ‘Save robots.txt file’ button to store your changes.

Method 2: Edit the Robots.txt File Manually Using FTP

For this method, you will need to use an FTP client to edit the robots.txt file.

Simply connect to your WordPress hosting account using an FTP client.

Once inside, you will be able to see the robots.txt file in your website's root folder.

Editing WordPress robots.txt file using FTP

If you don't see one, then you likely don't have a robots.txt file. In that case, you can just go ahead and create one.

Create robots.txt file using FTP

Robots.txt is a plain text file, which means you can download it to your computer and edit it using any plain text editor like Notepad or TextEdit.

After saving your changes, you can upload it back to your website's root folder.
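
If you are starting from a blank file, a minimal sketch like the one below (with example.com standing in for your own domain and a Yoast-style sitemap URL assumed) is a safe starting point that you can expand with the recommended rules shared earlier:

User-agent: *
# Nothing is disallowed, so all well-behaved bots may crawl the whole site
Disallow:

# Point bots to your XML sitemap (adjust the URL to match your own setup)
Sitemap: https://www.example.com/sitemap_index.xml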

How to Test Your Robots.txt File?

Once you have created your robots.txt file, it is always a good idea to test it using a robots.txt tester tool.

There are many robots.txt tester tools out there, but we recommend using the one inside Google Search Console.

Simply log in to your Google Search Console account, and then switch to the old Google Search Console website.

Switch to old Google Search Console

This will take you to the old Google Search Console interface. From here, you need to launch the robots.txt tester tool located under the ‘Crawl’ menu.

Robots.txt tester tool

The tool will automatically fetch your website's robots.txt file and highlight any errors and warnings it finds.

Final Thoughts

The goal of optimizing your robots.txt file is to prevent search engines from crawling pages that are not publicly available. For example, pages in your wp-plugins folder or pages in your WordPress admin folder.

A common myth among SEO experts is that blocking WordPress category, tag, and archive pages will improve crawl rate and result in faster indexing and higher rankings.

This is not true. It is also against Google's webmaster guidelines.

We recommend that you follow the above robots.txt format to create a robots.txt file for your website.

We hope this article helped you learn how to optimize your WordPress robots.txt file for SEO. You may also want to see our ultimate WordPress SEO guide and the best WordPress SEO tools to grow your website.