Why Your Site Needs a Robots.txt File – Best Practice


10 December 2020 |

SEM and SEO

What is a Robots.txt file, and how do you make sure yours is following best practice to communicate effectively with Google?

A Robots.txt file is a simple text file uploaded to the root directory of your website (e.g. https://dev.admatic.com/robots.txt). It gives search engine crawlers, including Googlebot, a basic set of instructions on what they are and aren't allowed to access on your site.

This simple file is the first thing search engine bots look for when they arrive at your domain to crawl it. Without a Robots.txt file, you not only leave Google guessing about what it should and shouldn't crawl, but also miss a simple opportunity to point it directly to the key pages you'd like to rank.

So what does best practice for a Robots.txt file involve? While there’s a plethora of instructions that can be included, at the most basic level you want to point Google straight to your XML sitemap. The XML sitemap contains a full list of all site pages you would like to rank, and offers a way for Google to bypass crawling menus to locate these.

Does your website menu use animated drop-downs and roll-outs to improve the design for users? Then the Robots.txt directive pointing to your XML sitemap becomes even more important, as Google's bots may struggle to execute these animations and follow the links through to the key pages on your site.

The following three lines of code are all that is needed to create a best practice Robots.txt file. The star symbol sets the 'User-Agent' rules to apply to all crawlers (Google included), the 'Sitemap' link points them to your list of pages, and the empty 'Disallow' line indicates that there are no pages off-limits:

User-Agent: *

Disallow:

Sitemap: https://www.dev.admatic.com/sitemap_index.xml
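If you'd like to sanity-check your rules before uploading the file, Python's standard library includes a robots.txt parser. This is a minimal sketch using placeholder URLs (example.com stands in for your own domain), parsing the three-line file above to confirm that an empty 'Disallow' leaves every page crawlable:

```python
from urllib.robotparser import RobotFileParser

# Parse the robots.txt rules directly as lines
# (the domain and sitemap URL here are placeholders)
rp = RobotFileParser()
rp.parse([
    "User-Agent: *",
    "Disallow:",
    "Sitemap: https://www.example.com/sitemap_index.xml",
])

# An empty Disallow means all paths are allowed for every user agent
print(rp.can_fetch("Googlebot", "https://www.example.com/any-page"))  # True
```

The same parser can point at your live file via `rp.set_url(...)` and `rp.read()` once the file is uploaded.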

Should you wish to block the search engines from accessing certain parts of your website, list those pages and/or their directories next to 'Disallow', one per line. Typically this would include any secure pages on your website, such as customer login and logout pages, or the checkout steps of an ecommerce purchase.
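As a sketch of what that looks like (the paths here are hypothetical examples, not part of the file above), a Robots.txt blocking a login area and checkout directory while leaving the rest of the site open would read:

User-Agent: *

Disallow: /login/

Disallow: /checkout/

Sitemap: https://www.example.com/sitemap_index.xml

Each 'Disallow' line covers that path and everything beneath it, so a single directory entry is enough to cover all of its pages.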

Want to get a Robots.txt file uploaded for your site but not sure where to start? Get in touch with ADMATIC today to find out how we can assist with your SEO strategy.