How to Build a robots.txt File

First of all, not every website needs a robots.txt file. The file exists to tell search engines which pages or directories of a site you want them to stay out of; without one, the entire site could conceivably be crawled and indexed. Its directives can address all bots at once or only specific ones.
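
As a rough illustration only (the directory names below are placeholders, not recommendations), a minimal robots.txt might look like this:

    # Rules that apply to every crawler
    User-agent: *
    Disallow: /admin/

    # Rules that apply to one specific bot only
    User-agent: Googlebot-Image
    Disallow: /photos/

For the file to be found at all, it has to sit at the root of the domain (for example, https://example.com/robots.txt).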

Depending on the site, of course, allowing a search engine to index any and all pages can present a problem. For instance, if an ecommerce site with an on-site search function doesn’t limit access at all, then every single search creates a new URL that could appear in Google’s search results (as well as in any other search engine’s SERPs). A site with only 400-ish actual pages can easily end up with tens of thousands of indexed URLs.
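
One common way to head that off is to disallow crawling of the internal search results themselves. The exact path and query parameter vary by platform, so the ones below are only placeholders:

    User-agent: *
    # Keep crawlers out of internal search result pages
    Disallow: /search/
    # Block URLs generated by the search query parameter
    Disallow: /*?q=

Note that the * wildcard inside a Disallow path is honored by the major engines such as Google and Bing, but it isn’t part of the original robots.txt standard, so behavior can differ for smaller crawlers.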