What is robots.txt?
Robots.txt is a file we'll place on your website that asks search engines to ignore specific files or pages. You might exclude a file or page because you prefer to keep it private, or because its content isn't relevant to your site's categorization as a whole. Implementing a robots.txt file does not guarantee that these files or pages will stay out of search engine results; it's more a way of telling the search engines, "you can ignore these files and pages over here."
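As a rough sketch, a robots.txt file is just a short plain-text file placed at the root of your site. The directory and page names below are placeholders, not paths from any real site:

```
# Apply these rules to all crawlers
User-agent: *

# Ask crawlers to skip a private directory and a single page
Disallow: /private/
Disallow: /drafts/old-page.html

# Everything not listed above remains crawlable
```

Well-behaved crawlers like Googlebot check this file before crawling, but compliance is voluntary, which is why robots.txt shouldn't be relied on to hide sensitive content.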





