How to find the robots.txt file?

A file €œrobots.txt€ is generated automatically and you can accede to adding €œrobots.txt€ at the end of the name of your website for example

This file contains the following directives:

  • User-Agent: Indicates which web crawlers the configuration that follows applies to; a value of * means it applies to all crawlers.
  • Allow and Disallow: By default, all pages of your website are indexed by search engines; however, it is possible to hide certain pages from indexing.
  • Sitemap: Tells search engines where to find the site map; a sitemap.xml file is generated automatically.
  • Host: This directive appears if a website is connected to one or more domains. It tells web crawlers which domain is the main one for your website.
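The directives above can be tried out with Python's standard urllib.robotparser module. The robots.txt content below is a hypothetical example (the example.com domain and the /admin/ path are placeholders, not taken from the original text), sketching how a crawler would interpret Allow, Disallow, and Sitemap:

```python
from urllib import robotparser

# Hypothetical robots.txt illustrating the directives described above.
# example.com and /admin/ are placeholder values for this sketch.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Allow: /
Sitemap: https://www.example.com/sitemap.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The /admin/ section is hidden from crawlers; the rest of the site is open.
print(parser.can_fetch("*", "https://www.example.com/admin/settings"))  # False
print(parser.can_fetch("*", "https://www.example.com/index.html"))      # True

# The declared sitemap location (Python 3.8+).
print(parser.site_maps())
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism.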