How do you find the robots.txt file?
The robots.txt file is generated automatically, and you can view it by appending robots.txt to your website's domain name, for example https://advancedigg.com/robots.txt.
This file contains the following directives:
- User-Agent: Specifies which web crawlers the directives that follow apply to; a value of * means they apply to all crawlers.
- Allow and Disallow: By default, all pages of your website can be indexed by search engines; however, you can use Disallow to hide specific pages from indexing.
- Sitemap: Tells search engines where to find your site map; a sitemap.xml file is generated automatically.
- Host: This directive appears if your website is connected to one or more domains. It tells web crawlers which domain is the main one for your website.
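To see how the Allow and Disallow directives play out in practice, here is a minimal sketch using Python's standard `urllib.robotparser` module. The robots.txt content, the example.com URLs, and the /admin/ path are all hypothetical, chosen only to illustrate the rules described above:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt combining the directives described above.
# The sitemap URL and the /admin/ path are illustrative, not from a real site.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Crawlers may fetch public pages, but nothing under /admin/.
print(parser.can_fetch("*", "https://example.com/index.html"))   # True
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
```

This is the same check a well-behaved crawler performs before requesting a page: it matches the URL path against the Allow/Disallow rules for its User-Agent and skips anything disallowed.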