A robots.txt is a simple text file placed in the root directory of the website's server. It contains instructions telling the various search engine spiders not to crawl or index certain section(s) of the website. It can be used to,
- prevent indexing of the entire website,
- prevent indexing of certain areas of the website, or
- give individual indexing instructions to certain search engines.
Any text editor can be used to create the file.
Syntax
User-agent: [spider name or bot name]
Disallow: [directory or file name]
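As a minimal illustrative sketch (the bot name "BadBot" and the directory "/private/" are only example values), the following robots.txt blocks all spiders from one directory and blocks one particular bot from the whole site:

User-agent: *
Disallow: /private/

User-agent: BadBot
Disallow: /

Each record begins with one or more User-agent lines followed by its Disallow lines, with a blank line separating records; "User-agent: *" matches every spider, and "Disallow: /" blocks the entire site for the named bot.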
Comments