Thanks to the robots.txt file, we can guide search engine crawlers through a website and prevent them from indexing certain folders, database-generated URLs, or any other file we do not want to appear in search results.
To create a basic robots.txt, we simply create a document with "Notepad" or a similar plain-text editor, named "robots.txt". In this document, the heading declares which crawlers the rules apply to, for example:

User-agent: *

and next comes Disallow:

After each Disallow: we list whatever we do not want indexed.
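Putting the steps above together, a minimal robots.txt might look like this (the folder and file names here are placeholders; replace them with the paths you actually want to keep out of the index):

```
# Rules apply to all crawlers
User-agent: *

# Block a private folder (placeholder path)
Disallow: /admin/

# Block database-generated URLs (placeholder path)
Disallow: /search-results/

# Block a single file (placeholder path)
Disallow: /draft-page.html
```

An empty Disallow: line, by contrast, means nothing is blocked and the whole site may be crawled.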
I recommend the following tool, which is very easy to use and generates your robots.txt automatically: http://www.mcanerin.com/EN/search-engine/robots-txt.asp
If you already have a robots.txt and want to check it for errors, you can use the following tool: http://tool.motoricerca.info/robots-checker.phtml
Freelance web programmer and SEO (web positioning) analyst, I offer white-label web design to companies in the sector. Sergi Pérez