What are the uses of robots.txt file while going for the On-page optimization?

Hi,
I am currently optimizing a website and have heard several times about robots.txt files. What are they and how do I use them?
Thanks
jacobmarsh789
Asked Oct 29, 2012
Robots.txt files are used to prevent or allow search engine robots to access the pages of a website. The following is the basic format of a robots.txt file:
User-agent: *
Disallow: /

In the above, the first line 'User-agent: *' means the rule applies to all robots, while the second line 'Disallow: /' tells those robots that they should not visit any page on the website.
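The same syntax can also block only part of a site. For example, the following (the /private/ directory and page.html file are just placeholder names) blocks all robots from one directory and one file while leaving the rest of the site crawlable:

User-agent: *
Disallow: /private/
Disallow: /page.html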
steveallen
Answered Oct 29, 2012
Robots.txt is a text file placed in the root directory of a website to inform search engine robots which web pages they should crawl or index and which they should avoid. A robots.txt file is not mandatory for all websites, but search engines generally follow its instructions. To know more about robots.txt you can visit: http://www.webconfs.com/what-is-robots-txt-article-12.php
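If you want to test how a crawler would interpret a robots.txt file, Python's standard urllib.robotparser module can do the check. A minimal sketch (the example.com URLs below are just placeholders):

from urllib import robotparser

# Load and parse the site's robots.txt file
rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# True if any robot ("*") may fetch this URL under the parsed rules
print(rp.can_fetch("*", "http://www.example.com/private/page.html"))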
edwinmiller
Answered Mar 30, 2013
It is one of the most important elements of on-page optimization. Robots.txt is a text file used to give instructions to search engine crawlers about the crawling and indexing of a webpage, directory or file on a website. If you want to learn more about robots.txt, go to: www.seomoz.org/learn-seo/robotstxt
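Note that rules can also target a single crawler by naming it in the User-agent line. For example (the /drafts/ path is just a placeholder), this blocks only Googlebot from one directory while letting every other robot crawl the whole site:

User-agent: Googlebot
Disallow: /drafts/

User-agent: *
Disallow: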
beatrixtuffy
Answered Apr 17, 2013
Edited Apr 17, 2013
