Robots Control
This section discusses how to use a robots.txt file correctly and without harming your site. A robots.txt file helps you control the various web robots and crawlers that visit your site. Robots are run by search engines such as Google and Bing, by link research tools that crawl the web for link data, and by spammers who use them for undesirable purposes.
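For illustration, here is a minimal sketch of the kind of robots.txt file this section is about. It lives at the root of your domain (e.g. https://www.example.com/robots.txt); the blocked paths shown are hypothetical placeholders, not recommendations for any particular site.

```
# Rules for all crawlers
User-agent: *
# Hypothetical sections to keep crawlers out of -- replace with your own paths
Disallow: /admin/
Disallow: /search-results/

# Optional: tell crawlers where your XML sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```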
Here are the key topics to keep in mind when thinking about using a robots.txt file on your site(s); example markup for each follows the list:
- robots.txt can be used to keep search engine crawlers out of some parts of your website, while the robots noindex meta tag tells search engines not to include a page in their index.
- The link rel="nofollow" attribute can be used to prevent a link from passing link authority.
- If Google or Yahoo is showing directory data about your site in the search results (pulled from the Open Directory Project or the Yahoo Directory), you can override that by using the noodp and noydir meta tags.
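As a rough sketch of the markup behind these points, the snippets below show a robots noindex meta tag, a noodp/noydir meta tag, and a nofollow link. The URL and anchor text are placeholders; the meta tags belong in the <head> of each page they should apply to.

```
<!-- In the <head>: ask search engines not to index this page -->
<meta name="robots" content="noindex">

<!-- In the <head>: ignore DMOZ / Yahoo Directory data for this page's title and description -->
<meta name="robots" content="noodp, noydir">

<!-- In the page body: a link that should not pass link authority -->
<a href="https://www.example.com/untrusted-page" rel="nofollow">example link</a>
```

Note that a noindex tag can only take effect if crawlers are allowed to fetch the page, so avoid blocking the same URL in robots.txt if you want the noindex to be honored.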