Can I use robots.txt to optimize Googlebot's crawl?
Can I use robots.txt to optimize Googlebot's crawl? For example, can I disallow all but one section of a site (for a week) to ensure it is crawled, and then revert to a 'normal' robots.txt?
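For illustration, the temporary robots.txt the question describes might look like the following, assuming the one section to be crawled lives under a hypothetical /blog/ path (note that `Allow` is an extension honored by Googlebot, not part of the original robots.txt standard):

```text
# Hypothetical temporary robots.txt:
# block everything except the /blog/ section for Googlebot
User-agent: Googlebot
Allow: /blog/
Disallow: /
```

Reverting would mean replacing this file with the site's normal robots.txt after the week is up.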
Blind Five Year Old, SF, CA