Should I block duplicate pages using robots.txt?

Published March 10, 2010, 19:25
Halfdeck from Davis, CA asks: "If Google crawls 1,000 pages/day, Googlebot crawling many dupe content pages may slow down indexing of a large site. In that scenario, do you recommend blocking dupes using robots.txt or is using META ROBOTS NOINDEX,NOFOLLOW a better alternative?"

Short answer: No, don't block them using robots.txt. Learn more about duplicate content here: google.com/support/webmasters/...
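For context, here is a minimal sketch of the two mechanisms the question contrasts. The path and markup are hypothetical examples, not recommendations from the video: a robots.txt Disallow rule stops Googlebot from fetching the matched URLs at all, while a meta robots tag sits in the page itself, so the page can still be crawled but is kept out of the index.

```
# robots.txt (hypothetical path) — blocks crawling of matching URLs,
# so Googlebot never fetches them:
User-agent: *
Disallow: /duplicate-page/

<!-- Meta robots tag, placed in the <head> of the duplicate page itself;
     the page can still be crawled, but is excluded from the index: -->
<meta name="robots" content="noindex,nofollow">
```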