Should I block duplicate pages using robots.txt?
Channel: Google Search Central
Subscribers: 751,000
Video Link: https://www.youtube.com/watch?v=CJMFYpYQZ0c
Halfdeck from Davis, CA asks: "If Google crawls 1,000 pages/day, Googlebot crawling many dupe content pages may slow down indexing of a large site. In that scenario, do you recommend blocking dupes using robots.txt or is using META ROBOTS NOINDEX,NOFOLLOW a better alternative?"
Short answer: No, don't block them using robots.txt. Learn more about duplicate content here: http://www.google.com/support/webmasters/bin/answer.py?answer=66359
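For illustration only (not shown in the video): the difference between the two approaches is that a meta robots noindex tag still lets Googlebot crawl the duplicate page and see the directive, while a robots.txt rule blocks crawling outright, so Google never fetches the page at all. A minimal sketch of each, using a hypothetical /dupes/ path as a placeholder:

    <!-- In the <head> of a duplicate page: allow crawling, but keep it out of the index -->
    <meta name="robots" content="noindex,nofollow">

    # In robots.txt: this blocks crawling of the hypothetical /dupes/ directory entirely
    # (the approach the video recommends against)
    User-agent: Googlebot
    Disallow: /dupes/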
Tags: google, seo, robots.txt, duplicate content