That said, the one drawback of using robots.txt to eliminate duplicate content is that other sites may already be linking to the excluded page. Because crawlers are blocked from that URL, those inbound links stop contributing to your website's search engine ranking.
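As a sketch of the setup described above, a robots.txt rule that blocks a duplicate page might look like the following (the path and user-agent are hypothetical examples, not taken from the original text):

```
# robots.txt at the site root
# Blocks crawlers from the duplicate printer-friendly version of a page.
# Note: any links pointing at the blocked URL can no longer pass ranking
# signals, since crawlers are not allowed to fetch it.
User-agent: *
Disallow: /print/article-copy.html
```

An alternative often preferred for duplicate content is a `rel="canonical"` link element on the duplicate page, which consolidates link signals onto the preferred URL instead of blocking crawling outright.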