One of Google’s great crusades is against duplicate content: no one would expect, therefore, that a product like Blogger (cf. “Google buys Pyra Labs”) would generate a duplicate page for every comment written on a blog (for those interested, this is a partial solution to the problem).
A single site (such as my personal blog) can generate 100 pages of duplicate content on its own … which made me wonder: how much duplicate content does Blogger generate in total?
In short, because of this bug Google’s index (and those of the other search engines) has been flooded with more than 6 million unnecessary pages. A number which, in my opinion, is actually higher, and which is bound to grow for several reasons:
- the number of duplicates created depends on the number of comments that Blogger-based blogs receive … and this is increasing;
- from what I could see on my own blog, not all of the comment pages are indexed, at least not right away. Their number, however, is gradually increasing (even while the number of comments stays the same).
Google should, in my opinion, remedy all this by:
- changing the comment permalinks on its blogging platform, and
- blocking their indexing via robots.txt (which only Google, or whoever manages the platform, can change), or, even better, implementing a 301 redirect so that no links pointing to the duplicates are lost.
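As a rough sketch of the robots.txt option: assuming the duplicate comment pages are distinguished by a `showComment` query parameter in the URL (an assumption about the Blogger URL format, for illustration only), a rule like the following would keep Googlebot away from them:

```
# Hypothetical rule, assuming comment permalinks look like
# /2007/01/post.html?showComment=...
User-agent: *
Disallow: /*?showComment=
```

Note that the `*` wildcard in Disallow paths is an extension supported by Googlebot and some other major crawlers, not part of the original robots.txt convention; and unlike a 301 redirect, this approach only hides the duplicates without passing along any links they may have accumulated.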
… And all this would benefit not only blog owners but also the search engines (and therefore Google itself), which would see their indexes cleaned of all this junk.
Above all, Google should fix the bug as quickly as possible, before word spreads and it makes a bad impression. 😉