What's the best way to eliminate duplicate page content caused by blog archives?
-
I (obviously) can't delete the archived pages regardless of how much traffic they do/don't receive.
Would you recommend a meta robots tag or a robots.txt file? I'm not sure I'll have access to the root directory, so I could be stuck using a meta robots tag, correct?
Any other suggestions to alleviate this pesky duplicate page content issue?
-
I think I understand better now.
Use a noindex,follow meta robots tag on the pages you don't want included in the search index.
If you are using WordPress, then you should check out http://yoast.com/wordpress/seo/
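In case it helps, the tag would look something like this — a minimal sketch, placed in the head of each archive template (exactly which template file depends on your blog platform):

```html
<!-- In the <head> of each archive page you want kept out of the index: -->
<meta name="robots" content="noindex, follow">
```

This tells search engines not to index the archive page itself, but still to follow the links on it, so your individual posts keep getting crawled and keep receiving link value. It also sidesteps the root-directory problem, since it lives in the page markup rather than in robots.txt.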
-
The hypothetical blog posting I want to have indexed is...
www.example.com/blog/2011/10/19
The first sentence of this blog posting is: "Jim and Janice jumped joyfully to Jackson."
I go out to google and search "Jim and Janice jumped joyfully to Jackson." There are 7 results. The first result is the blog posting I want indexed. The 2nd - 7th results are archive pages from my blog. Let's call one of those archive pages...
So, residing on this archive page are all of my postings from October 2011 including Jim and Janice's. Thus, there appears to be a ton of duplicate content on my site.
If I implement a canonical tag on the archive page, won't search engines treat this archive page as a pointer to the blog posting I want indexed?
If so, that won't work. I need the blog posting and all the archive pages to remain as is but I don't want the archive pages to be indexed or show up as duplicate content.
Thoughts?
-
I agree with James; it's best to implement canonical tags.
-
The best way would be to implement canonical tags on these pages.
Example from Google:
http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
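For anyone skimming this later, the markup that Google post describes is a single link element in the head of the duplicate page; a sketch using the hypothetical URL from the question above:

```html
<!-- In the <head> of a duplicate (e.g. archive) page,
     pointing at the version you want to rank: -->
<link rel="canonical" href="http://www.example.com/blog/2011/10/19">
```

Note that this consolidates the duplicate into the preferred URL rather than simply hiding it, which is why the asker above preferred noindex,follow for archive pages.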
Related Questions
-
Car Dealership website - Duplicate Page Content Issues
Hi, I am currently working on a large car dealership website. I have just had a Moz crawl through and it's flagging a lot of duplicate page content issues, mostly for used car pages. How can I get round this when the site stocks many cars of the same make, model, colour, age, mileage, etc.? The only unique thing about them is the reg plate. How do I get past this duplicate issue if all the info is essentially the same? Has anyone experienced this issue when working on a car dealership website? Thank you.
Technical SEO | karl621
-
Getting a high-priority issue for our xxx.com and xxx.com/home as duplicate pages and duplicate page titles; can't seem to find anything that needs to be corrected. What might I be missing?
I am getting a high-priority issue reporting both our xxx.com and xxx.com/home as duplicate pages and duplicate page titles in the crawl results, but I can't seem to find anything that needs to be corrected. What am I missing? Has anyone else had a similar issue, and how was it corrected?
Technical SEO | tgwebmaster0
-
Are image pages considered 'thin' content pages?
I am currently doing a site audit. The total number of pages on the website is around 400; 187 of them are image pages and come up with a zero word count in the Screaming Frog report. I need to know whether they will be considered 'thin' content by search engines. Should I include them as an issue? An answer would be most appreciated.
Technical SEO | MTalhaImtiaz0
-
Is a newly created page's PageRank 1?
Hey, I just want to know: if I create a web page, would the PageRank of that page be 1?
Technical SEO | atakala
-
Issue: Duplicate Page Content
Hello, Following the setting up of a new campaign, SEOmoz Pro says I have a duplicate page content issue. It says the following are duplicates: http://www.mysite.com/ and http://www.mysite.com/index.htm. This is obviously true, but is it a problem? Do I need to do anything to avoid a Google penalty? The site in question is a static HTML site and the real page only exists at http://www.mysite.com/index.htm, but if you type in just the domain name then that brings up the same page. Please let me know what, if anything, I need to do. This site, by the way, was hit by a Panda 3.4 penalty a few months ago. Thanks, Colin
Technical SEO | Colski0
-
Mitigating duplicate page content on dynamic sites such as social networks and blogs.
Hello, I recently did an SEOmoz crawl for a client site. As is typical, the most common errors were duplicate page titles and duplicate content. The client site is a custom social network for researchers. Most of the pages showing as duplicates are simple variations of each user's profile, such as comment sections, friends pages, and events. So my question is: how can we limit duplicate content errors for a complex site like this? I already know about the rel=canonical tag and the rel=next tag, but I'm not sure either of these will do the job. Also, I don't want to lose potential links/link juice for good pages. Are there ways of applying the "noindex" tag in batches? For instance: noindex all URLs containing a given character? Or do most CMSs allow this to be done systematically? Anyone with experience doing SEO for a custom social network or forum, please advise. Thanks!!!
Technical SEO | BPIAnalytics0
-
Best Way to Handle - International Content - Different Language
Our site is currently focused on the USA and the entire site is in the English language. We have considered broadening our scope to include content from foreign countries, e.g. Brazil. What is the best way to approach this? Can we use our existing domain and just have a specific section of the site dedicated to a particular country, with content translated into that country's predominant language? Or could this create SEO issues, having a domain with both English and some other language? Would it be better to have this on a totally different domain with a country extension? This is totally foreign territory for me (bad pun intended). Any advice or help would be appreciated. Thanks. Matt
Technical SEO | MWM37720
-
Duplicate content
I have two sentences that I want to optimize two different pages for. Sentence number one is "travel to ibiza by boat"; sentence number two is "travel to ibiza by ferry". My question is: can I have the same content on both pages except for the keywords, or will Google treat that as duplicate content and punish me? And if yes, where is the limit for duplicate content?
Technical SEO | stlastla0