How to Avoid Duplicate Page Content errors when using WordPress Categories & Tags?
-
I get a lot of duplicate page content errors in my crawl diagnostics reports from categories and tags on my WordPress sites.
The post itself is one URL, and then the content is 'duplicated' on the category or tag archive the post is assigned to. Should I exclude the tags and categories from my sitemap, or are these issues not that important?
Thanks for your help
Stacey
-
Hi Stacey, I run a lot of WordPress sites and there is no single solution that fits them all, but in general I use categories freely (it doesn't matter how many a post is assigned to, since the post itself still only exists once).
I use tags as well, but with the Yoast plugin I set them to noindex, and (because a sitemap entry wouldn't be useful for noindexed pages) I leave them out of the sitemap.
Just kicking them out of the sitemap wouldn't have much effect on its own, because the posts still link to the tag pages. It is also possible to keep both indexed and change nothing at all; it depends on how many posts you have per tag, and whether a tag archive displays 100% of each post, only a few sentences, or a completely different description. That's why there is no single solution for every site, but the noindex approach is a good one in many (I would guess most) cases.
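For reference, a minimal sketch of roughly what the head of a tag archive page should contain once it is set to noindex; the exact markup depends on the plugin and its version:

```html
<!-- Illustrative only: a noindexed tag archive's <head> should include
     a robots meta tag along these lines. -->
<meta name="robots" content="noindex, follow" />
```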
Hope that helps
Related Questions
-
Duplicate content on SearchResults.asp
Hi guys, I'm currently working through the reported crawl errors in Moz Analytics, but am unsure what to do about some of them. For example, Searchresults.asp?search=frankie+says+relax is showing as having duplicate page content and a duplicate page title compared to SearchResults.asp?searching=Y&sort=13&search=Frankie+Says+Relax. All sorts of searchresults.asp pages are being flagged. Is this something I can safely ignore, or should I endeavour to rectify it? I'm also getting errors reported on shoppingcart.asp pages as well as pindex.asp (the product index). I'm thinking I should maybe add a disallow rule for shoppingcart.asp to my robots.txt file, but am unsure whether I should also be blocking robots from the search results pages and the product index (which is essentially a secondary sitemap). Any advice would be greatly appreciated. Thanks, Dave 🙂
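A minimal sketch of the robots.txt rules under consideration, with paths taken from the question; robots.txt matching is case-sensitive, so they would need to match the live URLs exactly:

```
# Illustrative sketch only - adjust paths and casing to match the site.
User-agent: *
Disallow: /shoppingcart.asp
Disallow: /SearchResults.asp
```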
-
How to remove 404 pages in WordPress
I used the crawl tool and it returned a 404 error for several pages that I no longer have published in WordPress. They must still be on the server somewhere? Do you know how to remove them? I don't think they exist as files on the server the way an HTML file would, since WordPress uses a database. I figure that getting rid of the 404 errors will improve SEO, is this correct? Thanks, David
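One common approach, sketched below for an Apache .htaccess file with hypothetical paths, is to 301 redirect each removed URL to the closest live page so the old links stop returning 404s:

```
# Illustrative only: map each removed post to its nearest replacement.
Redirect 301 /old-post/ https://www.example.com/new-post/
Redirect 301 /another-old-post/ https://www.example.com/
```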
-
Will pages marked noindex with a robots meta tag still be crawled and flagged as duplicate page content in SEOmoz?
When we mark a page as noindex with a robots meta tag like <meta name="robots" content="noindex" />, will it still be crawled and marked as duplicate page content? It is already duplicate page content within the site (i.e. two links pointing to the same page), so we marked both as not to be indexed by search engines. But after we made this change, the crawl reports show no difference; the noindexed pages are still counted among the duplicates. Please help us solve this problem.
-
I want to create a report of only the duplicate content pages as a CSV file so I can create a script to canonicalize them.
I want to export a report of only the duplicate content pages as a CSV file so I can write a script to canonicalize them. I'd like to get something like: http://example.com/page1, http://example.com/page2, http://example.com/page3, http://example.com/page4. Right now I have to open each one under "Issue: Duplicate Page Content", and this takes a lot of time. The same goes for duplicate page titles.
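As a starting point, a minimal Python sketch that pulls the flagged URLs out of such a CSV export; the filename and the "URL" column header are assumptions and would need to match the actual export:

```python
import csv

# Read a duplicate-content CSV export and collect the flagged URLs.
# Assumes a column named "URL"; adjust to the real column headers.
with open("duplicate_page_content.csv", newline="") as f:
    reader = csv.DictReader(f)
    urls = [row["URL"] for row in reader if row.get("URL")]

# A canonicalization script would map each URL to its preferred version;
# here we just print the comma-separated list the question asks for.
print(", ".join(urls))
```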
-
How long will it take for Page Rank (or Page Authority) to flow via a 301 redirect?
I've recently redeveloped a static site using WordPress and have created 301 redirects from the original URLs to the new URLs. I know I won't get all of the value passed via the 301, but I'm hoping some will be. Any idea how long this may take? It's been nearly a month since the changeover, so I'm wondering whether it will be weeks, months or more.
-
Erroneous "Weekly Keyword Ranking & On-page Optimization Report" For Campaign
Hi, I just received an email alert from SEOmoz telling me my "Weekly Keyword Ranking & On-page Optimization Report" for the period 11/06/12 - 11/13/12 is ready. It is just a copy of the previous report, though; all rankings and ranking changes are the same. What is up with that? Best regards, Martin
-
Only one page has been crawled
I have been running a campaign for three weeks now, and the first two crawls were fine, but the last one shows only one page crawled. The subdomain I am tracking is www.cubaenmiami.com. I have everything set up correctly on my site. Regards, Alex
-
Tool for scanning the content of the canonical tag
Hey all, a question for you. What is your favorite tool or method for scanning a website for specific tags, specifically (as my situation dictates right now) canonical tags? I am looking for a tool that is flexible, hopefully free, and highly customizable (for instance, letting you specify the tag to look for). I like the concept of using Google Docs with the importXML feature, but since you can only use 50 of those commands at a time it is very limiting (http://www.distilled.co.uk/blog/seo/how-to-build-agile-seo-tools-using-google-docs/). I do have a campaign set up using the tools, which is great, but I need something that returns results faster and can get data from more than 10,000 links. Our CMS unfortunately puts out some odd canonical tags depending on how a page is rendered, and I am trying to catch them quickly before they get indexed and cause problems. Eventually I would also like to be able to scan for other specific tags, hence the customizability concern. If we have to write a VB script to get the data into Excel, I suppose we can do that. Cheers, Josh
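For a quick, customizable alternative, a minimal Python sketch using only the standard library; the URL list is hypothetical and would normally come from a sitemap or crawl export:

```python
from html.parser import HTMLParser
import urllib.request

class CanonicalFinder(HTMLParser):
    """Collects href values from <link rel="canonical"> tags."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonicals.append(attrs.get("href"))

# Hypothetical pages to check; replace with your own URL list.
urls = [
    "https://www.example.com/",
    "https://www.example.com/some-page/",
]

for url in urls:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    finder = CanonicalFinder()
    finder.feed(html)
    print(url, "->", finder.canonicals or "no canonical tag found")
```

Swapping a different tag name or attribute check into handle_starttag would extend the same idea to other tags.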