Does schema.org assist with duplicate content concerns?
-
The issue of duplicate content has been well documented, and there are lots of articles suggesting that you noindex archive pages on WordPress-powered sites.
Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply, is: is noindexing archive (category/tag) pages still relevant when considering duplicate content?
These pages are in essence a list of articles, each of which can be marked up as an Article or BlogPosting, with the URL of the main article and all the other cool stuff the schema gives us.
Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, thereby removing duplicate content concerns.
Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances - I'm just interested in whether or not the search engines can handle this appropriately.
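For illustration, here's a rough sketch of the kind of listing markup I have in mind (hypothetical markup with example.com URLs, not taken from any particular theme), where each excerpt on the archive page is marked up with schema.org microdata and points back to the full post:
```html
<!-- Hypothetical category/archive page: each excerpt is a BlogPosting
     whose url property points at the full article. -->
<div itemscope itemtype="http://schema.org/Blog">
  <article itemprop="blogPost" itemscope itemtype="http://schema.org/BlogPosting">
    <h2 itemprop="headline"><a itemprop="url" href="http://example.com/first-post/">First post</a></h2>
    <p itemprop="description">A short excerpt of the first post...</p>
  </article>
  <article itemprop="blogPost" itemscope itemtype="http://schema.org/BlogPosting">
    <h2 itemprop="headline"><a itemprop="url" href="http://example.com/second-post/">Second post</a></h2>
    <p itemprop="description">A short excerpt of the second post...</p>
  </article>
</div>
```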
-
Thanks Takeshi - useful comments.
-
That's an interesting question. Semantic markup can be used to help Google understand what different pages are (e.g. tag pages), but it doesn't really solve the problems caused by duplicate content, namely:
- Thin Content - Tag pages and other similar pages are thin content, with not much utility for the user, and are probably not going to rank well in Google anyway. Even if they do rank, they won't convert as well as your main pages.
- Keyword Cannibalization - Even if your tag pages & duplicate content rank, they could potentially outrank your main content, leading to lower conversions.
- Panda - Too many thin content pages can lower Google's opinion of your site as a whole, leading to a Panda penalty.
Semantic markup doesn't really help with any of the problems above. It can help Google understand what a tag page is, but that doesn't mean you want that page indexed.
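If you do decide to keep tag/category pages out of the index, the usual route is a robots meta tag on just those archive templates (a minimal sketch, independent of whichever SEO plugin, if any, you use):
```html
<!-- Added to the <head> of category/tag archive templates only.
     "noindex" keeps the archive out of the index; "follow" still lets
     crawlers follow its links through to the individual articles. -->
<meta name="robots" content="noindex, follow">
```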
Related Questions
-
Schema.org Article, itemprop keyword, what is it?
I've wanted to know the answer to this for a couple of years now and haven't found anyone ever talking about it. So here goes... For schema.org markup on articles (http://schema.org/Article) there's an itemprop for keywords: http://schema.org/keywords
Canonical URL: http://schema.org/keywords
"Keywords or tags used to describe this content. Multiple entries in a keywords list are typically delimited by commas." What does that do? If I use that markup with an article I publish on my site, will those words be given that keywords property value? Will that affect SEO value? Do they replace what meta keywords used to be? Or are they just like what meta keywords are these days, with no real value?
On-Page Optimization | May 3, 2017, 11:37 PM | SteveRDM
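(For context, a hypothetical snippet of what that keywords itemprop looks like on an Article; whether search engines actually use it is exactly what's being asked.)
```html
<!-- Hypothetical example only: the keywords itemprop on an Article,
     with comma-delimited values per the schema.org definition. -->
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Example headline</h1>
  <meta itemprop="keywords" content="seo, schema.org, structured data">
  <p itemprop="articleBody">Article text goes here...</p>
</article>
```
-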
Duplicate Content Re: Product listing body copy on Website, Amazon & eBay - issues?
Hi, Is it OK to have identical product body copy on marketplace/platform listings, the same as the website's product listings? In this case the products are the website's own-brand products (all pages canonicalised), so I take it this shouldn't cause any issues, or are you supposed to differentiate the product body copy on marketplace listings? I'm asking for SEO reasons. All Best, Dan
On-Page Optimization | Nov 17, 2014, 7:56 AM | Dan-Lawrence
-
Duplicate Content on Event Pages
My client has a pretty popular service of event listings and, in hope of gathering more events, they opened up the platform to allow users to add events. This works really well for them and they are able to garner a lot more events this way. The major problem I'm finding is that many event coordinators and site owners will take the copy from their website and copy and paste it, duplicating a lot of the content. We have editor picks that contain a lot of unique content but the duplicate content scares me. It hasn't hurt our page ranking (we have a page ranking of 7) but I'm wondering if this is something that we should address. We don't have the manpower to eliminate all the duplication but if we cut down the duplication would we experience a significant advantage over people posting the same event?
On-Page Optimization | Oct 18, 2013, 7:01 PM | mattdinbrooklyn
-
Solve duplicate content issues by using robots.txt
Hi, I have a primary website and, besides that, some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are many duplicate-content URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites and fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | Jul 16, 2013, 11:02 PM | JohnHuynh
-
Page content length...does it matter?
As I begin developing my website's content, does it matter how long or short the actual text on the page is? I heard someone say before "a minimum of 250 words", but is that true? If so, what is the maximum length I should use?
On-Page Optimization | Nov 21, 2012, 5:17 PM | wlw2009
-
Schema.org for news websites?
So as of late I have been on something of a mission to mark up my news website with as much accurate and detailed Schema and Open Graph data as possible, in order to not only allow the search engines to understand my content properly, but also to ensure everything appears in the most ideal fashion when linked to from Facebook, Google+, etc. Here is an example of a typical article page: http://www.nerdscoop.net/technology/video-games-459 As you'll see I currently have news posts marked up as article because that is essentially exactly what they are, but is there a better way to emphasise that they are news rather than just generic articles? My second question is regarding the category pages and the home page. How would be best to mark these up? With OG the task is fairly simple, because I can specify the homepage as being a website, but not so with Schema from what I can see. Either way, this is an interesting subject to me and I look forward to any discussion as a result. Thanks for looking.
On-Page Optimization | May 13, 2012, 10:37 AM | HalogenDigital
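(A minimal sketch of the news-specific option being asked about: schema.org defines a NewsArticle subtype of Article. The markup below is hypothetical rather than taken from the site in question.)
```html
<!-- Hypothetical: a news post marked up as NewsArticle instead of the generic Article type. -->
<article itemscope itemtype="http://schema.org/NewsArticle">
  <h1 itemprop="headline">Example news headline</h1>
  <time itemprop="datePublished" datetime="2012-05-13">13 May 2012</time>
  <div itemprop="articleBody">Story text goes here...</div>
</article>
```
-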
Best practice for franchise sites with duplicated content
I know that duplicated content is a touchy subject, but I work with multiple franchise groups and each franchisee wants their own site; however, almost all of the sites use the same content. I want to make sure that Google sees each one of these sites as unique and does not penalize them for the following issues:
- All sites are hosted on the same server, and therefore on the same IP address
- All sites use generally the same content across their product pages (which are very, very important pages) *templated content approved by corporate
- Almost all sites have the same design (a few of the groups we work with have multiple design options)
Any suggestions would be greatly appreciated. Thanks again, Aaron
On-Page Optimization | May 27, 2011, 9:33 AM | Shipyard_Agency
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
- Course (starter, main, salad, etc)
- Cooking Method (fry, bake, boil, steam, etc)
- Preparation Time (Under 30 min, 30min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
- find-a-recipe.php?course=starter
- find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
- find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site; however, it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
- Setting a canonical tag
- Adding these URL variables to Google Webmasters to tell Google to ignore them
- Changing the title tag in the head dynamically based on what URL variables are present
However, I am not sure which of these would be best. As far as I can tell the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmasters won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind regards
On-Page Optimization | Jun 2, 2013, 5:08 PM | smaavie
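(For reference, a sketch of the canonical-tag option discussed above, using a hypothetical example.com URL; whether the unfiltered search page is the right canonical target depends on how different the filtered result sets really are.)
```html
<!-- Hypothetical: on a filtered/paginated results URL such as
     find-a-recipe.php?course=salad&start=30, point the canonical at one
     preferred URL so the parameter variations consolidate. -->
<link rel="canonical" href="http://example.com/find-a-recipe.php">
```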