How do I create a strategy to get rid of dupe content pages but still keep the SEO juice?
-
We have about 30,000 pages that are variations of "<product-type>-prices/<type-of-thing>/<city>-<state>".
These pages are bringing us lots of free conversions because when somebody searches for this exact phrase for their city/state, they are pretty low-funnel.
The problem that we are running into is that the pages are showing up as dupe content.
One solution we were discussing is to 301-redirect or canonicalize all the city-state pages back to just the "<type-of-thing>" level, and then create really solid unique content for the few hundred pages we would have at that point.
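To make that concrete, the canonical version might look something like this. Just a sketch: the Flask-style route, the example.com domain, and the template name are all made up.

```python
# Rough sketch only: keep each city/state URL live for visitors, but point
# rel=canonical at the <type-of-thing> page so the 30,000 variants
# consolidate their duplicate-content signals there.
from flask import Flask, render_template

app = Flask(__name__)

@app.route("/<product_type>-prices/<type_of_thing>/<city>-<state>")
def city_state_page(product_type, type_of_thing, city, state):
    # Unlike a 301, a canonical keeps this URL resolving for visitors
    # while telling search engines which page should collect the authority.
    canonical_url = f"https://www.example.com/{product_type}-prices/{type_of_thing}/"
    return render_template(
        "type_of_thing.html",  # made-up template; it would emit
                               # <link rel="canonical" href="{{ canonical_url }}">
        canonical_url=canonical_url,
        city=city,
        state=state,
    )
```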
My concern is this: I still want to rank for the city-state terms, because as I look through our best-converting search terms, they nearly always include the city and state, so the search is some variation of "<product-type> <type-of-thing> <city> <state>".
One thing we thought about doing is dynamically changing the meta-data & headers to add the city-state info there.
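Something like this sketch is what we had in mind; the field names and the copy are invented for illustration.

```python
# Sketch of dynamically generated meta data for a city/state page.
def build_meta(product_type: str, type_of_thing: str, city: str, state: str) -> dict:
    place = f"{city.title()}, {state.upper()}"
    return {
        "title": f"{type_of_thing.title()} {product_type.title()} Prices in {place}",
        "meta_description": (
            f"Compare {type_of_thing} {product_type} prices in {place}, "
            f"with pricing that reflects your region."
        ),
        "h1": f"{type_of_thing.title()} Prices in {place}",
    }

# build_meta("widget", "deluxe", "austin", "tx")
# -> {"title": "Deluxe Widget Prices in Austin, TX", ...}
```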
Are there other potential solutions to this?
-
Thanks for getting back to me!
Even if you do the dynamic meta data, it does not sound like much duplication can be avoided between the 30,000 city/state pages.
That's the reason we are considering this new strategy: we would have essentially 300 unique pages but dynamically generate the city/state content.
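Roughly, the idea is one hand-written article per "<type-of-thing>" with local data spliced in. A sketch, with made-up pricing data and a hypothetical {{LOCAL_PRICING}} placeholder token:

```python
# Sketch of "300 unique articles + dynamic city/state": the shared body is
# unique per <type-of-thing>, and region-specific pricing is spliced in so
# each city/state page differs by more than the place name.
REGIONAL_PRICES = {("austin", "tx"): 129.00, ("boise", "id"): 119.00}  # made-up data

def render_city_page(base_article: str, city: str, state: str) -> str:
    price = REGIONAL_PRICES.get((city, state))
    local_block = (
        f"<p>Average price in {city.title()}, {state.upper()}: ${price:.2f}</p>"
        if price is not None
        else "<p>Contact us for pricing in your area.</p>"
    )
    return base_article.replace("{{LOCAL_PRICING}}", local_block)
```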
Is the content on these pages unique? I mean the set of products returned: are they unique collections, or what?
The content is unique, to a point. The pricing varies by region, but the actual products are the same.
You said these pages are showing up as duplicate content? Where are they showing as duplicate content?
They are showing up as duplicate content in the SEOmoz report. Not sure if that's what you mean?
Do your stronger pages (homepage, category pages) rank for your head keywords? Do they rank very well?
Not sure what you're asking exactly. How do I find out what my head keywords are?
It really depends on your overall domain authority and what else is going on with your website.
Our PR is about a 5 right now.