50+ duplicate content pages - Do we remove them all or 301?
-
We are working on a site that has 50+ pages that all have duplicate content (1 for each state, pretty much).
Should we 301 all 50 of the URLs to one URL or should we just completely get rid of all the pages?
Are there any steps to take when completely removing pages? (submit an updated sitemap to Google Webmaster Tools, etc.)
thanks!
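For reference, if the decision were to consolidate, a blanket 301 of all the state URLs to a single page could be sketched in .htaccess roughly like this (the `/states/` path and the target URL are hypothetical placeholders, not the site's actual structure):

```apache
# Hypothetical example: permanently redirect every page under /states/
# to a single consolidated page. Adjust both paths to the real URL structure.
RewriteEngine On
RewriteRule ^states/.*$ /our-services/ [R=301,L]
```

If the pages are instead removed outright, serving a 410 Gone (or letting them 404) and resubmitting an updated sitemap helps Google drop them from the index faster.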
-
I would keep them on the same URLs and explain the duplicate content risk to the client. Then the client can decide to upgrade them himself, have you source new content, or let them run "as is".
I know of a lot of sites that have pages like this that are working well and make nice money.
-
They have gotten some OK traffic. They're being clicked, but the content is exactly the same on each page.
There are no links coming into any of them, however. The URLs themselves across the site are pretty weird, so I will be doing a complete restructuring of those as well.
Based on the traffic they've gotten, I agree it would be good to write unique content for each state, but this obviously couldn't be completed for quite some time.
With all that in mind, what should we do?
-
We are working on a site that has 50+ pages that all have duplicate content (1 for each state, pretty much).
There are a lot of sites that make a lot of money by having one page for each state (or county or country or city or whatever). Although it is labor-intensive to develop content for fifty different pages, it can be done elegantly.
There are also a lot of successful sites that get away with fifty pages of cookie cutter content (Replace Alabama with Alaska, etc. on every page).
I would look at what kind of traffic the site is currently getting on those pages, how many sales they are generating, what visitors are doing on them, and whether they have any links.
These pages could be highly successful, and if that is the case, I would work to improve them instead of removing them... and if they have any inbound links at present, I would improve them on the same URLs rather than moving them to new URLs.
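As an interim measure while unique content is being written, one common option (a suggestion here, not something the answer above prescribes) is to point the duplicate state pages at a single canonical URL, which reduces the duplicate content risk without removing or redirecting anything. The URL below is a placeholder:

```html
<!-- Hypothetical example: placed in the <head> of each duplicate state page.
     Point href at whichever version you want search engines to index. -->
<link rel="canonical" href="http://www.example.com/states/alabama/" />
```

Unlike a 301, visitors can still reach every state URL; the tag is only a hint to search engines about which version to treat as the original. It can later be removed page by page as unique content rolls out.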