Local Search | Website Issue with Duplicate Content (97 pages)
-
Hi SEOmoz community. I have an unusual situation: I'm evaluating a website that is trying to optimize for local search by targeting the 97 surrounding towns in its geographical area. What's unusual is that the site is ranking on the 1st and 2nd pages of the SERPs for its targeted keywords even though it has duplicate content across 97 pages. I ran the site's URL through SEOmoz's Crawl Test Tool, and it confirmed duplicate content on 97 pages as well as too many links (97) per page.
Summary: The website has 97 near-duplicate pages, one for each town. Each page lists all 97 surrounding towns, and each town name links to another duplicate page.
Question: I expect that eventually these pages will stop being indexed by the search engines, and I'm not sure of the best way to resolve the problem. Any advice?
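As a rough way to confirm what the Crawl Test Tool is flagging, you can compare the extracted body text of two town pages yourself. A minimal sketch; in practice you would fetch and extract the real pages, and the strings below are hypothetical stand-ins:

```python
# Rough sketch: estimate how similar two town pages' body text is.
# The strings below are made-up stand-ins for extracted body text.
from difflib import SequenceMatcher

def similarity(text_a: str, text_b: str) -> float:
    """Return a 0..1 similarity ratio between two blocks of text."""
    return SequenceMatcher(None, text_a, text_b).ratio()

page_springfield = "We provide plumbing services in Springfield and 96 nearby towns."
page_shelbyville = "We provide plumbing services in Shelbyville and 96 nearby towns."

ratio = similarity(page_springfield, page_shelbyville)
print(f"Similarity: {ratio:.0%}")
```

Google's duplicate detection is more sophisticated than a character-level ratio, but 97 pages this similar to one another is a fair proxy for what the crawler is reporting.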
-
Thank you Miriam.
-
Thanks Miriam!
-
Hi Todd, I'm endorsing Kevin's response as a best answer on this, but also want to add that it will be easier on the client if he makes a plan now to begin to improve the content of key pages rather than scramble to do so after rankings suddenly fall off. Local rankings are in a constant state of flux...drops can happen so swiftly. An ounce of prevention is worth a pound of cure. I would identify the 10 most important cities and write unique content for them, then move on to the 10 next-most important and so on. Do it in a way the client can afford, at a velocity you can manage.
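Miriam's batch plan is straightforward to operationalize: rank the towns by importance and work through them in affordable chunks. A minimal sketch, with made-up town names and traffic figures:

```python
# Sketch of the batch plan: work through the towns in groups of 10,
# most important first. Town names and traffic figures are hypothetical.
def priority_batches(towns_by_traffic: dict[str, int], batch_size: int = 10) -> list[list[str]]:
    """Sort towns by traffic (descending) and split into writing batches."""
    ranked = sorted(towns_by_traffic, key=towns_by_traffic.get, reverse=True)
    return [ranked[i:i + batch_size] for i in range(0, len(ranked), batch_size)]

towns = {"Springfield": 420, "Shelbyville": 310, "Ogdenville": 95, "North Haverbrook": 40}
batches = priority_batches(towns, batch_size=2)
print(batches)  # highest-traffic towns land in the first batch
```

With 97 towns and a batch size of 10, that yields 10 batches, each one a self-contained writing assignment the client can budget for.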
-
Good morning Kevin - most of the individual pages receive little traffic. Thank you for your advice and feedback.
-
Hi Daniel - thank you for the response and advice!
-
Hi Todd,
How much traffic is each of those pages getting? Chances are, if you look at them, over 50% are getting little if any traffic. As you know, ranking on the first page in local search really doesn't mean much; you need to be in the top 3 (or 3-5 if Maps is displaying results).
My advice would be to help the client focus on the best areas (based on traffic, demographics, distance, etc.) and the ones that are currently driving traffic, then create unique content for each of those pages. This could also bring down the too-many-links-per-page signal.
I did this with one of my clients, and their rank improved to #1 and #2 for the top 10 areas that were driving 90% of their traffic. If they want to continue targeting all 97 towns, each page should have unique content; their rankings will definitely improve if this is done right.
Anyway, I know it's a balancing act between the best strategy and what the client's budget will allow, so in the end you have to make the best decision you can.
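Kevin's triage (find the small set of pages driving roughly 90% of traffic and start there) can be sketched like this; the page names and analytics figures are hypothetical:

```python
# Sketch of the triage Kevin describes: find the smallest set of town
# pages that accounts for ~90% of traffic. Figures are hypothetical.
def core_pages(traffic: dict[str, int], share: float = 0.9) -> list[str]:
    """Return pages, busiest first, until `share` of total traffic is covered."""
    total = sum(traffic.values())
    ranked = sorted(traffic.items(), key=lambda kv: kv[1], reverse=True)
    picked, covered = [], 0
    for page, visits in ranked:
        picked.append(page)
        covered += visits
        if covered >= share * total:
            break
    return picked

pages = {"towns/springfield": 900, "towns/shelbyville": 60,
         "towns/ogdenville": 25, "towns/north-haverbrook": 15}
print(core_pages(pages))
```

Feed it an export from the client's analytics and the output is the unique-content priority list; everything outside it can wait for a later batch.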
Cheers,
Kevin
-
I've done this myself for many clients. I used a generic paragraph with near-duplicate content on over 3,000 pages for one client, and it has been going strong for many years. I have also tested websites with nearly 100% duplicate body text (everything except the title, description, h1, and image alts), and they rank well with no problems.
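The pattern Daniel describes (shared body copy, but a unique title, meta description, h1, and image alts per town) might be templated like this; the business name, service, and wording are purely illustrative:

```python
# Sketch of the pattern described above: keep a shared body template but
# make title, meta description, h1 and image alt unique per town.
# "Example Co." and the plumbing wording are illustrative only.
def page_head(town: str, service: str = "plumbing") -> dict[str, str]:
    return {
        "title": f"{service.title()} Services in {town} | Example Co.",
        "description": f"Local {service} services for {town} homeowners and businesses.",
        "h1": f"{service.title()} in {town}",
        "img_alt": f"Our {service} team on a job in {town}",
    }

head = page_head("Springfield")
print(head["title"])
```

Even a thin template like this guarantees the elements Daniel lists are distinct on every one of the 97 pages.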
I would still advise the client of the risk of having duplicate content. You could use Textbroker to write content for each page at around $5 apiece, just to be safe and to feel comfortable moving forward with the SEO.
Most of my clients have come to me from other SEOs, and I'm always wondering what will drop off when I optimize something, because the previous work was clearly black/grey hat. The good thing is that they already know the value of SEO and, most of the time, agree to pay to fix old issues before moving forward.