Local Search | Website Issue with Duplicate Content (97 pages)
-
Hi SEOmoz community. I have an unusual situation: I'm evaluating a website for a client who is trying to optimize for local search by targeting the 97 towns surrounding his geographical location. What makes it unusual is that he is ranking on the 1st and 2nd pages of the SERPs for his targeted keywords despite having duplicate content on 97 pages of his site, and the search engines are still ranking the website. I ran the website's URL through SEOmoz's Crawl Test Tool, which verified that it has duplicate content on 97 pages and too many links (97) per page.
Summary: The website has 97 near-identical pages, one for each town; each individual page lists all 97 surrounding towns, and each town name links to another duplicate page.
Question: I expect the site will eventually stop being indexed by the search engines, and I'm not sure of the best way to resolve this problem – any advice?
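For anyone who wants to reproduce what the Crawl Test Tool reported, here is a minimal sketch of one way to group pages by duplicated body text. The URLs and sample markup are hypothetical, and the tag stripping is deliberately crude; a real audit would crawl the live pages and strip template boilerplate (nav, footer) before hashing.

```python
import hashlib
import re
from collections import defaultdict

def normalize_body(html: str) -> str:
    """Crudely strip tags and collapse whitespace so near-identical
    markup hashes to the same value. A real tool would use a proper
    HTML parser and remove template boilerplate first."""
    text = re.sub(r"<[^>]+>", " ", html)           # drop tags
    text = re.sub(r"\s+", " ", text).strip().lower()
    return text

def group_duplicates(pages: dict) -> dict:
    """Map a content hash to the list of URLs sharing that body text."""
    groups = defaultdict(list)
    for url, html in pages.items():
        digest = hashlib.sha256(normalize_body(html).encode()).hexdigest()
        groups[digest].append(url)
    # Keep only hashes shared by more than one URL: those are duplicates
    return {h: urls for h, urls in groups.items() if len(urls) > 1}

# Hypothetical town pages: identical body, only the URL differs
pages = {
    "/plumber-springfield": "<h1>Plumbing Services</h1><p>We serve all 97 surrounding towns.</p>",
    "/plumber-shelbyville": "<h1>Plumbing Services</h1><p>We serve all 97 surrounding towns.</p>",
    "/about":               "<h1>About</h1><p>Our company history.</p>",
}
dupes = group_duplicates(pages)
```

Any hash shared by more than one URL is a duplicate cluster; with 97 town pages built from the same template, you would expect one cluster containing all 97 URLs.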
-
Thank you Miriam.
-
Thanks Miriam!
-
Hi Todd, I'm endorsing Kevin's response as a best answer on this, but also want to add that it will be easier on the client if he makes a plan now to begin to improve the content of key pages rather than scramble to do so after rankings suddenly fall off. Local rankings are in a constant state of flux...drops can happen so swiftly. An ounce of prevention is worth a pound of cure. I would identify the 10 most important cities and write unique content for them, then move on to the 10 next-most important and so on. Do it in a way the client can afford, at a velocity you can manage.
-
Good morning Kevin - most of the individual pages receive little traffic. Thank you for your advice and feedback.
-
Hi Daniel - thank you for your response and advice!
-
Hi Todd,
How much traffic is each of those pages getting? Chances are, if you look at them, over 50% are getting little if any traffic. As you know, ranking on the first page in local search really doesn't mean much. You need to be in the top 3 (or 3-5 if maps is displaying results).
My advice would be to help the client focus on the best areas (based on traffic, demographics, distance, etc.) and the ones that are currently driving traffic, then create unique content for each of those pages. This could also bring down the too-many-links-per-page signal.
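That triage step - rank the town pages by whatever analytics export you have, then rewrite the winners first - can be sketched in a few lines. The CSV columns and the traffic numbers here are made up for illustration:

```python
import csv
import io

# Hypothetical analytics export: one row per town landing page
ANALYTICS_CSV = """page,visits
/towns/springfield,420
/towns/shelbyville,15
/towns/ogdenville,310
/towns/north-haverbrook,2
"""

def top_pages(csv_text: str, n: int) -> list:
    """Return the n pages with the most visits, highest first."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    rows.sort(key=lambda r: int(r["visits"]), reverse=True)
    return [r["page"] for r in rows[:n]]

# Rewrite these first; queue the rest in batches the client can afford
priority = top_pages(ANALYTICS_CSV, 2)
```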
I did this with one of my clients, and their rank improved to where they were #1 and #2 for the top 10 areas that were driving 90% of their traffic. If they want to continue targeting all 97, each page should have unique content. Their rankings will definitely improve if done right.
Anyway, I know it's a balancing act between the best strategy and what the client's budget will allow, so in the end you have to make the best decision.
Cheers,
Kevin
-
I myself have done this for many clients. I have used a generic paragraph of near-duplicate content on over 3,000 pages for one client, and it has been going strong for many years. I have also tested websites with nearly 100% duplicate body text, with the exception of the title, description, h1, and image alts, and they are ranking well with no problems.
I would still advise the client of the risk of having duplicate content. You could use Textbroker to write some content for each page at around $5 each, just to be safe and to feel comfortable moving forward with SEO.
Most of my clients have come to me from other SEOs, and I'm always wondering what will drop off when I optimize something, because the earlier work was clearly black/grey hat. The good thing is that they already know the value of SEO and, most of the time, agree to pay to fix old issues before moving forward.
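As a rough illustration of the "near 100% duplicate body text" point above, you can put a number on how similar two town pages really are with `difflib` from the Python standard library. The page text here is invented, and crawl tools use more robust shingling/fingerprinting, but this is fine for a quick spot check:

```python
from difflib import SequenceMatcher

def body_similarity(text_a: str, text_b: str) -> float:
    """Ratio in [0, 1]; 1.0 means the two bodies are identical."""
    return SequenceMatcher(None, text_a, text_b).ratio()

# Hypothetical pages: identical boilerplate, only the town name swapped
page_a = "Our plumbers proudly serve Springfield and the surrounding area."
page_b = "Our plumbers proudly serve Shelbyville and the surrounding area."

score = body_similarity(page_a, page_b)
```

A score close to 1.0 between two supposedly distinct town pages is exactly the kind of near-duplication the crawl report is flagging.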