Duplicate content issue
-
I have recently built a site with a main page intended to rank for national coverage. The site also has a number of pages targeted at local searches; these pages are slight variations of each other with town-specific keywords. Does anyone know if Google will see this as spam and quarantine my site from ranking? Thanks
-
You want to rank for local searches, right? Now the question is whether you have a physical presence in those places. If not, making city-specific pages just to rank for them will definitely invite a penalty sooner or later. Think about the customers first, not the search engines.
Now, if you do have branches in those cities, you can create Google Local listings and separate landing pages for them, provided those pages say something unique about each location's business. Do not add rehashed content that offers nothing new. Focus on adding value to the user experience.
-
Creating a site with multiple landing pages targeted to different regions is not new, so Google has made updates that attempt to stop low-quality sites from capitalizing on localized keywords ("miami x," "tucson x," "san diego x," etc., where x is your main keyword).
What this means is that you need to do more than simply duplicate your pages, mix up the keywords, swap the local terms, and create new URLs, titles, and descriptions. What you should do instead: create completely unique copy, add dynamic content and/or user-engagement features, build local citations for each landing page, and make sure to get local backlinks to each landing page.
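To make the point concrete, a rough way to check whether two city pages are just token swaps is to measure their word overlap. This is a minimal sketch with invented page text, not a tool Google uses:

```python
def jaccard_similarity(a, b):
    """Share of unique words two pages have in common (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

# Two hypothetical landing pages that differ only in the city name.
miami = "Acme plumbing offers fast emergency repairs in Miami call today"
tucson = "Acme plumbing offers fast emergency repairs in Tucson call today"
assert jaccard_similarity(miami, tucson) > 0.8  # near-duplicates
```

Pages that score this high against each other are exactly the "duplicated template with swapped keywords" pattern those updates target.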
-
While I certainly don't want to pretend to be able to predict anything Google might do, the fact that you are thinking about this as a potential problem should be enough to make you consider some options. Depending on how many pages you have, it may not be that difficult to get truly original content produced for those other pages.
Will Google choose not to index you? I have no idea.
My guess is that you get indexed, but may not rank very high if the content is substantially similar across all of those pages. You might get stuck in the proverbial "sandbox" (ranked so low that no one can find you).
My gut says: if you have to ask "is this duplicate content?", it probably is, so make it unique.
Related Questions
-
Sitemap and Privacy Policy marked for duplicate content?
On a recent crawl, Moz flagged a page of our site for duplicate content. However, the pages listed are our sitemap and our privacy policy, which are both very different:
http://elearning.smp.org/sitemap/
http://elearning.smp.org/privacy-policy/
What is our best option to address this issue? I had considered a noindex tag on the privacy policy page, but since we have enabled user insights in Google Analytics we need to have the privacy policy displayed, and I worry that putting a noindex on the page would cause problems later.
Web Design | calliek
Best Practices for Leveraging Long Tail Content & Gated Content
Our B2B site has a lot of long-form content (e.g., transcriptions from presentations and webinars). We'd like to leverage the long-tail SEO traffic driven to these pages and convert those visitors to leads. Essentially, we'd like Google to index all this lengthy, keyword-rich content AND we'd like to put up a read gate that requires users to register before viewing the full article. This is a B2B site, and the goal is to generate leads. Some considerations and questions:
1. How much of the content should we share before requiring registration? Ask too soon and it's a terrible user experience; give too much away and our business objectives are not met.
2. Design-wise, what are good ways to do this? I notice Moz uses a "teaser" to block Mozinar content, and I've seen modals and blur bars on other sites.
3. Any gotchas that Google doesn't like that we should be aware of? We're trying to avoid anything that might seem like cloaking.
4. Is it better to split the content across several pages (split a 10K-word doc across 10 URLs and include a read gate on each) or keep it to one page?
Thank you!
Web Design | Allie_Williams
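One common implementation of the teaser approach mentioned above is to serve the opening of the article ungated and hold back the rest until registration. A minimal sketch of the split (the 200-word teaser length and the function name are illustrative assumptions, not Moz's actual setup):

```python
def split_with_teaser(text, teaser_words=200):
    """Split an article into an ungated teaser (served to everyone,
    including crawlers) and a gated remainder shown after sign-up."""
    words = text.split()
    return " ".join(words[:teaser_words]), " ".join(words[teaser_words:])

# Stand-in for a 10,000-word webinar transcript.
article = " ".join(f"word{n}" for n in range(10000))
teaser, gated = split_with_teaser(article)
assert len(teaser.split()) == 200
assert len(gated.split()) == 9800
```

To keep clear of cloaking concerns, whatever portion the crawler sees should match what a first-time human visitor sees on the same URL.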
Joomla! Site Returning 12,000+ Duplicate Content Errors
(I do award "Good Answer" and "thumbs up" to responses as earned.) I have tried to ask this question previously (maybe not correctly). I have a client for whom I am doing the on-site and off-site optimization, and the Moz report is kicking back major errors. I have examples below. They all seem to relate directly to rokecwid and Ecwid. Is there ANY solution to fix this? Is this hurting the rankings? Since I didn't build the site, I have to tell the website company what to do when I need changes made to code, etc. I am also not very proficient with Joomla!, and my web engineer is one of those closet coders (the best kind to have) who doesn't communicate in a way that a layman could understand. He pointed out several issues with the HTML, but I don't think that is related to this. Can anyone tell me what to tell the web company that built this site to get rid of these errors? A very small sample of the URLs with errors:
Web Design | Atlanta-SMO
http://www.metroboltmi.com/shop-spareparts?Itemid=218&option=com_rokecwid&view=ecwid&ecwid_category_id=3560097
http://www.metroboltmi.com/shop-spareparts?Itemid=218&option=com_rokecwid&view=ecwid&ecwid_category_id=3560098
http://www.metroboltmi.com/shop-spareparts?Itemid=218&option=com_rokecwid&view=ecwid&ecwid_category_id=3560099
http://www.metroboltmi.com/shop-spareparts?Itemid=218&option=com_rokecwid&view=ecwid&ecwid_category_id=3560100
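For what it's worth, URLs like these differ only in their query strings, which is why a crawler counts them all as duplicates of one page. A minimal sketch of the idea behind canonicalizing them (an illustration of the concept only, not a Joomla or Ecwid fix):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    """Strip the query string so every parameterized variant of a
    category page maps to the same canonical address."""
    scheme, netloc, path, _query, _fragment = urlsplit(url)
    return urlunsplit((scheme, netloc, path, "", ""))

variants = [
    "http://www.metroboltmi.com/shop-spareparts?Itemid=218&option=com_rokecwid&view=ecwid&ecwid_category_id=3560097",
    "http://www.metroboltmi.com/shop-spareparts?Itemid=218&option=com_rokecwid&view=ecwid&ecwid_category_id=3560098",
]
# Every parameterized variant collapses to the same page.
assert {canonical_url(u) for u in variants} == {"http://www.metroboltmi.com/shop-spareparts"}
```

In practice this usually means asking the web company to emit a rel="canonical" tag on the category page, but that is something to confirm with whoever maintains the Joomla template.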
Pagination - Crawl Issue
Hi,
Web Design | semvibe
We have a site with a large number of products (6,000+) under each category, so we have made a page under each category that lists all products (a "view all" page), with the products listed in a pagination setup built on Ajax. The problem is that only our first page is crawlable; all pages beyond the first remain hidden. We need to make all our pagination URLs crawlable, but our requirement is that the URL never changes as the user goes to the next page: we want to show the user the same URL for all the pagination numbers. Is there a perfect solution?
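For context, a crawler can generally only reach content that has its own fetchable URL, so results loaded via Ajax behind one unchanging address stay invisible to it; there is no way to keep literally one URL and have every page crawlable. A hypothetical sketch of giving each page a distinct address (the base URL and `?page=` parameter are placeholders, not this site's scheme):

```python
def paginated_urls(base, total_items, per_page):
    """One crawlable URL per page of results; page 1 keeps the clean base URL."""
    pages = -(-total_items // per_page)  # ceiling division
    return [base if n == 1 else f"{base}?page={n}" for n in range(1, pages + 1)]

urls = paginated_urls("https://example.com/category/view-all", total_items=6000, per_page=200)
assert len(urls) == 30
assert urls[0] == "https://example.com/category/view-all"
assert urls[1] == "https://example.com/category/view-all?page=2"
```

The browser History API (pushState) can update such URLs as the user pages through the Ajax results without a full reload, which may get close to the "same-feeling URL" requirement while still giving the crawler real addresses.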
Redirecting duplicate pages
For whatever reason, X-Cart creates duplicates of our categories and articles, so that we have URLs like this:
Web Design | k9byron
www.k9electronics.com/dog-training-collars
www.k9electronics.com/dog-training-collars/
http://www.k9electronics.com/articles/anti-bark-collar
http://www.k9electronics.com/articles/anti-bark-collar/
Now our SEO guy says that we don't have to redirect these because Google is "smart enough" to know they are the same, and that we should "leave it as-is". However, everything I have read online says that Google sees this as duplicate content and that we should redirect to one form or the other (with or without the slash), depending on which most of our internal links already point to, which is with a slash. What should we do? Redirect or leave it as-is? Thanks!
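If you do redirect, the usual approach is to pick one trailing-slash convention (here, with the slash, since the internal links already use it) and 301-redirect the other form to it. A minimal sketch of the normalization rule itself (Python purely for illustration; the actual redirect would live in the server or cart configuration):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_trailing_slash(url, use_slash=True):
    """Map any URL to the chosen trailing-slash convention so each
    page has exactly one canonical address."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    if path not in ("", "/"):  # leave the site root alone
        path = path.rstrip("/") + ("/" if use_slash else "")
    return urlunsplit((scheme, netloc, path, query, fragment))

assert normalize_trailing_slash("http://www.k9electronics.com/dog-training-collars") == \
    "http://www.k9electronics.com/dog-training-collars/"
assert normalize_trailing_slash("http://www.k9electronics.com/articles/anti-bark-collar/") == \
    "http://www.k9electronics.com/articles/anti-bark-collar/"
```

Whichever form you choose, the redirect should be applied consistently so that slashed and slashless variants never both return 200.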
How can I write content-rich descriptions?
We have recently started using SEOmoz. How can I make descriptions more content-rich?
Web Design | WCGAdmin
Duplicate content
Please help me to solve a problem with duplicate content errors.
Web Design | NadiaFL
I received a report today that 53 pages (all my pages) are marked as duplicate content. Yes, they are all about Oasis of the Seas, but about different aspects of it. My question is: how can I handle this error in the report? Thank you!
Has Anyone Had Issues With ASP.NET 4.0 URL Routing?
I'm seeing some odd results in my SEOmoz report with a new site I just released that is using ASP.NET 4.0 URL routing. I am seeing thousands(!) of duplicate results, for instance, because the crawl has uncovered something like this:
Web Design | TroyCarlson
http://www.mysite.com/
http://www.mysite.com/default.aspx (so far, so good, though I wish it wouldn't show both)
http://www.mysite.com/default.aspx/about/ (what the heck?)
http://www.mysite.com/default.aspx/about/about/ (WTF!?)
http://www.mysite.com/default.aspx/about/about/products/ (and on and on ad infinitum)
I'm also seeing problems pop up in my sitemap because extensionless URLs have an odd "eurl.axd/abunchofnumbersgohere" appended to the end of every address, which is breaking links. Sigh. Buyer beware. I've found articles that discuss the "eurl.axd" issue here and there (this one seems very good), but nothing about the weird crawl issue I outlined above. Any advice?