Duplicate content issue
-
I recently built a site whose main page is intended to rank for national coverage. The site also has a number of pages targeted at local searches; these pages are slight variations of each other with town-specific keywords. Does anyone know if Google will see this as spam and quarantine my site from ranking? Thanks
-
You want to rank for local searches, right? The question is: do you have a physical presence in those places? If not, making city-specific pages just to rank for them will definitely invite a penalty sooner or later. Think about the customers first, not the search engines.
Now, if you do have branches in those cities, you can create a Google Local listing and separate landing pages for them, provided those pages say something unique about each location. Do not add rehashed content that no one is going to read. Focus on adding value to the user's experience.
-
Creating a site with multiple landing pages targeted to different regions is nothing new, so Google has made updates to try to stop low-quality sites from capitalizing on localized keywords (Miami x, Tucson x, San Diego x, etc., where x is your main keyword).
What this means is that you need to do more than simply duplicate your pages, swap in the local terms, and create new URLs, titles, and descriptions. Instead, create completely unique copy, dynamic content, and/or user engagement for each page; build local citations for each landing page; and make sure to get local backlinks to each landing page.
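If you want a rough, programmatic sanity check on how similar two city pages really are before publishing, Python's standard-library difflib can give a quick ratio. This is only a heuristic sketch (the sample copy is made up, and this is not how Google measures duplication), but it makes near-duplicates obvious:

```python
from difflib import SequenceMatcher

def similarity(page_a: str, page_b: str) -> float:
    """Rough 0-1 similarity score between two blocks of page copy."""
    return SequenceMatcher(None, page_a, page_b).ratio()

# Hypothetical city pages that differ only in the town name:
miami = "We offer fast roof repair in Miami with same-day service."
tucson = "We offer fast roof repair in Tucson with same-day service."

print(round(similarity(miami, tucson), 2))  # high ratio flags near-duplicate copy
```

If pages score very high against each other, that is a sign the copy needs to be genuinely rewritten per city, not just keyword-swapped.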
-
While I certainly don't want to pretend to be able to predict anything Google might do, to me, the fact that you are thinking about this as being a potential problem should be enough to make you consider some options. Depending on how many pages you have, it may not be that difficult to get really truly original content produced for those other pages.
Will Google choose not to index you? I have no idea.
My guess is that you get indexed but may not rank very high if the content is substantially similar across all of those pages. You might get stuck in the proverbial "sandbox" (ranked so low that no one can find you).
My gut says: if you have to ask "is this duplicate content?", it probably is, so make it unique.
Related Questions
-
Sitemap and Privacy Policy marked for duplicate content?
On a recent crawl, Moz flagged a page of our site for duplicate content. However, the pages listed are our sitemap and our privacy policy, which are very different: http://elearning.smp.org/sitemap/ http://elearning.smp.org/privacy-policy/ What is our best option to address this issue? I had considered a noindex tag on the privacy policy page, but since we have enabled user insights in Google Analytics, we need to keep the privacy policy displayed, and I worry that putting a noindex on the page would cause problems later.
Web Design | calliek -
Managing Removed Content
I am a real estate developer. Once a home goes off the market (is sold), I had been serving a 404 for that page. The problem: when the home goes up on the market again, Google will not re-index the new page (same URL). I have also tried to manage it a different way: instead of removing the page, I left it as-is. At some later point in time, the house goes back on the market and the page is refreshed with new content. However, Google decides to use the cached version. Please note that in either case, the property appears on the main page for a period of indexing. I have been doing this for 10 years, and the problem is increasing with time.
Web Design | Buckey -
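One pattern that sidesteps both the 404 problem and the stale-cache problem in the question above is to keep the property URL permanently live and vary only what it serves based on listing status. A minimal, framework-agnostic sketch of that routing logic (all field names and wording are hypothetical, not from the poster's actual site):

```python
def respond(listing):
    """Return (HTTP status, body) for a property URL that never disappears."""
    if listing is None:
        # Only truly unknown URLs get a 404.
        return 404, "No such property"
    if listing["status"] == "sold":
        # Serve a 200 with a "sold" notice instead of removing the page,
        # so the URL keeps its indexing history for when the home relists.
        return 200, f"{listing['address']} - SOLD"
    # Active listing: serve fresh details. Pairing this with sensible
    # Cache-Control headers encourages crawlers to re-fetch the new content.
    return 200, f"{listing['address']} - listed at {listing['price']}"
```

The idea is that a sold home is a page state, not a deleted page, so Google never has to re-discover the URL from scratch.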
Requirements for mobile menu design have created a duplicated menu in the text/cache view.
Hi, Upon checking the text cache view of our home page, I noticed the main menu has been duplicated. Please see: http://webcache.googleusercontent.com/search?q=cache:http://www.trinitypower.com&strip=1 Our coder tells me he created one version for the desktop and one for mobile. Duplicating the menu cannot be good for on-page SEO. That said, I have had no warnings reported back from Moz; maybe the Moz bots are not tuned to look for such a duplication error. The coder created a different menu for mobile in order to support the design requirements, since I did not like the look and feel of the responsive version based on the desktop design. His solution to this problem is to convert the mobile menu into Ajax. What do you guys think? Thanks, Jarrett
Web Design | TrinityPower -
Duplicate Product Descriptions for Each Variant
Hi, I am setting up a Shopify e-commerce store and I have a question about duplicate product descriptions. I have written unique product descriptions for all our products. Each product has at least 10 color options. I am thinking it would look better if I created each color variant as a unique product, e.g. store.com/nice-shirt-blue, store.com/nice-shirt-red, etc. Here is the kicker: would I be penalized for using the same product description for each variant?
Web Design | Jon_B -
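A common way to keep per-color URLs like the ones in the question above without duplicate-content risk is to point each variant page's rel="canonical" at the parent product. A tiny sketch of deriving that canonical URL (the URL scheme and color list are assumptions taken from the example URLs in the question, not Shopify's own API):

```python
# Hypothetical color suffixes used in the variant URLs.
COLORS = ("blue", "red", "green", "black")

def canonical_url(variant_url: str) -> str:
    """Map a color-variant URL like store.com/nice-shirt-blue
    back to its parent product URL, store.com/nice-shirt."""
    for color in COLORS:
        suffix = f"-{color}"
        if variant_url.endswith(suffix):
            return variant_url[: -len(suffix)]
    return variant_url  # already the parent product URL
```

With every color variant canonicalizing to one parent, the shared description lives on the variants without competing copies being indexed.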
Subdomains, duplicate content and microsites
I work for a website that generates a large amount of unique, quality content. This website has had development issues with our web builder, and they are going to separate the site into different subdomains upon launch. It's a scholarly site, so the subdomains will cover areas like history and science. Don't ask why we aren't using subdirectories, because trust me, I wish we could. Since we have to use subdomains, I have a couple of questions. Will the duplication of coding, since all subdomains will have the same design and look, heavily penalize us, and is there any way around that? Also, if we generate a good amount of high-quality content on each site, could we link all those sites to our other site as a possible benefit for link building? And finally, would footer links linking all the subdomains be a good thing to put in?
Web Design | mdorville -
How do I optimize a site designed to be one scrolling page of content?
Our website uses section IDs as its navigation, so all the content is on one page. When you click About Us, the page scrolls down to About Us; click Products, and the page scrolls to the Products section, and so on. I am getting crawl errors for meta descriptions, but will these go away once the main domain has this info? We just added the meta keywords and description to the header, and since the navigation sections use the same page, I assume it will correct the errors. Any other advice on optimizing for site designs like ours would be great. www.theicecubekit.com is the site. Thanks,
Chris
Web Design | bangbang -
Getting a lot more duplicate content warnings than I expected.
I run WordPress on many of my sites, and a site crawl has found MANY duplicate content pages on the latest domain I started a campaign for. I expected to see quite a lot on the tag pages that only had one post, but even tag pages with multiple posts, and author and category pages with many posts, are showing as duplicate content. Is it normal for a WordPress site to have so many duplicate content warnings from the taxonomy pages? I have the option to bulk noindex, follow the category and tag pages, but should I do it? I get some traffic directly to the tag pages, so removing them from search results would dent the site's traffic a little (generally high-bounce, low-engagement traffic anyway), but could removing the apparent duplicate content actually improve the article pages themselves? Or does anyone have any WordPress-specific advice for making the pages not duplicate content? I've toyed with the idea of just displaying excerpts, but that means creating manual excerpts for four years' worth of posts, some of which cover subject matter I have no personal knowledge of, so other suggestions are welcome.
Web Design | williampatton -
How much content is too much? Best Pages For Content?
To my understanding, content has a lot to do with organic rankings if written correctly. My question is: how much content is too much, and which pages are best to place content on? Our company sells very costly products. Our customers call to purchase; we do not have an e-commerce site. Right now we have on average 350 words per page, across about 200+ pages. Each page is written for its general category, and each product has its own unique content. It seems to me that the pages with less content tend to rank a bit better. As we are in the process of redoing our website, are there any recommendations on writing content or adjusting the amount of text? I am thinking a lot of our text is informative only to a certain extent. Would writing content just for the main category page be better, and then having only about 250 words as a description on the actual product page? Are there any other fairly new recommendations for SEO, besides the title, description, heading tags, image alts, URLs, etc.?
Web Design | hfranz