Duplicate content across similar computer "models" and how to properly handle it.
-
I run a website serving a niche rugged computer market. Each computer has several "main" models, and each main model has several hundred (300-400) "sub" models that vary only by specification. My problem is that I can't realistically consolidate each model into one product page to avoid duplicate content: a drop-down list would be massive and confusing to the customer, when they could just search for the exact model they need.

I'd also estimate that 80-90% of the market searches for a specific model number, on our site or in Google, when they go to purchase. Many of our customers are city governments, fire departments, police departments, etc. They get a list of approved models and purchase off that list; they don't really search by specs or "configure" a model, so giving each model number a chance to rank is important.

Currently, all models in each sub-category rel=canonical back to the main category page for that model. Is there a better way to go about this? On the example page you can see there are several models whose product descriptions are all the same; they vary only by model number, and writing a unique description for each one is unrealistic for us. Any suggestions would be appreciated. I keep going back and forth on what the correct solution would be.
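For reference, each sub-model page currently carries a canonical tag along these lines (the URLs here are made-up placeholders for our real ones):

```html
<!-- On a sub-model page, e.g. /rugged/cf-19/cf-1956y6xlm (placeholder URL) -->
<link rel="canonical" href="https://www.example.com/rugged/cf-19/" />
```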
-
Do people tend to search for "CF-19" in the Panasonic example, or do they tend to search for the full "CF-1956Y6XLM"?
If it's "CF-19", then I would add more value to those main pages and not worry about the sub-pages as much. But I'm guessing it's the specific model numbers, in which case the ideal situation is to have an exact, indexable page for each model number. If you look at the "CF-1956Y6XLM" example, PC World is ranking #1 on pretty much pure spec content, meaning they're coasting on domain authority to rank those pages; meanwhile, I see you at #4. Typically I'd say that going with really thin content is a bad plan, but if everyone else is doing it, you may not need 200-300 words to move up in the rankings. Try producing 50-75 custom words on 100 of these pages where you're already ranking in the top 5. Do it for newer models so you can monitor ranking improvement over time. If the ranking and traffic improvements happen, and those visits convert, then figure out whether you can scale the process up for every new incoming product.
Other SERP features can beat rankings here, too. If you can get legitimate product ratings and generate rich snippets for the products, that will help maximize your CTR. Try to write better meta descriptions as well; right now they're all pretty drab on that SERP example.
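As a sketch of what that ratings markup could look like, here's a minimal schema.org Product block with an aggregate rating; the product name, price, and rating values below are hypothetical placeholders, and this is the kind of structured data that typically drives review stars in the SERP:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Toughbook CF-1956Y6XLM",
  "sku": "CF-1956Y6XLM",
  "brand": { "@type": "Brand", "name": "Panasonic" },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "12"
  },
  "offers": {
    "@type": "Offer",
    "price": "2499.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```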
Martijn's suggestion of reviews is a good start, but it will probably only help on the 10-20% of pages where you can actually collect reviews. Still, it's probably worth the effort.
Some e-commerce platforms will let you save a single product with variations, which helps with this problem. If 10 models can share one page and be selected through a product sub-menu (like the size or color selector on a fashion e-commerce site), that's a good way to cut the total URL count by 50-90%. But I'd try the unique-content route first and see if the numbers add up.
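If you do go the shared-page route, one common pattern (sketched here with hypothetical URLs and model numbers; your platform may handle this differently) is to serve each variant from a parameterized URL that canonicalizes back to the parent product page:

```html
<!-- Parent page: /products/cf-19 lists the whole CF-19 family -->
<!-- Variant URL: /products/cf-19?model=CF-1956Y6XLM (hypothetical) -->
<!-- Every variant URL canonicalizes to the parent, so only one page gets indexed: -->
<link rel="canonical" href="https://www.example.com/products/cf-19" />

<!-- Simple variant selector that swaps the page to the chosen model -->
<select onchange="window.location.href = this.value;">
  <option value="/products/cf-19?model=CF-1956Y6XLM">CF-1956Y6XLM</option>
  <option value="/products/cf-19?model=CF-195JCZXLM">CF-195JCZXLM</option>
</select>
```

The trade-off is that the individual model numbers no longer have their own indexable pages, which is exactly why I'd test the unique-content route first.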
-
I was afraid of this answer. If it were a static product I would be happy to do this, but since it's technology, in 6-8 months the next "generation" will be out with new model numbers, each needing its own description to be written, which is incredibly difficult to keep up with.
Is there a middle-of-the-road option? Is rel=canonical my best choice if I can't do unique content for every single model?
If so, is there a way to maximize the benefit of rel=canonical in this situation?
-
Reviews can work perfectly as user-generated content to make the pages a bit more unique. It's an easy one, and I'm probably pushing at an open door here, but depending on how many units you sell of a specific version, it might help you both extend the content and make it more unique.
-
It's a very tough question, and a common one across a lot of e-commerce.
The only really complete solution I have for you, one that addresses each of your needs, is to not base the page "content" on the specs.
Keep the specs as a table on the page, but add enough unique prose about each model and variation that every page has truly unique content; a rough sketch follows.
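In page-structure terms it might look something like this (the model details here are hypothetical, just to show the shape): the table carries the shared spec data, and a short hand-written paragraph carries the unique content:

```html
<h1>Toughbook CF-1956Y6XLM</h1>

<!-- Hand-written, model-specific prose: the unique content (details hypothetical) -->
<p>The CF-1956Y6XLM pairs the CF-19's fully rugged chassis with a
dual-touch screen and built-in GPS, a combination popular with
fire departments that mount the unit in apparatus cabs.</p>

<!-- Shared spec data lives in the table -->
<table>
  <tr><th>Display</th><td>10.1" XGA dual-touch</td></tr>
  <tr><th>Storage</th><td>500 GB HDD</td></tr>
  <tr><th>GPS</th><td>Integrated</td></tr>
</table>
```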
I know this solution means writing at least, say, 200-300 words of unique content for every model, but that's the whole issue solved for roughly 100k words (300-400 models at 250-300 words each works out to around 100,000 words). It just depends on whether it's worth having them all rank. What this solution gives you:
a) unique content
b) a chance for every page to rank, with no canonicals back to one page
c) much more long tail search volume
d) specific searches for every one of your potential customers.
That's really the best I can do... it takes the duplicate content issue away and solves every problem except having to create that much content in the first place.