Duplicate Content for e-commerce help
-
Hi.
I know I have duplicate content issues; Moz has flagged them on my e-commerce website.
However, a large number of these issues are for variations of the same product. For example, a blue Armani t-shirt can be found on the Armani page, the t-shirt page, and the Armani t-shirt page, and duplicate links also show up for the sizing variations.
Is it possible, or even worthwhile, to work on these issues?
Thanks
-
Thanks.
We have issues with our filter at present (it uses Ajax, I think): it doesn't recognise when a variation is out of stock, which pretty much defeats the point of having a filter.
It needs tweaking, which our web team are working on.
Thanks
-
Setting canonical URLs will help with this.
If your cart system allows it, you could also handle sizing through a JavaScript variable, so that selecting a different size doesn't change the URL. That way you can still pass the values through to your store, but not have five different pages for every product.
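A minimal sketch of that idea, assuming a cart that exposes product data client-side (the URL, SKUs, and stock data here are all hypothetical):

```javascript
// One indexable URL for the product; size is selected client-side
// rather than via per-size URLs like /armani-t-shirt?size=m.
const product = {
  url: "/armani-t-shirt", // the single canonical product page
  variations: [
    { size: "S", sku: "ARM-TS-S", inStock: true },
    { size: "M", sku: "ARM-TS-M", inStock: false },
    { size: "L", sku: "ARM-TS-L", inStock: true },
  ],
};

// Return the matching in-stock variation, or null if it is
// unavailable (this is also where an out-of-stock filter would hook in).
function selectVariation(product, size) {
  const v = product.variations.find((x) => x.size === size);
  return v && v.inStock ? v : null;
}
```

The selected variation's SKU can then be passed to the cart without the browser ever leaving the one product URL.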
-
They should, but it's always worth double-checking.
-
I can set a canonical URL on the product using Yoast. So I presume that all other variants of that product will point to that URL?
-
It's a tag that sits on your web page. You can get a WordPress plugin to do it; Yoast might. Worst case, your web dev can add it pretty easily.
I also think this one would do the trick, but I recommend doing a little research (I don't endorse this particular one): http://wordpress.org/plugins/all-in-one-seo-pack/
-
Yeah, that was the line I was going down.
I just wasn't sure if it would be a big issue. The sizes/colours are all under a master product anyway, so they all point to the same URL.
Will my web guy need to do this, or can I do it within WordPress/Webmaster Tools?
-
Hello,
What you want is rel=canonical tag:
https://support.google.com/webmasters/answer/139066?hl=en
http://moz.com/learn/seo/canonicalization
What this does is tell Google (and other search engines) which page is the original. So you would mark the Armani t-shirt page as the original and put the tag on the colour variants, pointing back to the Armani t-shirt page. Then you won't have duplicates, because any link value will be passed to the original. The only minor drawback is that the variant pages won't be indexed, but if it's just a colour difference, that's not normally a problem.
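For reference, the tag itself is a single line in the `<head>` of each variant page; the domain and path here are just illustrations:

```html
<!-- Placed on /armani-t-shirt-blue, /armani-t-shirt-red, etc. -->
<link rel="canonical" href="https://www.example.com/armani-t-shirt" />
```

Plugins like Yoast simply generate this tag for you, so you rarely need to add it by hand.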
I hope that helps and good luck!
(just edited the links to be links!)