Handling duplicate content while making both versions rank well
-
Hey MOZperts,
I run a marketplace called Zibbet.com, and we have thousands of individual stores within our marketplace. We are about to launch a new initiative giving all sellers their own stand-alone websites.
URL structure:
- Marketplace URL: http://www.zibbet.com/pillowlink
- Stand-alone site URL: http://pillowlink.zibbet.com (doesn't work yet)
Essentially, their stand-alone website is a duplicate of their marketplace store: same items (item titles, descriptions), same seller bios, same shop introduction content, etc., but with a different layout. You can scroll down and see a preview of the different pages (if that helps you visualize what we're doing), here.
My Questions: My desire is for both the seller's marketplace store and their stand-alone website to rank well in the SERPs.
- Is this possible?
- Do we need to add any tags (e.g. rel="canonical") to one of these so that we're not penalized for duplicate content? If so, which one?
- Can we just change the metadata structure of the stand-alone websites to skirt around the duplicate content issue?
Keen to hear your thoughts and if you have any suggestions for how we can handle this best.
Thanks in advance!
-
No. We're actually not launching this initiative for SEO purposes. We just want to create value for our users and having their own stand-alone website is valuable to them.
I just want to make sure we're structured properly from an SEO point of view so that we don't compromise the SEO of our marketplace, or their stand-alone site.
Also, each seller's site has content that is unique relative to other sellers, but it is identical to the data in their own marketplace store. So, every seller has a marketplace store (with items, a profile, etc.) AND a stand-alone website (with the same items, same profile, etc., just designed differently and accessible via a subdomain).
Hope that makes sense.
-
Thanks so much for your input.
I must admit, I'm not too familiar with Panda, so I will need to do some digging there. We literally launched the new version of Zibbet two months ago, with different metadata etc., so I'm not sure how that affects it.
If we don't add the rel=canonical, do you think we'll get punished by Google?
-
If I understand correctly, you're asking how you can create a business model that fills up the search results with a bunch of sites that all have the same content. I think you're somewhat late to that party. The Google of today doesn't really let you do that and it's pretty good at preventing it. And if you were thinking of maybe linking back to your main site from all those dupes, I'd rethink that strategy, as well.
-
Hi,
First of all I would really take care of that Panda issue you have there: http://screencast.com/t/SzbL6hTFwWr
To answer your questions:
- Is this possible?
They can't both rank. You need to decide on a canonical version - the one to rule them all.
- Do we need to add any tags (e.g. rel="canonical") to one of these so that we're not penalized for duplicate content? If so, which one?
There are no duplicate content penalties as such, but yes, it's best that you choose the version yourself and don't let Google do that for you. Add rel="canonical".
- Can we just change the metadata structure of the stand-alone websites to skirt around the duplicate content issue?
That won't help.
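As a sketch of what the rel="canonical" advice looks like in practice: assuming the marketplace store is chosen as the canonical version, each page on the stand-alone subdomain would point back to its marketplace counterpart. The item path below is hypothetical; only the store URLs come from the question.

```html
<!-- In the <head> of http://pillowlink.zibbet.com/some-item
     (hypothetical item path; each page points to its own marketplace URL) -->
<link rel="canonical" href="http://www.zibbet.com/pillowlink/some-item" />
```

A cross-domain canonical like this tells Google to consolidate ranking signals onto the marketplace page; the reverse direction works the same way if the stand-alone site is preferred instead.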
Overall, you've been hit by Panda already, so you should take care about what content you push into the index; pushing more duplicates is not the way to recover. Before pushing more content into the index, you should clean the site of all Panda-related issues.
Cheers.
Related Questions
-
Could duplicate (copied) content actually hurt a domain?
Hi 🙂 I run a small WordPress multisite network where the main site is an informative portal about the Langhe region in Italy, and the subsites are websites of small local companies in the tourism and wine/food niche. As an additional service for those who build a website with us, I was thinking about giving them the possibility to use some of our portal's content (such as sights, events, etc.) on their website, in an automatic way. Not as an "SEO" plus, but more as a service for their current user/visitor base: so if you have a B&B, you can have on your site an "events" section with curated content, or a section about things to see (monuments, parks, museums, etc.) in that area, so that your visitors can enjoy reading some content about the territory. I was wondering if, apart from NOT being beneficial, it would be BAD from an SEO point of view... i.e., if they could actually be penalized by Google. Thanks 🙂 Best
Intermediate & Advanced SEO | Enrico_Cassinelli
-
Please help - Duplicate Content
Hi, I am really struggling to understand why my site has a lot of duplicate content issues. It's flagging up as ridiculously high and I have no idea how to fix this; can anyone help me, please? The website is www.firstcapitol.co.uk
Intermediate & Advanced SEO | Alix_SEO
-
301 redirect to avoid duplicate content penalty
I have two websites with identical content: Haya and ethniccode. Both websites have similar products. I would like to get rid of ethniccode and have already started to de-index it. My question is: will I get any SEO benefit, or will it be harmful, if I 301 redirect only the URLs below? https://www.ethniccode/salwar-kameez -> https://www.hayacreations/collections/salwar-kameez https://www.ethniccode/salwar-kameez/anarkali-suits -> https://www.hayacreations/collections/anarkali-suits
Intermediate & Advanced SEO | riyaaz
-
Http vs. https - duplicate content
Hi, I have recently come across a new issue on our site, where HTTPS and HTTP titles are showing as duplicates. I read https://moz.com/community/q/duplicate-content-and-http-and-https; however, since HTTPS is now a ranking factor, I'm wondering whether leaving this blocked can't be a good thing? We aren't in a position to roll out HTTPS everywhere, so what would be the best thing to do next? I thought about implementing canonicals? Thank you
Intermediate & Advanced SEO | BeckyKey
-
How should I manage duplicate content caused by a guided navigation for my e-commerce site?
I am working with a company which uses Endeca to power the guided navigation for our e-commerce site. I am concerned that the duplicate content generated by having the same products served under numerous refinement levels is damaging the site's ability to rank well, and I was hoping the Moz community could help me understand how much of an impact this type of duplicate content could be having. I would also love to know if there are any best practices for how to manage this type of navigation. Should I nofollow all of the URLs which have more than one refinement applied to a category, or should I allow the search engines to go deeper than that to preserve the long tail? Any help would be appreciated. Thank you.
Intermediate & Advanced SEO | FireMountainGems
-
Magento Duplicate Content Recovery
Hi, we switched platforms to Magento last year. Since then our SERP rankings have declined considerably (no sudden drop on any Panda/Penguin date lines). After investigating, it appeared we neglected to apply "noindex, follow" to all our filter pages, and our total indexed pages rose sevenfold in a matter of weeks. We have since fixed the noindex issue, and the pages indexed are now below what we had pre-switch to Magento. We've seen some positive results in the last week. Any ideas when/if our rankings will return? Thanks!
Intermediate & Advanced SEO | Jonnygeeuk
-
Handling Similar page content on directory site
Hi All, SEOmoz is telling me I have a lot of duplicate content on my site. The pages are not duplicates, but very similar, because the site is a directory website with a page for cities in multiple states in the US. I do not want these pages being indexed and was wanting to know the best way to go about this. I was thinking I could do a rel="nofollow" on all the links to those pages, but I'm not sure if that is the correct way to do this. Since the folders are deep within the site and not under one main folder, it would mean I would have to disallow many folders if I did this through robots.txt. The other thing I am thinking of is doing a meta noindex, follow, but I would have to get my programmer to add a meta tag just for this section of the site. Any thoughts on the best way to achieve this so I can eliminate these dup pages from my SEO report and from the search engine index? Thanks!
Intermediate & Advanced SEO | cchhita
-
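For reference, the meta robots option described in that question is a one-line tag in each page's head. This is only a sketch; the city-page path shown is hypothetical.

```html
<!-- In the <head> of each city directory page,
     e.g. /directory/ca/los-angeles (hypothetical path) -->
<meta name="robots" content="noindex, follow" />
```

"noindex, follow" keeps the page out of the index while still letting crawlers follow its links, which is usually preferable to rel="nofollow" or robots.txt disallows for this purpose, since a disallowed URL can still be indexed from external links.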
Duplicate Content Issue
Why are URLs ending in .html or index.php a problem for search engines? I heard it can create some duplicate content, but I have no idea why. Could someone explain why that is so? Thank you
Intermediate & Advanced SEO | Ideas-Money-Art
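To illustrate why an index.php suffix can create duplicates: the same page is often reachable at several URLs (e.g. example.com/ and example.com/index.php), and each variant can be crawled and indexed separately. One common fix is a 301 redirect that collapses the variants onto one URL. The rule below is a hypothetical sketch, assuming an Apache server with mod_rewrite enabled.

```apache
# Sketch: 301-redirect any /index.php or /index.html request
# to the bare directory URL (e.g. /foo/index.php -> /foo/)
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/+(.*/)?index\.(php|html)[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]
```

Alternatively, a rel="canonical" tag on each variant pointing at the preferred URL achieves a similar consolidation without redirecting visitors.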