How to avoid duplicate content with e-commerce and multiple stores?
-
We are currently developing an e-commerce platform that will feed multiple stores. Each store will have its own domain and URL, but all stores will offer products that come from the same centralized database.
That means all products will have the same image, description and title across all stores. What would be the best practice to avoid getting stores penalized for duplicate content?
-
Hi there,
Here are some great resources worth your time to explore:
Thin & Duplicate Content: eCommerce SEO
7 Critical SEO Errors of E-commerce Websites
The Ultimate Guide to SEO for E-commerce Websites
I would also suggest looking into Google's duplicate content resources, as there are tons of great tips for you to look into, including pagination, URL parameters, and canonical tags. I would also make sure that your on-site SEO is unique for each product.
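On the canonical-tag point: for cross-domain duplicates, a canonical tag is the most direct signal, though note it asks Google to index only the preferred store's page. A minimal sketch, with hypothetical URLs:

```html
<!-- On each duplicate product page (e.g. store-b.example/widget), in the <head>: -->
<!-- point search engines at the one version you want indexed (hypothetical URLs) -->
<link rel="canonical" href="https://store-a.example/widget" />
```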
Hope this all helps - good luck!
-
That means all products will have the same image, description and title across all stores. What would be the best practice to avoid getting stores penalized for duplicate content?
You will be penalized for duplicate content if you do this; it's only a matter of time. Telling you right now, this plan is doomed in the long term.
So you can wait for it to happen, or you can do a proper job from the start and create unique, substantive content for each website.
-
The first question is: why are you going to have many stores doing the same thing? Is each one unique in some way?
Without more information on why you are doing it this way, it is hard to suggest the best way to handle it.
-
Hi there,
As you're aware it is best to have unique content across the different sites.
One approach would be to store information on the products that is only displayed on the separate websites, so your data could look like:
- Title
- Description
- Price
- Site 1 introduction
- Site 2 introduction
- Site 3 introduction
Then each site can have different content whilst maintaining one centralised dataset.
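As a rough sketch of that data model (the field values and site names here are made up for illustration), the shared record and the site-specific introduction can be combined at render time:

```python
# One centralised product record shared by all stores,
# plus a per-site introduction that makes each page unique.
product = {
    "title": "Blue Widget",
    "description": "A sturdy blue widget.",
    "price": "9.99",
    "intros": {
        "site1": "Welcome to Site 1, home of artisan widgets.",
        "site2": "Site 2's pick for budget-friendly widgets.",
    },
}

def render_product_page(product, site):
    """Combine the shared fields with the site-specific intro."""
    intro = product["intros"][site]
    return f"{intro}\n{product['title']} - {product['description']} ({product['price']})"

print(render_product_page(product, "site1"))
```

Each store renders the same centralised fields, but the page copy differs per site.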
Hope this inspires!
Kind Regards
Jimmy
Related Questions
-
Duplicate content on recruitment website
Hi everyone, It seems that Panda 4.2 has hit some industries more than others. I just started working on a website that has no manual action, but the organic traffic has dropped massively in the last few months. Their external linking profile seems to be fine, but I suspect usability issues, especially the duplication, may be the reason. The website is a recruitment website in a specific industry only. However, they post jobs for their clients that can be very similar, and at the same time they can have 20 jobs with the same title and very similar job descriptions. The website currently has over 200 pages with potential duplicate content. Additionally, these jobs get posted on job portals with the same content (this happens automatically through a feed). The questions here are: How bad would this be for the website's usability, and could it be the reason the traffic went down? Is this the effect of Panda 4.2 that is still rolling out? What can be done to resolve these issues? Thank you in advance.
Intermediate & Advanced SEO | | iQi0 -
E Commerce site - removing discontinued items
We have been hit with a Panda penalty and the site has slowly been losing rankings since January. I've now realised that we have 4,000+ pages indexed in Google, but only 2,000 live products. We have never deleted any of the pages with discontinued items, most of which were created when keyword stuffing and thin content reigned supreme, which explains the Panda penalty. But which is the best and quickest way to delete them from Google? We have already implemented a 'noindex' across all these pages, but as they are no longer in the 'crawlable' site, how will Google find them to know this? Would a 404 work any better? I'm not concerned about any link juice etc. to/from these pages, I just want rid. I'm not sure if we can move all these pages into a dedicated directory, which would allow us to use Google's Removal Tool; using it with the individual URLs would be a mammoth task. Any advice would be most greatly appreciated.
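For what it's worth, a 410 Gone response is often the quickest signal that a page is permanently removed. On Apache, mod_alias can return it directly; a sketch with hypothetical paths, to adapt to your own setup:

```apache
# .htaccess: tell crawlers these discontinued product pages are gone for good.
# mod_alias's "Redirect gone" returns a 410 status for the given path.
Redirect gone /products/discontinued-widget.html
Redirect gone /products/old-range/
```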
Intermediate & Advanced SEO | | ElaineAllkids0 -
Need help with duplicate content. Same content; different locations.
We have 2 sites that will have duplicate content (e.g., one company that sells the same products under two different brand names for legal reasons). The two companies are in different geographical areas, but the client will put the same content on each page because they're the same product. What is the best way to handle this? Thanks a lot.
Intermediate & Advanced SEO | | Rocket.Fuel0 -
Duplicate Content on Press Release?
Hi, We recently held a charity night in store. And had a few local celebs turn up etc... We created a press release to send out to various media outlets, within the press release were hyperlinks to our site and links on certain keywords to specific brands on our site. My question is, should we be sending a different press release to each outlet to stop the duplicate content thing, or is sending the same release out to everyone ok? We will be sending approx 20 of these out, some going online and some not. So far had one local paper website, a massive football website and a local magazine site. All pretty much same content and a few pics. Any help, hints or tips on how to go about this if I am going to be sending out to a load of other sites/blogs? Cheers
Intermediate & Advanced SEO | | YNWA0 -
Best Practices for Pagination on E-commerce Site
One of my e-commerce clients has a script enabled on their category pages that allows more products to automatically be displayed as you scroll down. They use this instead of page 1, 2, and a view all. I'm trying to decide if I want to insist that they change back to the traditional method of multiple pages with a view all button, and then implement rel="next", rel="prev", etc. I think the current auto method is disorienting for the user, but I can't figure out if it's the same for the spiders. Does anyone have any experience with this, or thoughts? Thanks!
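For reference, the traditional paginated markup mentioned above looks something like this, placed in the head of each component page (URLs are hypothetical):

```html
<!-- In the <head> of page 2 of a paginated category (hypothetical URLs): -->
<link rel="prev" href="https://example.com/category?page=1" />
<link rel="next" href="https://example.com/category?page=3" />
```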
Intermediate & Advanced SEO | | smallbox0 -
Guest blogging and duplicate content
I have a guest blog post prepared and several sites I can submit it to. Would it be considered duplicate content if I submitted one guest blog post to multiple blogs? And if so, given this content is not on my site but links to it, what will Google do? Let's say 5 blogs accept the same content and post it up. I understand that the first blog to have it up will not be punished, but what about the rest of the blogs? Can they get punished for this duplicate content? Can I get punished for having duplicate content linking to me?
Intermediate & Advanced SEO | | SEODinosaur0 -
Can PDF be seen as duplicate content? If so, how to prevent it?
I see no reason why PDF couldn't be considered duplicate content, but I haven't seen any threads about it. We publish loads of product documentation provided by manufacturers, as well as White Papers and Case Studies. These give our customers and prospects a better idea of our solutions and help them along their buying process. However, I'm not sure if it would be better to make them non-indexable to prevent duplicate content issues. Clearly we would prefer a solution where we benefit from the keywords in the documents. Does anyone have insight on how to deal with PDFs provided by third parties? Thanks in advance.
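Since a noindex meta tag can't be placed inside a PDF, the usual route is the X-Robots-Tag HTTP header. A sketch for Apache, assuming mod_headers is enabled (adapt to your server):

```apache
# Serve every PDF with a noindex header so the HTML page,
# not the manufacturer's PDF, is the version that gets indexed.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
```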
Intermediate & Advanced SEO | | Gestisoft-Qc1 -
Capitals in url creates duplicate content?
Hey Guys, I had a quick look around, but I couldn't find a specific answer to this. Currently, the SEOmoz tools come back and show a heap of duplicate content on my site, and there's a fair bit of it. However, a heap of those errors relate to random capitals in the URLs. For example, "www.website.com.au/Home/information/Stuff" is being treated as duplicate content of "www.website.com.au/home/information/stuff" (note the difference in capitals). Anyone have any recommendations as to how to fix this server side (keeping in mind it's not practical or possible to fix all of these links), or to tell Google to ignore the capitalisation? Any help is greatly appreciated. LM.
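One common server-side fix on Apache is a 301 redirect from any URL containing capitals to its lowercase form. This relies on a RewriteMap, which can only be declared in the server or virtual-host config (not in .htaccess), so treat this as a sketch to adapt:

```apache
# In the virtual-host config: define a lowercasing map using
# mod_rewrite's built-in tolower function ...
RewriteMap lc int:tolower

# ... then 301-redirect any request whose path contains a capital letter.
RewriteEngine On
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^(.*)$ ${lc:$1} [R=301,L]
```

Google also treats URLs as case-sensitive, so consolidating on one casing removes the duplicates at the source.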
Intermediate & Advanced SEO | | CarlS0