Decline in traffic and duplicate content across different domains
-
Hi,
Six months ago my customer purchased their US supplier and moved the supplier's website onto their own e-commerce platform. During the migration they copied the product descriptions from their own site over to the supplier's site, so both sites now have identical content on their product pages. Since then they've experienced a decrease in traffic of about 80%.
They didn't implement canonical tags or hreflang.
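For context, a canonical + hreflang setup for two English-language sites targeting different regions might look something like this (the /product-x path is purely illustrative; the real URLs would differ):

```html
<!-- On the US-targeted product page at https://www.zzz.com/product-x -->
<link rel="canonical" href="https://www.zzz.com/product-x" />
<!-- Point US English searchers at this site... -->
<link rel="alternate" hreflang="en-us" href="https://www.zzz.com/product-x" />
<!-- ...and everyone else (including the default) at the .biz site -->
<link rel="alternate" hreflang="en" href="https://www.xxx.biz/product-x" />
<link rel="alternate" hreflang="x-default" href="https://www.xxx.biz/product-x" />
```

The same hreflang annotations would need to appear reciprocally on the corresponding www.xxx.biz page for the cluster to be valid.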
My customer's domain format is https://www.xxx.biz and the supplier's domain is https://www.zzz.com
The latter targets the US; when someone from outside the US tries to purchase a product, they see a message telling them to go to the first website, www.xxx.biz.
Both sites are in English.
The old version of www.zzz.com, before the shift to the new platform, contained different product descriptions. By the way, that old version is still live and indexed under a subdomain of www.zzz.com.
My question is: what's the best thing to do in this case to recover the rankings and get their traffic back?
Thanks!
-
Wow, there are a lot of things going on here so hopefully I've understood it all correctly.
By the sound of it, your client purchased the supplier and then populated that supplier's website with their own content?
If that's the case, I'd expect their biggest issue to be the duplication. Now that they've got two websites with the same content, it's very likely this alone will limit the second site's ability to rank well.
While having duplicate product descriptions isn't the end of the world, it certainly doesn't help. In this scenario they've taken unique content and replaced it with duplicate content, so the drop you're seeing is exactly what I would have expected.
My question is what's the best thing to do in this case so that the rankings will be back to higher positions and they'll get back their traffic.
If I were in your shoes, I'd be looking to get as much uniqueness happening in that content as I could. Since the supplier's site had original content for the same products, you could try reverting to that for the time being. You can test it with a single category as a proof of concept and if that works, move ahead from there.
If you're going to try that, make sure you block the subdomain via robots.txt so Google isn't crawling those old descriptions there, or you'll end up in the same position.
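A minimal robots.txt for that would be served from the root of the old subdomain itself (the subdomain name here is a placeholder):

```
# Served at https://old.zzz.com/robots.txt
User-agent: *
Disallow: /
```

One caveat: robots.txt only stops crawling, not indexing, so URLs that are already in the index may linger until they're removed or marked noindex.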
This is all general info since I can't look closer at the site. It could also be a site structure/speed/navigation/page title/meta description problem etc. If you're comfortable sending me the links I'd be happy to take a closer look. Feel free to drop it here or PM.