Duplicate content across multiple domains
-
I have come across a situation where we have discovered duplicate content across multiple domains. We have access to each domain and, within the past two weeks, added a 301 redirect that dynamically sends each page to the proper page on the desired domain.
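For context, a dynamic cross-domain 301 of this kind is often done at the server level. A minimal sketch, assuming an Apache server with mod_rewrite and using `example-main.com` as a placeholder for the desired domain:

```apache
# .htaccess on each duplicate domain: send every request,
# with its path intact, to the main domain via a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} !^(www\.)?example-main\.com$ [NC]
RewriteRule ^(.*)$ https://www.example-main.com/$1 [R=301,L]
```

The exact rule depends on the server and how the pages map between domains; the key point is that each old URL returns a 301 status pointing at its counterpart on the main domain.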
My question relates to the removal of these pages. There are thousands of these duplicate pages.
I have gone back and looked at a number of these cached pages in Google and found that they are roughly 30 days old or older. Will these pages ever get removed from Google's index? Will Google even follow the 301 redirect to the proper domain and page? If so, when will that happen?
Are we better off submitting a full site removal request for the sites that carry the duplicate content at this point? These smaller sites do bring traffic on their own, but I'd rather not wait three months for the content to be removed, since my assumption is that this content is competing with the main site.
I suppose another option would be to include a no-cache meta tag on these pages.
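For reference, the robots meta tags in question look like this (a sketch; note these only take effect if the page still returns a 200 response for the crawler to read, which a page that now 301s will not):

```html
<!-- Asks Google not to show a cached copy of this page -->
<meta name="robots" content="noarchive">

<!-- Stronger: asks Google to drop the page from the index entirely -->
<meta name="robots" content="noindex">
</body>
```

`noarchive` suppresses the cached snapshot, while `noindex` removes the page from results altogether.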
Any thoughts or comments would be appreciated.
-
I went ahead and added the links to the sitemap; however, when Google crawled them I received this message:
When we tested a sample of URLs from your Sitemap, we found that some URLs redirect to other locations. We recommend that your Sitemap contain URLs that point to the final destination (the redirect target) instead of redirecting to another URL.
However, I do not understand how adding the redirected links to the sitemap will remove the old links.
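For what it's worth, the warning refers to sitemap entries like the following. Google recommends listing the final destination, though submitting the old redirecting URLs is sometimes done deliberately to get them recrawled faster (a sketch with placeholder URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Old URL that now 301s; Google flags this but will still crawl it -->
  <url><loc>http://www.old-domain.com/some-page</loc></url>
  <!-- What Google recommends: the redirect target on the main domain -->
  <url><loc>http://www.main-domain.com/some-page</loc></url>
</urlset>
```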
-
Worth a shot. Crawl bots usually work by following links from one page to the next. If links to those pages no longer exist, Google will have a tough time finding those pages and de-indexing them in favor of the correct pages.
Good luck!
-
One of the previous developers left a hole that caused this issue. The system shares code between sites.
-
Andrew,
The links were removed from the offending sites, but if I understand the gist of your suggestion, Google won't remove them as quickly if they are no longer linked. And yes, I am using canonical tags. So I should create a sitemap with the previous links and, once Google follows those links to the main site, remove the sitemap. Is that your recommendation?
I suppose I can try this first before filing a request to remove the entire site.
-
Ah, I thought he was saying the dupe content does still exist but no more duplication is taking place after the fix. That's where I was going wrong then, lol.
-
As long as the duplicate content pages no longer exist and you've set up the 301 redirects properly, this shouldn't be a long-term problem. It can sometimes take Google a while to crawl through thousands of pages to index the correct ones. You might want to include these pages in a sitemap to speed up the process, particularly if there are no longer any links to them from anywhere else. Are you using canonical tags? They might also help point Google in the right direction.
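For anyone following along, a canonical tag is a one-line hint in the `<head>` of each page telling Google which URL is the preferred version of the content (sketch with a placeholder URL):

```html
<head>
  <!-- Points Google at the preferred version of this content -->
  <link rel="canonical" href="http://www.main-domain.com/the-proper-page">
</head>
```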
I don't think a no-cache meta tag would help, since the page has to be crawled for the tag to be seen anyway; by that point Google should follow the 301 and cache the target page instead.
Hope this helps! Let me know how the situation progresses.
Andrew
-
Do you want the smaller sites to still exist? If they don't matter at all, you could always take them offline, though that's not recommended for obvious reasons (but it would get them out of the index fairly quickly).
If they still need to exist, then we're just back to the same thing: changing the content on them. If the problem has been fixed to stop further duplication, that's fine... you could limit the damage by making all of the smaller sites duplicates of each other but not of the main site, rewriting either the smaller ones with one set of content or the main one. At least that way they will only be competing with each other and not with the main site any more.
Or have I still got the wrong end of the stick?
-
I am referring to an e-commerce site, so yes, it's dynamic. The hole has been plugged (so to speak), but the content still exists in the Google cache.
-
Ah, I see, so it's a CMS which pumps out content then?
But it pumps it to other sites?
-
Steve, maybe I haven't explained the issue in enough detail. The duplicate content issue stems from a technical problem with the site that caused content to be duplicated when it should not have been. It's not a matter of rewriting content. My issue is purging this content from these other domains so that the main domain can be indexed for this content.
-
You could always just rewrite the content so it's not duplicate; that way you get to keep the pages cached and maybe target some different but still relevant long-tail traffic... turn a negative into a positive. I accept that thousands of pages is a lot of work, but there are a million and one online copywriters who are pretty good (and cheap) that you could assign projects to. Google "copywriters for hire" or "freelance copywriters"... you could have it done in no time without spending much.