Duplicate content across multiple domains
-
I have come across a situation where we have discovered duplicate content across multiple domains. We have access to each domain and, within the past two weeks, have added 301 redirects that dynamically send each page to the proper page on the desired domain.
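For reference, here is a rough sketch of the kind of rule we have in place, assuming an Apache host (the domain names below are placeholders; our actual redirect is handled dynamically in application code):

    # Send every request on the duplicate domain to the same path on the main domain with a permanent (301) redirect
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?duplicate-domain\.com$ [NC]
    RewriteRule ^(.*)$ http://www.main-domain.com/$1 [R=301,L]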
My question relates to the removal of these pages. There are thousands of these duplicate pages.
I have gone back and looked at a number of these cached pages in Google and have found that the cached copies are roughly 30 days old or older. Will these pages ever get removed from Google's index? Will the 301 redirect even be read by Google so that the pages are redirected to the proper domain and page? If so, when will that happen?
Are we better off submitting a full site removal request for the sites that carry the duplicate content at this point? These smaller sites do bring traffic on their own, but I'd rather not wait three months for the content to be removed, since my assumption is that this content is competing with the main site.
I suppose another option would be to include a no-cache meta tag on these pages.
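Something along these lines is what I have in mind, though I'm not sure which directive fits best (noarchive only stops the cached copy, while noindex would drop the page from the index entirely):

    <!-- stop Google from keeping a cached copy of the page -->
    <meta name="robots" content="noarchive">
    <!-- or remove the page from the index altogether -->
    <meta name="robots" content="noindex">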
Any thoughts or comments would be appreciated.
-
I went ahead and added the links to the sitemap; however, when Google crawled the links I received this message:
When we tested a sample of URLs from your Sitemap, we found that some URLs redirect to other locations. We recommend that your Sitemap contain URLs that point to the final destination (the redirect target) instead of redirecting to another URL.
However, I do not understand how adding the redirected links to the sitemap will remove the old links.
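For what it's worth, the entries I added look roughly like this (the URL is a placeholder): the old address on the duplicate domain, which now answers with a 301 instead of a 200, hence Google's warning.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- old URL on the duplicate domain; Googlebot re-crawls it and follows the 301 to the main domain -->
      <url>
        <loc>http://duplicate-domain.com/some-product-page</loc>
      </url>
    </urlset>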
-
Worth a shot. Crawl bots usually work by following links from one page to the next. If links to those pages no longer exist, then Google will have a tough time finding those pages and de-indexing them in favor of the correct pages.
Good luck!
-
One of the previous developers left a hole that caused this issue. The system shares code between sites.
-
Andrew,
The links were removed from the offending sites, but if I understand the gist of your suggestion, Google won't remove them as quickly if they are no longer linked. And yes, I am using canonical tags. So I should create a sitemap with the previous links and, once Google follows those links to the main site, remove the sitemap. Is that your recommendation?
I suppose I can try this first before filing a request to remove the entire site.
-
Ah, I thought he was saying the dupe content does still exist but no more duplication is taking place after the fix. That's where I was going wrong then, lol.
-
As long as the duplicate content pages no longer exist and you've set up the 301 redirects properly, this shouldn't be a long-term problem. It can sometimes take Google a while to crawl through thousands of pages and index the correct ones. You might want to include these pages in a sitemap to speed up the process, particularly if there are no longer any links to these pages from anywhere else. Are you using canonical tags? They might also help point Google in the right direction.
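If you're not using them already, a canonical tag in the head of each duplicate page would look roughly like this, pointing at the preferred version on the main domain (the URL here is just a placeholder):

    <link rel="canonical" href="http://www.main-domain.com/some-product-page" />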
I don't think a no-cache meta tag would help; it assumes the page will be crawled again, and by that point Google should follow the 301 and cache the target page instead.
Hope this helps! Let me know how the situation progresses.
Andrew
-
Do you want the smaller sites to still exist? If they don't matter at all, then you could always take them offline, though that's not recommended for obvious reasons (but it would get them out of the index fairly quickly).
If they still need to exist, then we're just back to the same thing: changing the content on them. If the problem has been fixed so no further duplication occurs, then that's fine... you could limit the damage by having all of those smaller sites be dupes of each other but not of the main site, either by rewriting the smaller ones with one shared lot of content or by rewriting the main one. At least that way they will only be competing with each other and not with the main site any more.
Or have I still got the wrong end of the stick?
-
I am referring to an e-commerce site, so yes, it's dynamic. The hole has been plugged (so to speak), but the content still exists in the Google cache.
-
Ah I see, so it's a CMS which pumps out content then?
But it pumps it to other sites?
-
Steve, maybe I haven't explained the issue in enough detail. The duplicate content is the result of a technical issue with the site that caused the content to be duplicated when it should not have been. It's not a matter of rewriting content. My issue deals with purging this content from these other domains so that the main domain can be indexed with it.
-
You could always just rewrite the content so it's not duplicate; that way you get to keep those pages cached and maybe focus on some different but still targeted long-tail traffic... turn a negative into a positive. I accept that thousands of pages is a lot of work, but there are a million and one online copywriters who are pretty good (and cheap) that you could assign projects to. Google "copywriters for hire" or "freelance copywriters"... you could have it done in no time and not spend that much.