Duplicate content across multiple domains
-
I have come across a situation where we discovered duplicate content across multiple domains. We have access to each domain, and within the past two weeks we added a dynamic 301 redirect that sends each page to the proper page on the desired domain.
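For illustration only, a dynamic 301 mapping like the one described could be sketched as below. The hostnames and path are hypothetical placeholders, not the actual sites, and a real implementation would live in the web server or application framework rather than a standalone function:

```python
# Minimal sketch of a dynamic 301 mapping: any request on a duplicate
# domain is redirected to the same path on the canonical domain.
# All domain names here are hypothetical placeholders.

CANONICAL_HOST = "www.main-site.com"
DUPLICATE_HOSTS = {"www.smaller-site-a.com", "www.smaller-site-b.com"}

def redirect_target(host, path):
    """Return the 301 Location for a request, or None if no redirect is needed."""
    if host in DUPLICATE_HOSTS:
        # Same path on the canonical domain; a real implementation might
        # also remap paths if the URL structures differ between sites.
        return "https://" + CANONICAL_HOST + path
    return None

print(redirect_target("www.smaller-site-a.com", "/product/123"))
# -> https://www.main-site.com/product/123
```

Requests to the canonical domain itself fall through with no redirect, which avoids a redirect loop.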
My question relates to the removal of these pages. There are thousands of these duplicate pages.
I have gone back and looked at a number of these cached pages in Google and found that the cached copies are roughly 30 days old or older. Will these pages ever get removed from Google's index? Will the 301 redirect even be read by Google so that the pages are redirected to the proper domain and page? If so, when will that happen?
Are we better off submitting a full site removal request for the sites that carry the duplicate content at this point? These smaller sites do bring traffic on their own, but I'd rather not wait three months for the content to be removed, since my assumption is that this content is competing with the main site.
I suppose another option would be to include a no-cache meta tag on these pages.
Any thoughts or comments would be appreciated.
-
I went ahead and added the links to the sitemap; however, when Google crawled the links I received this message:
When we tested a sample of URLs from your Sitemap, we found that some URLs redirect to other locations. We recommend that your Sitemap contain URLs that point to the final destination (the redirect target) instead of redirecting to another URL.
However, I do not understand how adding the redirected links to the sitemap will remove the old links.
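For what it's worth, a temporary sitemap of the old (now-redirecting) URLs can be generated with a few lines of Python. The Search Console warning quoted above is expected in this case, because the sitemap deliberately lists URLs that 301 so Google recrawls them and discovers the redirects. The URL below is a hypothetical placeholder:

```python
# Sketch: build a sitemap XML listing the old URLs that now 301-redirect,
# so the crawler revisits them and sees the redirects.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    entries = "".join("<url><loc>%s</loc></url>" % escape(u) for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            + entries + "</urlset>")

old_urls = ["http://www.smaller-site-a.com/product/123"]
print(build_sitemap(old_urls))
```

Once the old URLs drop out of the index in favor of the canonical pages, this temporary sitemap would be removed again.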
-
Worth a shot. Crawl bots usually work by following links from one page to the next. If links to those pages no longer exist, then Google will have a tough time finding those pages and de-indexing them in favor of the correct pages.
Good luck!
-
One of the previous developers left a hole that caused this issue. The system shares code between sites.
-
Andrew,
The links were removed from the offending sites, but if I understand the gist of your suggestion, Google won't remove them as quickly if they are no longer linked. And yes, I am using canonical tags. So I should create a sitemap with the previous links and, once Google follows these links to the main site, remove the sitemap. Is that your recommendation?
I suppose I can try this first before filing a request to remove the entire site.
-
Ah, I thought he was saying the dupe content does still exist, but no more duplication is taking place after the fix. That's where I was going wrong then, lol.
-
As long as the duplicate content pages no longer exist and you've set up the 301 redirects properly, this shouldn't be a long-term problem. It can sometimes take Google a while to crawl through thousands of pages to index the correct ones. You might want to include these pages in a sitemap to speed up the process, particularly if there are no longer any links to these pages from anywhere else. Are you using canonical tags? They might also help point Google in the right direction.
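For reference, a canonical tag is just a single link element in the head of the duplicate page pointing at the preferred URL on the main domain. A tiny sketch that emits one (the URL is a hypothetical placeholder):

```python
# Sketch: emit the rel=canonical element for a duplicate page,
# pointing at the preferred URL on the main domain.
def canonical_tag(preferred_url):
    return '<link rel="canonical" href="%s" />' % preferred_url

print(canonical_tag("https://www.main-site.com/product/123"))
# -> <link rel="canonical" href="https://www.main-site.com/product/123" />
```

Note that once a page 301-redirects, its own markup is never served, so the canonical tag mainly matters on any duplicate pages that still resolve.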
I don't think a no-cache meta tag would help. It assumes the page will be crawled, and by that point Google should follow the 301 and cache the target page.
Hope this helps! Let me know how the situation progresses.
Andrew
-
Do you want the smaller sites to still exist? If they don't matter at all, then you could always take them offline, though that's not recommended for obvious reasons (but it would get them out of the index fairly quickly).
If they still need to exist, then we're back to the same thing: changing the content on them. If the problem has been fixed to stop further duplication, then that's fine. You could limit the damage by having all of those smaller sites be dupes of each other, but not of the main site, by rewriting either the smaller ones with one lot of content, or the main one. At least that way they will only be competing with each other and not the main site any more.
Or have I still got the wrong end of the stick?
-
I am referring to an e-commerce site, so yes, it's dynamic. The hole has been plugged (so to speak), but the content still exists in the Google cache.
-
Ah I see, so it's a CMS which pumps out content then?
But it pumps it to other sites?
-
Steve, maybe I haven't explained the issue in enough detail. The duplicate content is the result of a technical issue with the site that caused content to be duplicated when it should not have been. It's not a matter of rewriting content. My issue is purging this content from these other domains so that the main domain can be indexed with this content.
-
You could always just rewrite the content so it's not duplicate; that way you get to keep them cached and maybe focus on some different but still targeted long-tail traffic, turning a negative into a positive. I accept that thousands of pages is a lot of work, but there are a million and one online copywriters who are pretty good (and cheap) that you could assign projects to. Google "copywriters for hire" or "freelance copywriters"; you could have it done in no time and not spend that much.