Duplicate Content with ADN, DNS and F5 URLs
-
In my duplicate content report, there are URLs showing as duplicate content.
All of the pages work and do not redirect; they are used for IT debugging, QA, or as part of a legacy system using split DNS.
They aren't linked (or at least shouldn't be) from any pages, and I'm not seeing them in search results, but Moz is picking them up. Should I be worried about duplicate content here, and how should I handle them? They are replicas of the current live site, just on different subdomains.
We are doing cleanup before migrating to a new CMS, so I'm not sure it's worth fixing at this point, or whether it's even an issue at all. Should I block them in robots.txt or take any other action to address them?
Thanks!
-
A couple more thoughts here, based on your revised question.
You'll want to figure out how those links to the rogue subdomains were generated, so you don't simply carry them over to the new CMS (for example, if they're in body text that gets copied wholesale without being examined).
If those old subdomains aren't needed at all anymore, I'd remove them entirely if you can, or at the very least block them in robots.txt. You can verify each subdomain as its own site in Google Webmaster Tools, then request removal of a subdomain once its content is gone or excluded in robots.txt.
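For reference, a blanket-blocking robots.txt at the root of each subdomain you want kept out of the index looks like this (one file per subdomain, e.g. served at dev.example.com/robots.txt — the subdomain name here is just an example):

```
User-agent: *
Disallow: /
```

Keep in mind robots.txt blocks crawling, not indexing: URLs that are already indexed can linger until you request removal, which is why pairing this with a removal request in Webmaster Tools helps.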
You might suggest that the dev team password-protect environments like these so they don't get accidentally crawled in the future, block them in robots.txt, and so on.
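As one sketch of what that password protection could look like on an Apache server (file paths and realm name are assumptions), an .htaccess with HTTP Basic auth in the dev subdomain's document root:

```
# Require a login for the entire dev environment
AuthType Basic
AuthName "Restricted dev site"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

Crawlers can't get past the authentication prompt, so nothing behind it gets crawled or indexed.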
If you have known dev subdomains that are needed, and you as the SEO know about them and make sure they carry a blocking robots.txt, you might want to use a code monitoring service like https://www.polepositionweb.com/roi/codemonitor/ to watch the contents of each robots.txt file. It will alert you if the file has been changed or removed (a good idea for the main site too). I've seen dev sites copied over to live sites with the robots.txt copied along, so everything on the new live site was blocked. I've also seen dev sites refreshed with data from the live site, picking up the live site's robots.txt in the process, and the dev site then got indexed.
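If a third-party monitor isn't an option, the core of such a check is simple to sketch yourself. A minimal example (the URL and scheduling are assumptions; only the comparison logic is shown, working against stored file content):

```python
import hashlib

def robots_changed(stored: str, fetched: str) -> bool:
    """Compare a stored fingerprint of robots.txt against a freshly fetched copy."""
    old = hashlib.sha256(stored.encode("utf-8")).hexdigest()
    new = hashlib.sha256(fetched.encode("utf-8")).hexdigest()
    return old != new

# In a real monitor you would fetch https://dev.example.com/robots.txt
# (hypothetical URL) on a schedule, call robots_changed() against the last
# saved copy, and send an alert when it returns True or the fetch 404s.
baseline = "User-agent: *\nDisallow: /\n"
print(robots_changed(baseline, baseline))           # False (unchanged)
print(robots_changed(baseline, "User-agent: *\n"))  # True (changed)
```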
-
Thanks Keri, I received your note!
-
Hi! I have a couple of ideas, and sent you a quick email to the account on your Moz profile.
You may also find it helpful to do a Google search for:
site:ourdomain.com -inurl:www
This will show you all the non-www subdomains that Google has indexed, in case some others have slipped in that you don't want indexed.