Duplicate Content with ADN, DNS and F5 URLs
-
In my duplicate content report, there are URLs showing as duplicate content.
All of the pages work and do not redirect; they're used for IT debugging, QA, or as part of a legacy system that uses split DNS.
They aren't linked (or at least shouldn't be) from any pages, and I'm not seeing them in search results, but Moz is picking them up. Should I be worried about duplicate content here, and how should I handle them? They are replicas of the current live site, but on different subdomains.
We're doing cleanup before migrating to a new CMS, so I'm not sure it's worth fixing at this point, or whether it's even an issue at all. But should I make sure they're blocked in robots.txt, or take any other action to address them?
Thanks!
-
A couple more thoughts here, based on your revised question.
You'll want to figure out how those links to the rogue subdomains were generated, so you don't just carry them over to the new CMS (for example, if they sit in body text that gets copied wholesale without being examined).
If those old subdomains are not needed at all anymore, I'd get them removed entirely if you can, or at the very least blocked in robots.txt. You can verify each subdomain as its own site in Google Webmaster Tools, then request removal of those subdomains if the content is gone or if it's excluded in robots.txt.
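As a sketch of the "at the very least" option: a minimal robots.txt served at the root of each unwanted subdomain (e.g. at a hypothetical legacy.example.com/robots.txt) tells compliant crawlers to stay out of the whole host:

```
User-agent: *
Disallow: /
```

Note this blocks crawling, not indexing of already-known URLs, which is why pairing it with a removal request in Webmaster Tools is the safer combination.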
You might suggest to the dev team that they password-protect environments like this so they don't get accidentally crawled in the future, with robots.txt blocking as a backstop.
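For instance, on an Apache server, HTTP Basic authentication can be added with an .htaccess file along these lines (the AuthUserFile path is a placeholder, and nginx has an equivalent auth_basic setup):

```
AuthType Basic
AuthName "Staging - authorized users only"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

Unlike robots.txt, this stops crawlers from ever seeing the content, so nothing can leak into the index in the first place.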
If you have known dev subdomains that are needed, and as the SEO you know about them and have made sure they have robots.txt in place, you might want to use a code monitoring service like https://www.polepositionweb.com/roi/codemonitor/ to watch the contents of the robots.txt file. It will alert you if the file has been changed or removed (a good idea for the main site, too). I've seen a dev site copied over to a live site with its robots.txt copied along, so everything on the new live site was blocked. I've also seen a dev site refreshed with data from the live site, where the live site's robots.txt ended up on the dev site, and the dev site got indexed.
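If a third-party monitoring service isn't an option, the same check can be sketched in a few lines of Python: store a hash of the known-good robots.txt and alert when the live copy differs or disappears (the hostname below is a placeholder):

```python
import hashlib
import urllib.request


def fingerprint(body: bytes) -> str:
    """Stable hash of a robots.txt body, used to detect edits."""
    return hashlib.sha256(body).hexdigest()


def has_changed(known_good_hash: str, current_body: bytes) -> bool:
    """True when the live robots.txt no longer matches the stored hash."""
    return fingerprint(current_body) != known_good_hash


def fetch_robots(host: str) -> bytes:
    """Fetch the live file; raises (e.g. HTTP 404) if it has been removed."""
    with urllib.request.urlopen(f"https://{host}/robots.txt", timeout=10) as resp:
        return resp.read()


# Intended use from a daily cron job (host is hypothetical):
#   baseline = fingerprint(fetch_robots("dev.example.com"))
#   ...later...
#   if has_changed(baseline, fetch_robots("dev.example.com")):
#       send_alert()  # email/Slack hook of your choice
```

A removed file raises an HTTP error rather than returning empty content, which catches the "robots.txt accidentally deleted during a deploy" case as well.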
-
Thanks Keri, I received your note!
-
Hi! I have a couple of ideas, and sent you a quick email to the account on your Moz profile.
You may also find it helpful to do a Google search for:
site:ourdomain.com -inurl:www
This will show you all the non-www subdomains that Google has indexed, in case others have slipped in that you don't want indexed.