Rel canonical and duplicate subdomains
-
Hi,
I'm working with a site that has multiple subdomains of entirely duplicate content. The production-level site that visitors see is (a made-up illustrative example):
Then, there are subdomains that different developers use to work on their own changes to the production site before those changes are pushed to production:
Google ends up indexing these duplicate subdomains, which is of course not good.
If we add a canonical tag to the head section of the production pages (and therefore to all of the duplicate subdomains), will that cause some kind of problem, having a canonical tag on a page pointing to itself? Is it okay to have a canonical tag on a page pointing to that same page?
To complete the example...
In this example, where our production site is 123abc456.edu, our canonical tag on all pages (on this site and therefore on the duplicate subdomains) would be:
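To make the example concrete, the tag on the production homepage (and on the matching page of each duplicate subdomain) might look something like this sketch, with each page pointing to its own production URL:

```html
<!-- in the <head> of the production homepage; every page would
     point at its own production-domain URL, not the subdomain one -->
<link rel="canonical" href="http://123abc456.edu/" />
```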
Is that going to be okay and fix this, without causing some new problem from a canonical tag pointing to the page it's on?
Thanks!
-
Hi Bob,
That's an excellent question that I'll have to look into and confirm. More later. Thanks!
-
Is the subdomain data stored on the server as directories?
So, for example, is the Moe.123abc456.edu data stored in a folder like 123abc456.edu/Moe?
If so, you can simply have one robots.txt on your root domain blocking those directories:
Disallow: /Moe/
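Assuming each developer's copy lives in its own directory under the root domain (any directory name besides /Moe/ here is a placeholder), the root robots.txt might look like:

```text
# served at 123abc456.edu/robots.txt
User-agent: *
Disallow: /Moe/
Disallow: /OtherDevDirectory/
```

One caveat worth noting: crawlers fetch robots.txt per hostname, so this file governs the directory-style URLs under 123abc456.edu; a subdomain like Moe.123abc456.edu is checked against its own Moe.123abc456.edu/robots.txt.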
-
Well, Bob, it looks like you're right! I guess it will for sure see all the pages in
as the ones to remove and not
Also, how do we keep that robots.txt from being pushed to production when the developer working on that branch completes his work and pushes it live?
I must confess, it still feels a little like bomb disposal.
-
This should be exactly what you need: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663427
-
Hi Bob,
Thanks for the suggestion/question. I'm thinking about that, but wouldn't putting "do not crawl" directives on pages that are already indexed be a little like closing the barn door after the horses have left? Do you think it would un-index the already-crawled subdomains? Thanks!
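For what it's worth, the directive that actually drops a page from the index is noindex rather than a crawl block (Google has to be able to crawl a page to see its noindex, so a robots.txt Disallow can leave already-indexed URLs in place). A minimal sketch of a robots meta tag that could go on every page of a development subdomain:

```html
<!-- hypothetical: in the <head> of each page on a dev subdomain only -->
<meta name="robots" content="noindex, nofollow">
```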
-
Assuming that you do not need the development environments indexed in Google, why not simply block all crawlers on those subdomains?
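A common way to block all crawlers on a subdomain is a robots.txt served from that subdomain itself; a sketch, using the subdomain name from the example above:

```text
# served at Moe.123abc456.edu/robots.txt (one per dev subdomain)
User-agent: *
Disallow: /
```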
Related Questions
-
Duplicate Homepage - How to fix?
Hi Everyone, I've tried using the BeamUsUp SEO Crawler and have found one warning and two errors on our site. The warning is for a duplicate meta description, and the errors are a duplicate page and a duplicate title. For each problem it's showing the same two pages as the source of the error, but one has a slash at the end and one doesn't. They're both for the homepage: https://www.url.com/ and https://www.url.com. Has anyone seen this before? Does anyone know if this is anything we should worry about?
Intermediate & Advanced SEO | rswhtn
-
Duplicate Contact Information
My client has had a website for many years, and his business for decades. He has always had a second website domain which is basically a shopping module for obtaining information, comparisons, and quotes for tires. This tire module had no informational pages or contact info. Until recently, we pulled this information in through iframes. Now, however, the tire module is too complex and we do not bring in this info through iframes, and because of the way this module is configured (or its website framework), we are told we cannot place it as a subdirectory. So now this tire module resides on another domain name (although similar to the client's "main site" domain name) with some duplicate informational pages (I am working through this with the client), but mainly I am concerned about the duplicate contact info -- address and phone. Should I worry that this other tire website has duplicated the client's phone and address, same as their main website? And would having a subdomain (tires.example.com) work better for Google and SEO considering the duplicate contact info? Any help is much appreciated. ccee bar (And, too, the client is directing AdWords campaigns to this other website for tires, while under the same AdWords account directing other campaigns to their main site - I have advised an entirely separate AdWords account for links to the tire domain. BTW, the client does NOT have separate social media accounts for each site -- all social media efforts and links are for the main site.)
Intermediate & Advanced SEO | cceebar
-
Subdomain optimization - advice
Hi, I need some specific advices on which is the best way to optimize the subdomain of a main domain. Besides meta title, description, etc. Br.
Intermediate & Advanced SEO | Tormar
-
Google ignoring Canonical and choosing its own
Hey Mozzers, We have several products that all have up to 6 different versions; they are the same product but in a different specification. As users search via these specifications (within our website) it is beneficial to keep all 6 products as different listings on the website. In Google, however, it is not. So we kept all 6 listings but chose 1 to be the Google landing page; the only difference between them all is the technical specification + occasionally size. But 95% of the pages are the same. Let's call the products A, B, C, D, E, F. We made all the canonicals point to C because this is our best-selling version of the product. However, Google has chosen E to rank instead. What is my best move here? Should I accept the page Google has chosen and change the canonicals to point to that version, or should I be stubborn and try to get Google to change which version it ranks? As always, many thanks.
Intermediate & Advanced SEO | ATP
-
Duplicate Page Content
We have different plans that you can sign up for - how can we rectify the duplicate page content and title issue here? Thanks.
| http://signup.directiq.com/?plan=100 | 0 | 1 | 32 | 1 | 200 |
| http://signup.directiq.com/?plan=104 | 0 | 1 | 32 | 1 | 200 |
| http://signup.directiq.com/?plan=116 | 0 | 1 | 32 | 1 | 200 |
| http://signup.directiq.com/?plan=117 | 0 | 1 | 32 | 1 | 200 |
| http://signup.directiq.com/?plan=102 | 0 | 1 | 32 | 1 | 200 |
| http://signup.directiq.com/?plan=119 | 0 | 1 | 32 | 1 | 200 |
| http://signup.directiq.com/?plan=101 | 0 | 1 | 32 | 1 | 200 |
| http://signup.directiq.com/?plan=103 | 0 | 1 | 32 | 1 | 200 |
| http://signup.directiq.com/?plan=5 |
Intermediate & Advanced SEO | directiq
-
Duplicate Content and Titles
Hi Mozzers, I saw a considerable amount of duplicate content and duplicate page titles on our client's website. We are just implementing a fix in the CMS to make sure that these are all fixed. What changes do you think I could see in terms of rankings?
Intermediate & Advanced SEO | KarlBantleman
-
Duplicate content mess
One website I'm working with keeps an HTML archive of content from various magazines they publish. Some articles were repeated across different magazines, sometimes up to 5 times. These articles were also used as content elsewhere on the same website, resulting in up to 10 duplicates of the same article on one website. With regard to the 5 that are duplicates but not contained in the magazine, I can delete (resulting in 404) all but the highest value of each (most don't have any external links). There are hundreds of occurrences of this and it seems unfeasible to 301 or noindex them. After seeing how their system works I can canonical the remaining duplicate that isn't contained in the magazine to the corresponding original magazine version - but I can't canonical any of the other versions in the magazines to the original. I can't delete the other duplicates as they're part of the content of a particular issue of a magazine. The best thing I can think of doing is adding a link in the magazine duplicates to the original article, something along the lines of "This article originally appeared in...", though I get the impression the client wouldn't want to reveal that they used to share so much content across different magazines. The duplicate pages across the different magazines do differ slightly as a result of the different Contents menu for each magazine. Do you think it's a case of what I'm doing will be better than how it was, or is there something further I can do? Is adding the links enough? Thanks. 🙂
Intermediate & Advanced SEO | Alex-Harford
-
Should I do something about this duplicate content? If so, what?
On our real estate site we have our office listings displayed. The listings are generated from a scraping script that I wrote. As such, all of our listings have the exact same description snippet as every other agent in our office. The rest of the page consists of site-wide sidebars and a contact form. The title of the page is the address of the house and so is the H1 tag. Manually changing the descriptions is not an option. Do you think it would help to have some randomly generated stuff on the page such as "similar listings"? Any other ideas? Thanks!
Intermediate & Advanced SEO | MarieHaynes