Any harm and why the differences - multiple versions of same site in WMT
-
In Google Webmaster Tools we have set up:
ourdomain.co.nz
ourdomain.co.uk
ourdomain.com
ourdomain.com.au
www.ourdomain.co.nz
www.ourdomain.co.uk
www.ourdomain.com
www.ourdomain.com.au
https://www.ourdomain.co.nz
https://www.ourdomain.co.uk
https://www.ourdomain.com
https://www.ourdomain.com.au

As you can imagine, this gets confusing and hard to manage. We are wondering whether having all these domains set up in WMT could be doing any damage. Here http://support.google.com/webmasters/bin/answer.py?hl=en&answer=44231 it says:
"If you see a message that your site is not indexed, it may be because it is indexed under a different domain. For example, if you receive a message that http://example.com is not indexed, make sure that you've also added http://www.example.com to your account (or vice versa), and check the data for that site."
The above quote suggests that there is no harm in having several versions of a site set up in WMT; however, the article then goes on to say:
"Once you tell us your preferred domain name, we use that information for all future crawls of your site and indexing refreshes. For instance, if you specify your preferred domain as http://www.example.com and we find a link to your site that is formatted as http://example.com, we follow that link as http://www.example.com instead."
This suggests that having multiple versions of the site loaded in WMT may cause Google to continue crawling multiple versions instead of only the desired ones (https://www.ourdomain.com, plus the .co.nz, .co.uk, and .com.au equivalents).
However, even if Google does crawl any URLs on the non-HTTPS versions of the site (i.e. ourdomain.com or www.ourdomain.com), these 301 to https://www.ourdomain.com anyway. Shouldn't that mean Google effectively cannot crawl any non-https://www versions, since any attempt just redirects? If that were the case, you'd expect the ourdomain.com and www.ourdomain.com versions to show no pages indexed in WMT, but the opposite is true: those versions have plenty of pages indexed, while the HTTPS versions show no data under the Index Status section of WMT and display this message instead:
Data for https://www.ourdomain.com/ is not available. Please try a site with http:// protocol: http://www.ourdomain.com/.
This is a problem as it means that we can't delete these profiles from our WMT account.
Any thoughts on the above would be welcome.
As an aside, WMT does seem to be picking up on the 301 redirects from the ourdomain.com and www.ourdomain.com domains, at least for links: no ourdomain.com or www.ourdomain.com URLs register any links in WMT, suggesting that Google sees all links pointing to URLs on those domains as 301ing to https://www.ourdomain.com. That's good, but it also means we now can't delete the https://www.ourdomain.com profile either, so we are stuck with 12 profiles in WMT. What a pain.
Thanks for taking the time to read all of the above; it's quite complicated, sorry! Would love any thoughts.
-
I agree with Federico that you probably don't need to have every page be secure. Perhaps you should consider making the http://www. version your canonical default instead?
-
It is fine to have multiple versions of a site in different countries. Some of the biggest brands in the world do this. There are "right" and "wrong" ways to go about it, but if I had a ccTLD for the UK and lots of UK customers I wouldn't send them to my US site, regardless of whether I had a /uk/ folder or not.
-
Chris,
Is the content exactly the same on all domains? Does anything change between .com, .co.uk, etc.?
If the content differs, you MUST set the canonical to only ONE version (.com would be my guess) and use rel="alternate" for the other domains. However, that doesn't make much sense if the content is the same. Why not just redirect all domains to .com (or whatever definitive version you choose)?
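For illustration, here's a minimal sketch of the canonical + rel="alternate" hreflang markup described above, where every regional page points rel="canonical" at one definitive version and lists the other regional URLs as alternates. The domain names and locale codes here are assumptions for this hypothetical setup, not the poster's actual configuration:

```python
# Regional roots for the hypothetical setup; hreflang codes are assumed.
LOCALES = {
    "en-us": "https://www.ourdomain.com",
    "en-gb": "https://www.ourdomain.co.uk",
    "en-au": "https://www.ourdomain.com.au",
    "en-nz": "https://www.ourdomain.co.nz",
}

def head_links(path: str, canonical_locale: str = "en-us") -> str:
    """Build the <link> tags for one page: a single canonical plus
    one rel="alternate" hreflang entry per regional domain."""
    lines = [f'<link rel="canonical" href="{LOCALES[canonical_locale]}{path}" />']
    for code, root in LOCALES.items():
        lines.append(f'<link rel="alternate" hreflang="{code}" href="{root}{path}" />')
    return "\n".join(lines)
```

For example, `head_links("/pricing")` emits one canonical tag pointing at the .com page and four alternate tags, one per regional domain.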
-
Hi Federico,
Thanks very much for your response. And yes, sorry, my initial question wasn't written very well!
ourdomain.com and www.ourdomain.com both 301 to https://www.ourdomain.com (which is also the canonical definitive version for the .com)
ourdomain.co.uk and www.ourdomain.co.uk both 301 to https://www.ourdomain.co.uk (which is also the canonical definitive version for the .co.uk)
and the same as above for .com.au domains, and .co.nz domains.
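The redirect scheme above (every variant 301s to its own https://www. version, preserving the ccTLD) can be sketched as a small function. This is just an illustration of the mapping rule, assuming the four domains listed in the question:

```python
from urllib.parse import urlparse, urlunparse

# Bare (non-www) hostnames for the four regional sites in the question.
CC_DOMAINS = {
    "ourdomain.com",
    "ourdomain.co.uk",
    "ourdomain.co.nz",
    "ourdomain.com.au",
}

def canonical_url(url: str) -> str:
    """Return the 301 target for a URL on any variant of the site:
    force https, force the www. subdomain, keep the same ccTLD."""
    parts = urlparse(url)
    host = parts.netloc.lower()
    bare = host[4:] if host.startswith("www.") else host
    if bare not in CC_DOMAINS:
        raise ValueError(f"unexpected host: {host}")
    return urlunparse(
        ("https", "www." + bare, parts.path, parts.params, parts.query, parts.fragment)
    )
```

So `canonical_url("http://ourdomain.co.uk/page")` yields `https://www.ourdomain.co.uk/page`, matching the behaviour described above.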
The content is the same across all domains.
The thing is that a lot of info appears in Webmaster Tools under the non-canonical versions of the sites and is not showing under the canonical profiles. Which makes us feel like maybe we shouldn't delete those profiles?
Regarding the HTTP vs HTTPS issue, it sounds like you're saying we should consider only using HTTPS on pages that really need it; at the moment it's site-wide. That makes sense.
Thanks again, and I look forward to your thoughts on whether there is any benefit or harm in keeping or removing the non-canonical site profiles in WMT.
-
Hi Chris,
That was hard to follow. Let's start with the basics:
Do all those domains redirect to one single domain, or do they all serve the same content under whichever domain is accessed?
If you redirect all domains to a single domain, a 301 will do, and keeping the extra profiles in WMT is useless. If you serve the same content on all domains, you should use canonical tags pointing to a definitive version (the definitive version itself carries no canonical tag). You can still use WMT to track searches and links, but Google will serve one URL in its results: the one all other versions point to in the canonical tag.
Now, are you trying to serve all your content under SSL or standard HTTP? Serving both causes a duplicate content issue, so again you should 301 to the version you prefer, or use canonicals. There's no particular benefit or harm in using HTTPS for all your pages, though HTTPS can sometimes be slower because the browser has to negotiate a secure connection (I would go with regular HTTP if you are not requesting input from your visitors or showing private information).
Am I on the right track so far?