Any harm and why the differences - multiple versions of same site in WMT
-
In Google Webmaster Tools we have set up:
ourdomain.co.nz
ourdomain.co.uk
ourdomain.com
ourdomain.com.au
www.ourdomain.co.nz
www.ourdomain.co.uk
www.ourdomain.com
www.ourdomain.com.au
https://www.ourdomain.co.nz
https://www.ourdomain.co.uk
https://www.ourdomain.com
https://www.ourdomain.com.au
As you can imagine, this gets confusing and hard to manage. We are wondering whether having all these domains set up in WMT could be doing any damage. Here http://support.google.com/webmasters/bin/answer.py?hl=en&answer=44231 it says:
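Those 12 profiles are just the cross-product of three scheme/host variants and four ccTLDs; a quick sketch (the helper name is hypothetical) shows how the list grows:

```python
from itertools import product

def wmt_profiles(tlds, schemes=("http://", "http://www.", "https://www.")):
    """Enumerate every site profile variant for a set of ccTLDs.
    WMT treats a bare 'ourdomain.co.nz' entry as the http:// version."""
    return [scheme + "ourdomain." + tld for scheme, tld in product(schemes, tlds)]

profiles = wmt_profiles(["co.nz", "co.uk", "com", "com.au"])
print(len(profiles))  # 3 variants x 4 ccTLDs = 12 profiles
```

Each new protocol or subdomain variant multiplies the profile count, which is exactly why the account ballooned to 12 entries.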
"If you see a message that your site is not indexed, it may be because it is indexed under a different domain. For example, if you receive a message that http://example.com is not indexed, make sure that you've also added http://www.example.com to your account (or vice versa), and check the data for that site."
The above quote suggests that there is no harm in having several versions of a site set up in WMT, however the article then goes on to say:
"Once you tell us your preferred domain name, we use that information for all future crawls of your site and indexing refreshes. For instance, if you specify your preferred domain as http://www.example.com and we find a link to your site that is formatted as http://example.com, we follow that link as http://www.example.com instead."
This suggests that having multiple versions of the site loaded in WMT may cause Google to continue crawling multiple versions instead of only crawling the desired versions (https://www.ourdomain.com + .co.nz, .co.uk, .com.au).
However, even if Google does crawl any URLs on the non-HTTPS versions of the site (i.e. ourdomain.com or www.ourdomain.com), these 301 to https://www.ourdomain.com anyway... so shouldn't that mean that Google effectively cannot crawl any non-https://www versions (if it tries to, they redirect)? If that were the case, you'd expect the ourdomain.com and www.ourdomain.com versions to show no pages indexed in WMT; however, the opposite is true. The ourdomain.com and www.ourdomain.com versions have plenty of pages indexed, while the HTTPS versions have no data under the Index Status section of WMT, showing this message instead:
Data for https://www.ourdomain.com/ is not available. Please try a site with http:// protocol: http://www.ourdomain.com/.
This is a problem as it means that we can't delete these profiles from our WMT account.
Any thoughts on the above would be welcome.
As an aside, WMT does seem to be picking up on the 301 redirects from the ourdomain.com and www.ourdomain.com domains, at least where links are concerned: no ourdomain.com or www.ourdomain.com URLs are registering any links in WMT, suggesting that Google sees all links pointing to URLs on those domains as 301ing to https://www.ourdomain.com. That's good, but it also means we now can't delete the https://www.ourdomain.com profile either, so we are stuck with 12 profiles in WMT... what a pain.
Thanks for taking the time to read the above, quite complicated, sorry!! Would love any thoughts...
-
I agree with Federico that you probably don't need to have every page be secure. Perhaps you should consider making the http://www. version your canonical default instead?
-
It is fine to have multiple versions of a site in different countries. Some of the biggest brands in the world do this. There are "right" and "wrong" ways to go about it, but if I had a ccTLD for the UK and lots of UK customers I wouldn't send them to my US site, regardless of whether I had a /uk/ folder or not.
-
Chris,
Is the content exactly the same on all domains? Anything changes between .com, .co.uk, etc?
If so, you MUST use a canonical tag pointing to only ONE version (.com would be my guess) and rel="alternate" for the other domains. However, that doesn't make much sense if the content is identical. Why not just redirect all domains to .com (or whichever definitive version you choose)?
-
Hi Federico,
Thanks very much for your response. And yes, sorry, my initial question wasn't written very clearly!
ourdomain.com and www.ourdomain.com both 301 to https://www.ourdomain.com (which is also the canonical definitive version for the .com)
ourdomain.co.uk and www.ourdomain.co.uk both 301 to https://www.ourdomain.co.uk (which is also the canonical definitive version for the .co.uk)
and the same as above for .com.au domains, and .co.nz domains.
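The redirect scheme described above amounts to a pure URL-normalization rule: any scheme/host variant maps to the https://www. version of the same ccTLD. A minimal sketch of what each 301 computes (the function name is hypothetical, and it assumes only the domains mentioned in this thread):

```python
from urllib.parse import urlparse, urlunparse

def canonical_target(url):
    """Map any ourdomain.* variant to its https://www. canonical,
    preserving path and query, the way the site's 301s do."""
    parts = urlparse(url)
    host = parts.netloc
    if not host.startswith("www."):
        host = "www." + host
    return urlunparse(("https", host, parts.path, parts.params, parts.query, ""))

print(canonical_target("http://ourdomain.co.uk/pricing"))
# -> https://www.ourdomain.co.uk/pricing
```

Because the mapping stays within the same ccTLD, each country site keeps its own canonical https://www. version rather than everything collapsing into the .com.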
The content is the same across all domains.
The thing is that a lot of info appears in Webmaster Tools under the non-canonical versions of the sites and is not showing under the canonical profiles, which makes us feel like maybe we shouldn't delete those profiles?
Regarding the HTTP vs HTTPS issues... sounds like what you are saying is that we should consider only using HTTPS on pages that really need it - at the moment it is site wide. That makes sense.
Thanks again and look forward to your thoughts as to whether there is any benefit or harm if we keep/remove the non canonical site profiles from WMT.
-
Hi Chris,
That was hard to follow. Let's start with the basics:
Do all those domains redirect to one single domain, or do they all serve the same content under whichever domain is accessed?
If you redirect all domains to a single domain, a 301 will do, and keeping the extra profiles in WMT is pointless. If you serve the same content on all domains, you should use canonical tags pointing to one definitive version (which itself carries no canonical tag). You can still use WMT to track searches and links, but Google will serve one URL in its results: the one all the other versions point to in their canonical tags.
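If the same-content-on-all-domains route were taken, each page's head would carry the canonical plus rel="alternate" links described above. A rough sketch of generating those tags, with .com as the definitive version; the TLD-to-hreflang mapping here is an assumption for illustration, not something stated in the thread:

```python
# Assumed hreflang code for each ccTLD (illustrative only).
HREFLANG = {"com": "en", "co.uk": "en-GB", "co.nz": "en-NZ", "com.au": "en-AU"}

def head_links(path, canonical_tld="com"):
    """Build the <link> tags a page at `path` would carry on every domain variant."""
    tags = ['<link rel="canonical" href="https://www.ourdomain.%s%s" />'
            % (canonical_tld, path)]
    for tld, lang in HREFLANG.items():
        tags.append('<link rel="alternate" hreflang="%s" href="https://www.ourdomain.%s%s" />'
                    % (lang, tld, path))
    return "\n".join(tags)

print(head_links("/contact"))
```

As noted, though, when the content is byte-for-byte identical across domains, a plain 301 to one definitive version is the simpler option.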
Now, are you trying to serve all your content over SSL or over standard HTTP? Serving both causes a duplicate content issue, and again, you should use a 301 to the version you prefer, or canonicals. There's no inherent benefit or harm in using HTTPS for all your pages, but HTTPS can sometimes be slower, as the browser has to negotiate a TLS handshake for each connection. (I would go with regular HTTP if you are not requesting input from your visitors or showing private information.)
Am I on the right track so far?