Any harm and why the differences - multiple versions of same site in WMT
-
In Google Webmaster Tools we have set up:
ourdomain.co.nz
ourdomain.co.uk
ourdomain.com
ourdomain.com.au
www.ourdomain.co.nz
www.ourdomain.co.uk
www.ourdomain.com
www.ourdomain.com.au
https://www.ourdomain.co.nz
https://www.ourdomain.co.uk
https://www.ourdomain.com
https://www.ourdomain.com.au
As you can imagine, this gets confusing and hard to manage. We are wondering whether having all these domains set up in WMT could be doing any damage? Here http://support.google.com/webmasters/bin/answer.py?hl=en&answer=44231 it says:
"If you see a message that your site is not indexed, it may be because it is indexed under a different domain. For example, if you receive a message that http://example.com is not indexed, make sure that you've also added http://www.example.com to your account (or vice versa), and check the data for that site."
The above quote suggests that there is no harm in having several versions of a site set up in WMT, however the article then goes on to say:
"Once you tell us your preferred domain name, we use that information for all future crawls of your site and indexing refreshes. For instance, if you specify your preferred domain as http://www.example.com and we find a link to your site that is formatted as http://example.com, we follow that link as http://www.example.com instead."
This suggests that having multiple versions of the site loaded in WMT may cause Google to continue crawling multiple versions instead of only crawling the desired versions (https://www.ourdomain.com + .co.nz, .co.uk, .com.au).
However, even if Google does crawl any URLs on the non-HTTPS versions of the site (i.e. ourdomain.com or www.ourdomain.com), these 301 to https://www.ourdomain.com anyway... so shouldn't that mean that Google effectively cannot crawl any non-https://www versions (if it tries, they redirect)? If that were the case, you'd expect the ourdomain.com and www.ourdomain.com versions to show no pages indexed in WMT, but the opposite is true. The ourdomain.com and www.ourdomain.com versions have plenty of pages indexed, while the HTTPS versions have no data under the Index Status section of WMT, showing this message instead:
Data for https://www.ourdomain.com/ is not available. Please try a site with http:// protocol: http://www.ourdomain.com/.
This is a problem as it means that we can't delete these profiles from our WMT account.
Any thoughts on the above would be welcome.
As an aside, it seems like WMT is picking up on the 301 redirects from the ourdomain.com and www.ourdomain.com domains, at least with links. No ourdomain.com or www.ourdomain.com URLs are registering any links in WMT, suggesting that Google is seeing all links pointing to URLs on these domains as 301ing to https://www.ourdomain.com ... which is good, but it also means we now can't delete the https://www.ourdomain.com profile either, so we are stuck with 12 profiles in WMT. What a pain...
Thanks for taking the time to read the above; it's quite complicated, sorry! Would love any thoughts...
-
I agree with Federico that you probably don't need to have every page be secure. Perhaps you should consider making the http://www. version your canonical default instead?
-
It is fine to have multiple versions of a site in different countries. Some of the biggest brands in the world do this. There are "right" and "wrong" ways to go about it, but if I had a ccTLD for the UK and lots of UK customers I wouldn't send them to my US site, regardless of whether I had a /uk/ folder or not.
-
Chris,
Is the content exactly the same on all domains? Anything changes between .com, .co.uk, etc?
If so, you MUST set the canonical to only ONE version (.com would be my guess) and use rel="alternate" for the other domains. However, that doesn't make much sense if the content is the same. Why not just redirect all domains to .com (or whatever definitive version you choose)?
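For illustration, a minimal sketch of what that markup might look like in the head of each page, assuming .com is chosen as the definitive version (the paths and hreflang values here are placeholders, not taken from the thread):

```html
<!-- On every regional version of the page, point the canonical at the .com version -->
<link rel="canonical" href="https://www.ourdomain.com/some-page/" />

<!-- Declare the regional alternates so Google can serve the right ccTLD per country -->
<link rel="alternate" hreflang="en-nz" href="https://www.ourdomain.co.nz/some-page/" />
<link rel="alternate" hreflang="en-gb" href="https://www.ourdomain.co.uk/some-page/" />
<link rel="alternate" hreflang="en-au" href="https://www.ourdomain.com.au/some-page/" />
<link rel="alternate" hreflang="en-us" href="https://www.ourdomain.com/some-page/" />
```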
-
Hi Federico,
Thanks very much for your response. And yes, sorry, my initial question wasn't written very clearly!
ourdomain.com and www.ourdomain.com both 301 to https://www.ourdomain.com (which is also the canonical definitive version for the .com)
ourdomain.co.uk and www.ourdomain.co.uk both 301 to https://www.ourdomain.co.uk (which is also the canonical definitive version for the .co.uk)
and the same as above for .com.au domains, and .co.nz domains.
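For what it's worth, the redirect scheme described above can be sketched as a small function (a hypothetical helper, just to make the rule explicit): any scheme/host variant of a given ccTLD is 301'd to the https://www. version of that same ccTLD.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_target(url: str) -> str:
    """Return the https://www. version of the same domain,
    mirroring the 301 scheme described above."""
    parts = urlsplit(url)
    host = parts.netloc
    if not host.startswith("www."):
        host = "www." + host
    # Force the https scheme, keep everything else unchanged
    return urlunsplit(("https", host, parts.path, parts.query, parts.fragment))

print(canonical_target("http://ourdomain.com/page"))
# -> https://www.ourdomain.com/page
```

Note the ccTLD is preserved: http://ourdomain.co.uk/ maps to https://www.ourdomain.co.uk/, never across to the .com.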
The content is the same across all domains.
The thing is that a lot of info appears in Webmaster Tools under the non-canonical versions of the sites, and is not showing under the canonical profiles in WMT. That makes us feel like maybe we shouldn't delete those profiles?
Regarding the HTTP vs HTTPS issues... sounds like what you are saying is that we should consider only using HTTPS on pages that really need it - at the moment it is site wide. That makes sense.
Thanks again, and I look forward to your thoughts on whether there is any benefit or harm in keeping/removing the non-canonical site profiles from WMT.
-
Hi Chris,
That was hard to follow. Let's start with the basics:
Do all those domains redirect to one single domain? or all those domains serve the same content but within the domain accessed?
If you redirect all domains to a single domain, a 301 will do, and having the other profiles in WMT is useless. If you serve the same content within all domains, you should use canonical tags pointing to a definitive version that itself has no canonical tag. Then again, you can still use WMT to track searches and links, but Google will serve one URL in its results, and that's the one all other versions point to in their canonical tags.
Now, are you trying to serve all your content under SSL or standard HTTP? Serving both causes a duplicate content issue, and again, you should use a 301 to the version you prefer, or canonicals. There's no benefit or harm in using HTTPS for all your pages, and sometimes HTTPS can be slower as the browser has to negotiate the certificate on each connection (I would go with regular HTTP if you are not requesting input from your visitors or showing private information).
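As a rough illustration of the 301 approach, here is an Apache .htaccess sketch (the domain is a placeholder and your server setup may differ; nginx and other servers have their own equivalents):

```apache
RewriteEngine On

# 301 any non-HTTPS or non-www request to the preferred https://www. version
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.ourdomain.com/$1 [R=301,L]
```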
Am I on the right track so far?