SEOmoz advice on only buying a domain if the .com version is available
-
RE: "In order to maximize the direct traffic to a domain, it is advised that webmasters should only buy a domain if the .com version is available. "
http://www.seomoz.org/learn-seo/domain
- I am working for a client who has had a domain live for around 5 years without a .com version (just .co.uk) - the domain is also hyphenated (which doesn't look like a great idea).
So, just wondering what research has been done into problems caused by the lack of a .com domain and by using a hyphenated domain. I'm trying to figure out whether it would be worth advising the client to switch to a new domain.
Your thoughts would be welcome
-
Thanks guys, much appreciated and very useful. I just found Rand's whiteboard on domains and found it quite useful too - see point 3: http://www.seomoz.org/blog/how-to-choose-the-right-domain-name
and this on hyphenated domains: http://www.highposition.com/blog/hyphenated-domains-google/ - but it's hard to know for sure. Might set up some of my own tests.
-
It really depends on many things! If your client's target market is within the UK, then I would recommend sticking with the .co.uk domain, as this way you will get better visibility in Google UK, plus visitors coming directly to the website will tend to trust it more.
In my opinion a single hyphen is fine if it fits the brand name, but if a hyphen-free domain is available and your client can afford a bit of a dip in traffic, then you can go for the new domain and 301 redirect the old domain to the new one. If you are not ready for the traffic and ranking dip, then it won't be a good idea!
Just my 2 cents!
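If you do go down the 301 route, the redirect is usually set up at the web-server level. A minimal sketch for Apache with mod_rewrite enabled (the domain names here are hypothetical placeholders, not the poster's actual sites):

```apache
# .htaccess on the OLD (hyphenated) domain: permanently redirect
# every URL to the same path on the new domain, so each old page
# passes its value to its direct equivalent rather than the homepage.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-hyphenated-domain\.co\.uk$ [NC]
RewriteRule ^(.*)$ https://www.newdomain.co.uk/$1 [R=301,L]
```

Path-preserving, page-to-page redirects like this are generally preferred over pointing everything at the new homepage, since each old URL's links and rankings are mapped to the matching new page.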
-
I agree with the guys above - it's less to do with SEO (if anything) and more about human error.
I used to help with a UK gaming website that had a lot of American visitors, and I noticed over the years that people would link to sitename.com instead of .co.uk. The .com was held by a domain shark, so we lost those backlinks.
But I think this is because an American audience is used to everything being .com.
Note: ultimately we bought the .com off the domain shark. I contacted him and originally he wanted thousands of dollars for the domain; I said $300 would be the most I would pay for it and said goodbye. Two months later he came back to me and sold it for $300. So if a domain shark holds the .com, play the long game with them.
-
I don't think it is much of an SEO problem as long as you are targeting business in the UK.
We have lots of high ranking .co.uk sites that are unaffected by the .com alternative. We have American suppliers of products who own the .com addresses and therefore we are not in direct competition.
The only time that it could be a problem is if you are directly competing with the .com version and they sell the same products and are targeting the same keywords as you.
Your potential customers may end up buying from the wrong company.
So in my opinion this is a branding issue rather than a Search Engine ranking issue.
-
It really depends which markets your client is trying to target. If their target market is UK only, then the .co.uk is perfectly fine. If the .com is available, then it would do no harm to purchase it to stop a competitor from getting hold of it and outranking you for the domain/brand name. You could simply redirect the .com to your .co.uk site.
Alternatively if the target is wider than the UK then it becomes increasingly difficult (though not impossible) to rank with a .co.uk in other countries. Hope this helps.