URL with hyphen or .co?
-
Given a choice, for your #1 keyword, would you pick a .com with one or two hyphens (chicago-real-estate.com) or a .co with the full name as the URL (chicagorealestate.co)?
Is there an accepted best practice regarding hyphenated URLs, and/or any decent results regarding the effectiveness of the .co?
Thank you in advance!
-
Hi Joe, this is for sure an awesome question with so many different points of view. The problem I see with .co is this one:
"Sites with country-coded top-level domains (such as .ie) are already associated with a geographic region, in this case Ireland. In this case, you won't be able to specify a geographic location."
Source: http://www.google.com/support/webmasters/bin/answer.py?answer=62399
So if I understand this correctly, if you want to target real estate clients in the Chicago area (which I love, and where I'll be for the U2 concert on July 4th) as well as across the US and worldwide, a .co domain is probably not the way to go here.
There has been a lot of talk about .co (the country-code TLD for Colombia), just as .ws is supposedly "WebSite" but is actually Western Samoa. So I would advise doing the obvious: look at your competitors. Does anyone have a .co domain and rank in Chicago? Are any of the top 100 results anything but .com? Try different keywords just to check whether there are any .co sites ranking in the real estate market.
Hope that helps!
-
Thanks for the feedback. That's the beauty of SEO: the only way to figure out what is most effective is to try multiple approaches and measure. Then, as soon as you get it and have a conclusion, the rules change...
-
At the risk of getting a bunch of thumbs down, between the two choices you have specifically asked about, I am going to throw in with the .co.
I think the issue is going to be how you promote the site, where you host it and where you get your links from.
If you host it in the USA and build a solid local link building campaign, no one is going to have any trouble figuring out where you should be relevant, least of all the major search engines.
The other concern would be when someone tries to type in your URL directly; there will be a tendency to automatically add an "m" to the end. But will that be any more of a problem than trying to get people to put a hyphen in the right place?
If people really find your site helpful, they'll just bookmark it in my experience.
-
Trust me when I say that I didn't think of the .co because of the Super Bowl ad.
I have heard mixed things about the .co but really haven't seen it in search results; then again, I don't see too many hyphenated URLs either. Maybe I will just add a word to the .com?
-
They had an ad in the Super Bowl; since then I've heard from 5 different clients asking whether they should buy the .co.
-
This link might help as well...
-
Completely disagree with you, Korgo. The average user doesn't even know that a .co TLD exists.
They have been available for a while. I spend a lot of time online through work and play and have never seen a site using one, so I'm not sure why you think they will take off if they haven't already, despite virtually every domain seller pushing them heavily last year.
-
I agree with James and would aim for one hyphen on the .com TLD. I did some unscientific user testing in this area: one hyphen was fine, but two or more was a turn-off for the user.
The same users expected a site to be .co.uk (I'm in the UK) or .com, and some were confused by the existence of different TLDs, wondering where the .co.uk or .com was and thinking the URL might not work without them.
-
I would pick hyphenated over anything but .com. I wouldn't even use .net; .org is the only other one I would consider, and only for a true non-profit organisation.
I have some hyphenated domains for ecommerce websites and have found no big problem with them personally. Of course, go with a non-hyphenated .com if you can!
-
I don't like hyphens, but I dislike foreign domain extensions even more (Colombia!), despite what they say about .co meaning "company". No, no. They pulled the same stunt with .me; it's not on.
It depends how competitive the niche is and how much you want it. I have a feeling exact-match domains (EMDs) won't be as strong in the coming months for long-tail searches like this, but for now I guess it will give you the edge. What I'm trying to say is: if you don't like the domain, don't go with it. Follow what you feel is most logical, as that is probably best for long-term SEO success. The EMD benefit is nowhere near the same (in my experience) with hyphenated or foreign domains. Don't get me wrong, they are a benefit, but a .com, .org or .net will always outrank them (for now).
So in response to your question: if I were you, I would buy them both (so competitors can't grab them later), make them both blogs, and get a nice brandable domain for your business, using the two blogs as feeders for it.
-
Thanks for your reply.
-
Thanks! I figured two hyphens wouldn't be a good idea but it's sure tempting.
-
According to the book The Art of SEO, my personal SEO bible, if you're not concerned with type-in traffic, branding or name recognition, you don't need to worry about this. However, to build a successful website long term you need to own the .com address, and if you then want to use the .co, the .com should redirect to it. According to the book, with the exception of the geeky, most people who use the web still assume that .com is all that's available, or that .com domains are the most trustworthy. So don't lose traffic by having another address!
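If you do go that route, it's easy to script a quick sanity check that the .com really does send a permanent redirect to the .co. A minimal sketch, using the hypothetical domains from this thread (swap in whatever you actually register):

```python
import requests  # pip install requests

# Hypothetical domains from this thread; not a recommendation of either.
COM_URL = "http://chicago-real-estate.com/"
CO_URL = "http://chicagorealestate.co/"

# Request the .com without following redirects so we can see exactly what it returns.
response = requests.get(COM_URL, allow_redirects=False, timeout=10)

is_permanent = response.status_code == 301
points_to_co = response.headers.get("Location", "").startswith(CO_URL.rstrip("/"))

print(f"Status: {response.status_code}, Location: {response.headers.get('Location')}")
print("Redirect looks right" if (is_permanent and points_to_co) else "Redirect needs fixing")
```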
-
Hi Joe,
I won't go after 2 hyphens; usually if the .com is not available, I go after a .net.
But in your case, I would go with a .co.
Related Questions
-
Inactive Products - Inactive URLs
Hi, On our website www.viatrading.com we have many products that may or may not be in stock, depending on availability. Until now, when a product was no longer available, we took the page down (and redirected it to its product category page), and only if the product became available again did we re-activate the URL; this might be days, months or even years later. To make this more SEO-friendly, we have now decided that while a product is unavailable, instead of deactivating/redirecting the page, we will leave it online and just add a message saying "This product is currently not available". If we do this, we will automatically re-activate about 500 product pages at once. 1. Just to make sure, is it harmful for SEO to keep activating/deactivating URLs this way? 2. Since most of these pages have been deindexed for a long time due to being redirected, have they lost all their SEO juice? 3. How can we best re-activate these 500 old pages; is it OK to activate them all at once? Thank you,
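For what the approach described above can look like in practice, here is a minimal sketch (a hypothetical Flask route and product data, not Viatrading's actual stack): the URL stays live and returns a normal 200 page with an availability notice, rather than being redirected on and off.

```python
from flask import Flask, render_template_string  # pip install flask

app = Flask(__name__)

# Hypothetical catalogue; in reality this would come from the shop's database.
PRODUCTS = {"pallet-of-electronics": {"name": "Pallet of Electronics", "in_stock": False}}

PAGE = """
<h1>{{ product.name }}</h1>
{% if not product.in_stock %}
  <p>This product is currently not available.</p>
{% endif %}
"""

@app.route("/product/<slug>")
def product_page(slug):
    product = PRODUCTS.get(slug)
    if product is None:
        return "Not found", 404
    # Out-of-stock items still return 200 with a notice, so the URL keeps
    # its history instead of being deactivated and redirected away.
    return render_template_string(PAGE, product=product)
```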
Intermediate & Advanced SEO | | viatrading11 -
Removing .html from URLs - impact of rankings?
Good evening Mozzers. A couple of questions which I hope you can help with. Here's the first: are we likely to see ranking changes if we remove the .html from the site's URLs? For example, website.com/category/sub-category.html would change to website.com/category/sub-category/. We will of course make sure we 301 redirect to the new, user-friendly URLs, but I am wondering if anyone has had previous experience of implementing this change and how it has affected rankings. By having the .html in the URLs, does this stop link juice flowing back to the root category? Second question: if one page can be loaded both with and without a forward slash "/" at the end, is this a duplicate page, or would Google consider it the same page? I would like to eliminate duplicate content issues if this is the case. For example: website.com/category/ and website.com/category. Duplicate content/pages?
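One practical way to de-risk a change like this is to script a check that every old .html URL returns a 301 pointing at its new slash-terminated equivalent. A rough sketch with the hypothetical URLs from the question (adjust the pairs to your own site):

```python
import requests  # pip install requests

# Hypothetical old -> new URL pairs for the migration described above.
REDIRECTS = {
    "https://website.com/category/sub-category.html":
        "https://website.com/category/sub-category/",
    "https://website.com/category": "https://website.com/category/",  # trailing-slash variant
}

for old_url, new_url in REDIRECTS.items():
    # Don't follow the redirect, so the status code and Location header stay visible.
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    ok = response.status_code == 301 and location.rstrip("/") == new_url.rstrip("/")
    print(f"{old_url} -> {response.status_code} {location} {'OK' if ok else 'CHECK'}")
```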
Intermediate & Advanced SEO | | Jseddon920 -
Sitemap generator which only includes canonical URLs
Does anyone know of a 3rd-party sitemap generator that will only include the canonical URLs? Creating a sitemap full of geo- and sorting-based parameters isn't the most ideal way to generate sitemaps. Please let me know if anyone has any ideas. Mind you, we have hundreds of thousands of indexed URLs, and this can't be done with a simple text editor.
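If no off-the-shelf tool fits, the filtering itself is simple to script: fetch each candidate URL, read its rel=canonical tag, and only write it to the sitemap when it is its own canonical. A rough sketch under that assumption (hypothetical URL list; requires requests and beautifulsoup4):

```python
import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

def self_canonical_urls(urls):
    """Yield only URLs whose rel=canonical tag points back at themselves."""
    for url in urls:
        html = requests.get(url, timeout=10).text
        link = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
        canonical = link.get("href", "").rstrip("/") if link else None
        # Keep pages that declare themselves canonical (or have no canonical tag);
        # parameterised geo/sort variants pointing elsewhere are skipped.
        if canonical is None or canonical == url.rstrip("/"):
            yield url

def write_sitemap(urls, path="sitemap.xml"):
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{entries}\n</urlset>\n")

# Hypothetical crawl list; in practice this would come from a crawler or CMS export.
write_sitemap(list(self_canonical_urls(["https://www.example.com/table-lamps"])))
```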
Intermediate & Advanced SEO | | recbrands0 -
Duplicate URLs ending with #!
Hi guys, Does anyone know why a site can contain duplicate URLs ending with a hash and exclamation mark, e.g. https://site.com.au/#! We are finding a lot of these URLs (as duplicates) and I was wondering what they are from a developer's standpoint? And do you think it's worth the time and effort adding a rel canonical tag or a 301 to these URLs, even though they're not getting indexed by Google? Cheers, Chris
Intermediate & Advanced SEO | | jayoliverwright0 -
Wildcarding Robots.txt for Particular Word in URL
Hey All, So I know that this isn't standard robots.txt syntax: I'm aware of how to block or wildcard certain folders, but I'm wondering whether it's possible to block all URLs with a certain word in them? We have a client that was hacked a year ago, and now they want us to help remove some of the pages that were being auto-generated with the word "viagra" in them. I saw this article, https://builtvisible.com/wildcards-in-robots-txt/, and tried implementing it, and it seems that I've been able to remove some of the URLs (although I can't confirm yet until I do a full pull of the SERPs on the domain). However, when I test certain URLs inside of WMT it still says that they are allowed, which makes me think that it's not working fully, or not working at all. In this case these are the lines I've added to the robots.txt: Disallow: /*&viagra and Disallow: /*&Viagra. I know I have the solution of individually requesting URLs to be removed from the index, but I want to see if anybody has ever had success with wildcarding URLs with a certain word in their robots.txt? The individual URL route could be very tedious. Thanks! Jon
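A quick way to sanity-check rules like these locally: Python's built-in urllib.robotparser doesn't understand Googlebot's * wildcards, but the matching is easy to approximate with a regex. A rough sketch with hypothetical URLs; note that /*&viagra only matches "viagra" when it is immediately preceded by an ampersand, which may be one reason WMT still reports some URLs as allowed.

```python
import re

# Rules from the question: in Googlebot's robots.txt dialect, '*' means
# "any sequence of characters" and a trailing '$' anchors the end of the URL.
DISALLOW_PATTERNS = ["/*&viagra", "/*&Viagra"]

def google_style_match(pattern: str, path: str) -> bool:
    """Rough approximation of Googlebot's wildcard matching for one Disallow rule."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

def is_blocked(path: str) -> bool:
    return any(google_style_match(p, path) for p in DISALLOW_PATTERNS)

# Hypothetical injected URLs: only the first contains "&viagra", so only it matches.
for path in ["/product?id=1&viagra-pills", "/viagra-pills.html", "/buy?viagra=1"]:
    print(path, "blocked" if is_blocked(path) else "allowed")
```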
Intermediate & Advanced SEO | | EvansHunt0 -
URL Injection Hack - What to do with spammy URLs that keep appearing in Google's index?
A website was hacked (URL injection), but the malicious code has been cleaned up and removed from all pages. However, whenever we run a site:domain.com search in Google, we keep finding more spammy URLs from the hack. They all lead to a 404 error page, since the hack was cleaned up in the code. We have been using the Google WMT Remove URLs tool to have these spammy URLs removed from Google's index, but new URLs keep appearing every day. We looked at the cache dates on these URLs; they vary, but none are recent and most are from a month ago, when the initial hack occurred. My question is: should we continue to check the index every day and keep submitting these URLs to be removed manually? Or, since they all lead to a 404 page, will Google eventually remove these spammy URLs from the index automatically? Thanks in advance, Moz community, for your feedback.
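If it helps, bulk-verifying that every injected URL really does return a 404 (rather than a stray 200 or a redirect) is easy to script. A minimal sketch with hypothetical URLs pulled from a site: query:

```python
import requests  # pip install requests

# Hypothetical injected URLs collected from a site:domain.com query.
SPAM_URLS = [
    "https://www.example.com/cheap-viagra-injected-page",
    "https://www.example.com/another-injected-url",
]

for url in SPAM_URLS:
    # A 404 (or 410) tells Google the page is gone; a 200 or a redirect would
    # keep the URL alive in the index and is worth investigating.
    status = requests.get(url, allow_redirects=False, timeout=10).status_code
    print(url, status, "gone" if status in (404, 410) else "check this one")
```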
Intermediate & Advanced SEO | | peteboyd0 -
Overly-Dynamic URL
Hi, We have over 5,000 pages showing under the Overly-Dynamic URL error. Our ecommerce site uses Ajax and we have several different filters, like Size, Color and Brand, so we have many different URLs like:
http://www.dellamoda.com/Designer-Pumps.html?sort=price&sort_direction=1&use_selected_filter=Y
http://www.dellamoda.com/Designer-Accessories.html?sort=title&use_selected_filter=Y&view=all
http://www.dellamoda.com/designer-handbags.html?use_selected_filter=Y&option=manufacturer%3A&page3
Could we use the robots.txt file to disallow these from showing as duplicate content? And do we need to put the whole URL in there, like Disallow: /*?sort=price&sort_direction=1&use_selected_filter=Y, or if not, how far into the URL should the disallow go? So far we have added the following to our robots.txt:
Disallow: /?sort=title
Disallow: /?use_selected_filter=Y
Disallow: /?sort=price
Disallow: /?clearall=Y
Just not sure if they are correct. Any help would be greatly appreciated. Thank you, Kami
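For what it's worth, a quick local check with Python's standard-library robots.txt parser (which handles plain prefix rules, though not Googlebot's * wildcards) suggests rules of the form Disallow: /?sort=title only match query strings directly on the homepage path, not on category pages like the ones above. A small sketch using the URLs from the question:

```python
import urllib.robotparser

# The Disallow rules exactly as written in the question (no wildcards).
robots_txt = """\
User-agent: *
Disallow: /?sort=title
Disallow: /?use_selected_filter=Y
Disallow: /?sort=price
Disallow: /?clearall=Y
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

urls = [
    "http://www.dellamoda.com/Designer-Pumps.html?sort=price&sort_direction=1&use_selected_filter=Y",
    "http://www.dellamoda.com/Designer-Accessories.html?sort=title&use_selected_filter=Y&view=all",
]

for url in urls:
    # Disallow rules are matched as path prefixes, so "/?sort=price" only blocks
    # "?sort=price" on the homepage itself, not on /Designer-Pumps.html.
    print(url, "blocked" if not parser.can_fetch("*", url) else "still allowed")
```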
Intermediate & Advanced SEO | | dellamoda2 -
How to fix issues regarding URL parameters?
Today, I was reading Google's help article on URL parameters: http://www.google.com/support/webmasters/bin/answer.py?answer=1235687 I came to know that Google gives value to URLs with parameters that change or determine the content of a page. There are too many pages on my website with similar values for Name, Price and Number of products, but I have restricted all such pages via robots.txt with the following syntax.
URLs:
http://www.vistastores.com/table-lamps?dir=asc&order=name
http://www.vistastores.com/table-lamps?dir=asc&order=price
http://www.vistastores.com/table-lamps?limit=100
Syntax in robots.txt:
Disallow: /?dir=
Disallow: /?p=
Disallow: /*?limit=
Now, I am confused. Which is the best solution to get maximum benefits in SEO?
Intermediate & Advanced SEO | | CommercePundit0