Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.
URL with hyphen, or .co?
-
Given a choice, for your #1 keyword, would you pick a .com with one or two hyphens (chicago-real-estate.com), or a .co with the full name as the URL (chicagorealestate.co)?
Is there an accepted best practice regarding hyphenated URLs, and/or any decent results regarding the effectiveness of the .co?
Thank you in advance!
-
Hi Joe, this is for sure an awesome question, with so many different points of view. The problem I see with .co is this one:
"Sites with country-coded top-level domains (such as .ie) are already associated with a geographic region, in this case Ireland. In this case, you won't be able to specify a geographic location."
Source: http://www.google.com/support/webmasters/bin/answer.py?answer=62399
So if I understand this correctly, and you want to target real estate clients in the Chicago area (which I love and will be there for the U2 concert on July 4th) as well as US/worldwide, a .co domain is probably not the way to go here.
There has been a lot of talk about .co (the TLD for Colombia), same as .ws, supposedly "WebSite" but actually Western Samoa. So I would advise doing the obvious: look at your competitors. Does anyone have a .co domain and rank in Chicago? Are any of the top 100 results anything but .com? Try different keywords just to check whether there are any .co sites ranking in the real estate market.
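If you want to make that competitor check systematic rather than eyeballing it, here's a toy sketch (purely illustrative, not a Moz tool; the URLs are made-up placeholders) that tallies the TLDs in a list of result URLs you've collected by hand:

```python
from collections import Counter
from urllib.parse import urlparse

def tld_counts(urls):
    """Tally the last label of each hostname (e.g. 'com', 'co')."""
    counts = Counter()
    for url in urls:
        host = urlparse(url).hostname or ""
        tld = host.rsplit(".", 1)[-1]
        if tld:
            counts[tld] += 1
    return counts

# Placeholder results you might have copied from a Chicago real estate search
results = [
    "http://chicago-real-estate-example.com/listings",
    "http://example-realty.com/",
    "http://chicagorealestate-example.co/",
]
print(tld_counts(results))  # e.g. Counter({'com': 2, 'co': 1})
```

If the tally for your market is overwhelmingly .com, that tells you something about what ranks (and what users expect) before you commit to a domain.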
Hope that helps!
-
Thanks for the feedback. That's the beauty of SEO: the only way to figure out what is most effective is to try multiple ways and measure. Then, as soon as you get it and have a conclusion, the rules change...
-
At the risk of getting a bunch of thumbs down, between the choices you specifically asked about, I am going to throw in with the .co.
I think the issue is going to be how you promote the site, where you host it and where you get your links from.
If you host it in the USA and build a solid local link building campaign, no one is going to have any trouble figuring out where you should be relevant, least of all the major search engines.
The other concern would be when someone tries to type in your URL directly: there will be a tendency to automatically add an "m" to the end. But will that be any more of a problem than trying to get people to put a hyphen in the right place?
If people really find your site helpful, they'll just bookmark it in my experience.
-
Trust me when I say that I didn't think of the .co because of the Super Bowl ad. I have heard mixed results on the .co, but I really haven't seen it in search results. Then again, I don't see too many hyphenated URLs either. Maybe I will just add a word to the .com?
-
They had an ad in the Super Bowl; I've heard from 5 different clients asking whether they should buy the .co after that.
-
This link might help as well...
-
Completely disagree with you, Korgo. The average user doesn't even know the .co TLD exists.
They have been available for a while. I spend a lot of time online through work and play and have never seen a site using one, so I'm not sure why you think they will take off if they haven't already, despite virtually every domain seller pushing them heavily last year.
-
I agree with James and would aim for one hyphen on the .com TLD. I did some unscientific user testing in this area: one hyphen was fine, but two or more were a turn-off for the user.
The same users expected a site to be .co.uk (I'm in the UK) or .com, and some were confused by the existence of different TLDs, wondering where the .co.uk or .com was and thinking the URL might not work without them.
-
I would pick hyphenated over anything but .com. I wouldn't even use .net; .org is the only one I would consider, and only for a true non-profit organisation.
I have some hyphenated domains for ecommerce websites, and have found no big problem with them personally. Of course, go with a non-hyphenated .com if you can!
-
I don't like hyphens, but I dislike foreign domain extensions even more (Colombia!), despite what they say about .co meaning "company". No, no. They pulled the same stunt with .me; it's not on.
It depends how competitive the niche is and how much you want it. I have a feeling EMDs (exact-match domains) won't be as strong in the coming months for long-tail searches like this, but for now I guess one will give you the edge. What I'm trying to say is: if you don't like the domain, don't go with it. Follow what you feel is most logical, as that is probably best for long-term SEO success. The EMD benefit is nowhere near the same (in my experience) with hyphenated or foreign domains. Don't get me wrong, they are a benefit, but a .com, .org or .net will always outrank them (for now).
So in response to your question: if I were you, I would buy them both (so competitors can't steal them later), make them both blogs, and get a nice brandable domain for your business, using the two blogs as feeders.
-
Thanks for your reply.
-
Thanks! I figured two hyphens wouldn't be a good idea but it's sure tempting.
-
According to the book The Art of SEO, my personal SEO bible, if you're not concerned with type-in traffic, branding, or name recognition, you don't need to worry about this. However, to build a successful website long term, you need to own the .com address, and if you then want to use the .co, the .com should redirect to it. According to the book, with the exception of the geeky, most people who use the web still assume that .com is all that's available, or that .com domains are the most trustworthy. So don't lose traffic by using another address without the .com!
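For what it's worth, the redirect the book describes is just a site-wide 301 from the .com to the .co. A minimal sketch in Apache mod_rewrite terms (assuming your host runs Apache; the domain names are placeholders to adjust to your own):

```apache
# .htaccess on the .com site -- send every request to the .co with a 301
# (placeholder domains; this is an illustrative sketch, not a tested config)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?chicagorealestate\.com$ [NC]
RewriteRule ^(.*)$ http://chicagorealestate.co/$1 [R=301,L]
```

The 301 status matters here: it tells search engines the move is permanent, so the .com's type-in traffic and any links it earns are passed along to the address you actually use.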
-
Hi Joe,
I won't go after two hyphens; usually, if the .com is not available, I go after a .net.
But in your case, I would go with the .co.