URL SEO
-
Hi All
I am completely new to SEO and I have a question about URLs which I would like advice on.
We are about to launch an immigration consultancy website which caters for several countries.
For the example below we are targeting the keyword "UK Visit Visa". Which URL would be better from an SEO perspective?
1. www.example.com/uk/visit-visa
2. www.example.com/uk/uk-visit-visa
Thanks,
Fuad
-
Hi Fuad
You're welcome, I'm glad to have been able to help.
All the best for your website and business,
Kind regards
Simon
-
Thanks to everyone who took the time to answer my question; I really appreciate your contributions.
Best wishes,
Fuad
-
Hi Simon,
Thanks for your comprehensive answer; really, really helpful. We are planning to use sub-folders for different countries, but we will now combine this with your suggestion of putting really important pages on the root.
I have already read 'The Beginner's Guide to SEO' and it was fantastic.
I am sure I will be bothering you with more SEO no-brainers in the near future.
Kind regards,
Fuad
-
John and Casey have given good advice. If you are trying to rank in different countries on the same website, you should read this page, 'New markup for multilingual content':
http://googlewebmastercentral.blogspot.com/2011/12/new-markup-for-multilingual-content.html
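For illustration, here is a rough Python sketch of the hreflang idea that page describes. The domain and the country/language pairs below are placeholder assumptions, not details from this thread:

```python
# Illustrative sketch: generating one <link rel="alternate"> hreflang tag
# per country version of a page. "example.com" and the country/language
# mapping are hypothetical placeholders.
COUNTRY_LANGS = {"uk": "en-gb", "us": "en-us", "au": "en-au"}

def hreflang_tags(path: str) -> list[str]:
    """Build the alternate-language link elements for a given page path."""
    return [
        f'<link rel="alternate" hreflang="{lang}" '
        f'href="https://www.example.com/{country}/{path}" />'
        for country, lang in sorted(COUNTRY_LANGS.items())
    ]

for tag in hreflang_tags("visit-visa"):
    print(tag)
```

Each country sub-folder gets its own annotation, which is how the markup tells Google the pages are country variants rather than duplicates.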
-
Hi Fuad
A good question.
The answer kind of depends upon your intended structure for your website going forwards. It's usually advisable to keep pages at as high a level as possible, as in having a fairly flat hierarchy.
If you are to have a sub-folder for each country, then from the above, my answer would be option 1, www.example.com/uk/visit-visa, because you won't need 'uk' to be there twice in the URL.
If these were deemed to be really important pages for your visitors, then you could even consider putting them directly on the root, so that these pages are at the highest possible level. Though this depends on how you are structuring the rest of your website; it could be that it's best for the user experience and navigation that all UK pages fall within a UK sub-folder.
Also consider whether or not you need the www., as that is a subdomain; you could go for **example.com/uk/visit-visa**
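As a rough illustration of both points (dropping the www subdomain and avoiding the repeated 'uk'), here is a hypothetical Python helper; the normalisation rules are my own sketch, not something prescribed in this thread:

```python
# Illustrative sketch: canonicalising a URL so the country code appears
# only once and the www subdomain is dropped. The rules here are
# hypothetical, for demonstration only. Requires Python 3.9+.
from urllib.parse import urlsplit, urlunsplit

def canonical_path(url: str) -> str:
    parts = urlsplit(url)
    host = parts.netloc.removeprefix("www.")  # example.com, not www.example.com
    segments = [s for s in parts.path.split("/") if s]
    # Collapse a redundant country repeat: /uk/uk-visit-visa -> /uk/visit-visa
    if len(segments) >= 2 and segments[1].startswith(segments[0] + "-"):
        segments[1] = segments[1].removeprefix(segments[0] + "-")
    return urlunsplit((parts.scheme, host, "/" + "/".join(segments), "", ""))

print(canonical_path("https://www.example.com/uk/uk-visit-visa"))
# -> https://example.com/uk/visit-visa
```

Whichever form you pick, the important thing is to choose one version and 301-redirect the others to it, so link equity isn't split across duplicates.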
As you are new to SEO, check out 'The Beginner's Guide to SEO' here on SEOmoz. Chapter 4 has some coverage of URL Structures.
I hope that helps,
Regards
Simon
-
You might want to be careful: duplicate content issues can arise if the same content appears on the pages for different countries.
I agree with Casey www.example.com/uk/visit-visa will be just fine.
-
Hi Fuad,
There is no reason to stuff 'uk' into your URL twice; search engines will see the /uk/ just fine. As this is only a small part of the algorithm anyway, either way is fine for SEO, so use the one that's better for the user.