URL SEO
-
Hi All
I am completely new to SEO and I have a question about URLs which I would like advice on.
We are about to launch an immigration consultancy website which caters for several countries.
For the example below we are targeting the keyword "UK Visit Visa". Which URL would be better from an SEO perspective?
1. www.example.com/uk/visit-visa
2. www.example.com/uk/uk-visit-visa
Thanks,
Fuad
-
Hi Fuad
You're welcome, I'm glad to have been able to help.
All the best for your website and business,
Kind regards
Simon
-
Thanks to everyone who took the time to answer my question; I really appreciate your contributions.
Best wishes,
Fuad
-
Hi Simon,
Thanks for your comprehensive answer, it was really, really helpful. We are planning to use sub-folders for different countries, but we will now combine this with your suggestion of putting really important pages on the root.
I have already read 'The Beginner's Guide to SEO' and it was fantastic.
I am sure I will be bothering you with more SEO no-brainers in the near future.
Kind regards,
Fuad
-
John and Casey have given good advice. If you are trying to rank in different countries on the same website, you should read this page: New markup for multilingual content
http://googlewebmastercentral.blogspot.com/2011/12/new-markup-for-multilingual-content.html
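For reference, the annotations described in that post are per-page rel="alternate" hreflang link elements. A minimal sketch of generating them for country sub-folders (the locale codes and URLs below are illustrative assumptions, not from this thread):

```python
# Build rel="alternate" hreflang <link> tags for the country/language
# versions of a page living in per-country sub-folders.

def hreflang_tags(base, page, locales):
    """Return one <link> element per locale, each pointing at that
    locale's sub-folder copy of the page."""
    tags = []
    for code, folder in locales.items():
        tags.append(
            f'<link rel="alternate" hreflang="{code}" '
            f'href="{base}/{folder}/{page}" />'
        )
    return tags

# Hypothetical country sub-folders for an immigration consultancy site.
locales = {"en-gb": "uk", "en-au": "au", "en-ca": "ca"}
for tag in hreflang_tags("http://www.example.com", "visit-visa", locales):
    print(tag)
```

Each country version of the page would carry the full set of tags (including one pointing at itself), which helps Google serve the right version per country rather than treating them as duplicates.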
-
Hi Fuad
A good question.
The answer kind of depends on your intended structure for the website going forward. It's usually advisable to keep pages at as high a level as possible, i.e. a fairly flat hierarchy.
If you are to have a sub-folder for each country, then from the above my answer would be option 1 (www.example.com/uk/visit-visa), because you won't need 'uk' to appear twice in the URL.
If these were deemed to be really important pages for your visitors, then you could even consider putting them directly on the root, so that these pages are at the highest possible level. Though this depends on how you are structuring the rest of your website; it could be that it's best for user experience and navigation that all UK pages fall within a UK sub-folder.
Also consider whether or not you need the www, as that is a subdomain; you could go for example.com/uk/visit-visa instead.
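As a rough illustration of that www/non-www choice (a sketch only, with example.com as a placeholder; in practice the actual 301 redirect would live in your server config rather than in application code):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url, keep_www=False):
    """Rewrite a URL onto one chosen canonical host, so that www and
    non-www requests end up resolving to a single version of each page."""
    parts = urlsplit(url)
    host = parts.netloc
    if keep_www and not host.startswith("www."):
        host = "www." + host
    elif not keep_www and host.startswith("www."):
        host = host[len("www."):]
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

print(canonical_url("http://www.example.com/uk/visit-visa"))
# → http://example.com/uk/visit-visa
```

Whichever host you pick, the point is simply to pick one and redirect the other to it, so link equity isn't split across two versions of every page.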
As you are new to SEO, check out 'The Beginner's Guide to SEO' here on SEOmoz. Chapter 4 has some coverage of URL Structures.
I hope that helps,
Regards
Simon
-
You might want to be careful: duplicate content issues will arise if the same content appears on the pages for different countries.
I agree with Casey: www.example.com/uk/visit-visa will be just fine.
-
Hi Fuad,
There is no reason to stuff 'uk' into your URL twice; search engines will see the /uk/ folder just fine. As this is only a small part of the algorithm anyway, either way is fine for SEO, so use the one that's better for the user.