URLs in Greek, Greeklish, or English? What is the best way to get a great ranking?
-
Hello all,
I am Greek and I have quite a strange question for you.
Greek characters are non-ASCII, so they are treated as special characters and need to be UTF-8 percent-encoded in URLs.
The question is about the URLs of Greek websites.
According to the advice on the Google Webmasters blog, we should never put raw Greek characters into the URL of a link. If we decide to have Greek characters, we should always use the percent-encoded version; otherwise we should just use Latin characters in the URL. Leaving Greek characters un-encoded can cause technical difficulties with some services, e.g. search engines or other URL-processing web pages.
To give you an example, let's look at:
A) http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1 which is the URL with the encoded Greek characters, and which shows up in the browser as
B) http://el.wikipedia.org/wiki/Ελβετία
The problem with A is that every time we copy the URL and paste it somewhere (in an email, on a social bookmarking site, on a social media site, etc.), it appears in form A, full of percent signs and strange characters. Such a link can sometimes cause broken-link issues, especially when we submit it to social networks and social bookmarking sites.
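To make the mapping concrete, here is a minimal Python sketch (using only the standard library's urllib.parse) that converts between the human-readable form B and the percent-encoded form A:

from urllib.parse import quote, unquote

# Percent-encode the Greek path segment (form B -> form A).
print(quote("Ελβετία"))    # %CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1

# Decode it back (form A -> form B); browsers do this when displaying the URL.
print(unquote("%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1"))    # Ελβετία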
On the other hand, Googlebot reads that URL, but I am wondering whether there is an advantage for the websites that keep the encoded URLs (in comparison to the sites that use Greeklish in their URLs)!
So the question is:
For SEO purposes, is it better to use Greek characters in the URLs (encoded like this one: http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1), or would it be better to use just Greeklish (for example, http://el.wikipedia.org/wiki/Elvetia)?
Thank you very much for your help!
Regards,
Lenia
-
Hi Tom,
I really appreciate your detailed answer.
You give a lot of information here. Thanks.
Taking into account the three main points you mention, I would go with Greeklish.
I think I should explain the term Greeklish. The official language of Greece is Modern Greek, which has its own alphabet, distinct from the Latin alphabet. When a Greek person writes Greek words using the Latin alphabet, that is called Greeklish (Greek + English). It is a quick and easy way to write text messages (SMS, iMessage) without paying attention to orthography.
A URL in Greeklish is understandable by people in Greece, so it can be considered a localised URL.
A URL in Greeklish can easily be shared with no particular technical implications.
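Incidentally, producing Greeklish slugs can be automated. Below is a rough Python sketch with a deliberately simplified transliteration table; the letter mappings are illustrative assumptions, not an official standard such as ELOT 743:

# Simplified Greek-to-Greeklish table; a production scheme would handle
# digraphs (e.g. "ου", "μπ"), uppercase, and accents more carefully.
GREEK_TO_LATIN = {
    "α": "a", "β": "v", "γ": "g", "δ": "d", "ε": "e", "ζ": "z",
    "η": "i", "θ": "th", "ι": "i", "κ": "k", "λ": "l", "μ": "m",
    "ν": "n", "ξ": "x", "ο": "o", "π": "p", "ρ": "r", "σ": "s",
    "ς": "s", "τ": "t", "υ": "y", "φ": "f", "χ": "ch", "ψ": "ps",
    "ω": "o", "ά": "a", "έ": "e", "ή": "i", "ί": "i", "ό": "o",
    "ύ": "y", "ώ": "o", "ϊ": "i", "ϋ": "y",
}

def greeklish_slug(text):
    # Lowercase, transliterate letter by letter, keep other alphanumerics,
    # and turn everything else into (collapsed) hyphens.
    out = []
    for ch in text.lower():
        if ch in GREEK_TO_LATIN:
            out.append(GREEK_TO_LATIN[ch])
        elif ch.isalnum():
            out.append(ch)
        else:
            out.append("-")
    return "-".join(filter(None, "".join(out).split("-")))

print(greeklish_slug("Ελβετία"))  # elvetia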
On the other hand, the Wikipedia articles use encoded Greek characters in the URL.
Well, I think that if the SEO benefit is not that big, I would go with the Greeklish solution.
I would be glad to have feedback from other experts on this subject.
Tom thank you very much!
Regards,
Lenia
-
This is a really good question, Lenia. Really, really good, in fact.
Let's break this down into a number of factors:
Having localised URLs (by that I mean URLs written in the country's language): from an SEO perspective, I do believe there is some correlation suggesting that localised URLs help pages rank higher, in the same way that having a keyword in the URL may help - a keyword in the country's language would, by the same logic, work the same way. However, the SEO benefit isn't big; I'd see it as only a little boost, so I wouldn't let the SEO side weigh too heavily on your decision.
Having localised URLs from a user perspective, though, is something I think is very useful - a bigger plus for users than for SEO purposes. Localised URLs show users that you're a part of that country, not just a larger corporation with an international presence but no real interest in the country. They also help users recognise and anticipate what the URLs along their journey will be. Also (I don't know how relevant this might be for you), localised URLs can definitely help with offline campaigns and promotion. Say you were running some newspaper or billboard ads and wanted to track how many people visited your site as a result; you might set up a custom URL or search term for the campaign, so your newspaper advert would read "Visit www.domain.com/customURLhere/" on it. This would look infinitely better written in the localised language (although I suppose you could always set up a 301 redirect for the URL, as sketched below).
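To illustrate that last point, here is a minimal sketch of such a 301 redirect using Flask; the paths /prosfores and /offers are hypothetical examples, not URLs from this discussion:

from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical localised campaign URL from the printed advert; it
# permanently redirects (301) to the canonical landing page.
@app.route("/prosfores")
def campaign():
    return redirect("/offers", code=301)

if __name__ == "__main__":
    app.run()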
Ultimately, however, I think your decision should largely be influenced by the technical implications. The SEO value would be slight, and not that significant whichever method you choose. I would go with whatever solution is easiest for you technically - it makes more sense to accommodate user and SEO factors within the easier technical setup than to take on technical headaches for a slight SEO gain.
Just my input on the issue; I'd love to hear more from others, as I think it's a great question that could do with the input of some of the talented folk here.