URLs in Greek, Greeklish, or English? What is the best way to get a great ranking?
-
Hello all,
I am Greek and I have quite a strange question for you.
Greek characters are generally treated as special characters and need to be UTF-8 percent-encoded in URLs.
The question is about the URLs of Greek websites.
According to the advice on the Google Webmasters blog, we should never put raw Greek characters into the URL of a link. If we decide to have Greek characters in the URL, we should always use the encoded version, or we should just use Latin characters instead. Leaving Greek characters un-encoded can cause technical difficulties with some services, e.g. search engines or other URL-processing web pages.
To give you an example let's look at
A) http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1, which is the URL with the encoded Greek characters, and which shows up in the browser as
B) http://el.wikipedia.org/wiki/Ελβετία
The problem with A is that every time we copy the URL and paste it somewhere (in an email, a social bookmarking site, a social media site, etc.), it appears in form A, full of strange characters and % signs. This link can sometimes cause broken-link issues, especially when we try to submit it to social networks and social bookmarking sites.
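To show how the two forms map to each other, here is a minimal Python sketch using only the standard library (it simply assumes plain UTF-8 percent-encoding, which is what Wikipedia uses):

# Convert between the readable form (B) and the percent-encoded form (A).
from urllib.parse import quote, unquote

readable = "http://el.wikipedia.org/wiki/Ελβετία"   # form B, as shown in the browser
encoded = quote(readable, safe=":/")                 # form A, Greek letters percent-encoded as UTF-8

print(encoded)           # http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1
print(unquote(encoded))  # decodes back to the readable form B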
On the other hand, Googlebot reads that URL, but I am wondering whether there is any advantage for the websites that keep the encoded URLs (in comparison to the sites that use Greeklish in the URLs).
So the question is:
For SEO purposes, is it better to use Greek characters in the URLs (encoded like this: http://el.wikipedia.org/wiki/%CE%95%CE%BB%CE%B2%CE%B5%CF%84%CE%AF%CE%B1), or would it be better to use just Greeklish (for example http://el.wikipedia.org/wiki/Elvetia)?
Thank you very much for your help!
Regards,
Lenia
-
Hi Tom,
I really appreciate your detailed answer.
You give a lot of information here. Thanks.
Taking into account the three main points you mention, I would go with Greeklish.
I think I should explain the term Greeklish. The official language of Greece is Modern Greek, which has its own alphabet, different from the Latin alphabet. When a Greek person writes Greek words using the Latin alphabet, that is called Greeklish (Greek + English). It is a quick and easy way to write text messages without paying attention to the orthography.
A URL in Greeklish is understandable to people in Greece, so it can be considered a localised URL.
A URL in Greeklish can easily be shared without any particular technical complications.
On the other hand, the Wikipedia articles use the encoded Greek characters in their URLs.
Well, since the SEO benefit is not that big, I think I would go with the Greeklish solution.
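If we do go that way, generating the Greeklish slugs could be as simple as a character-map transliteration. Here is a rough Python sketch, assuming a simplified, hypothetical mapping (there is no single official transliteration scheme):

# Transliterate a Greek phrase into a Latin-only, URL-friendly slug.
GREEK_TO_LATIN = {
    "α": "a", "β": "v", "γ": "g", "δ": "d", "ε": "e", "ζ": "z", "η": "i",
    "θ": "th", "ι": "i", "κ": "k", "λ": "l", "μ": "m", "ν": "n", "ξ": "x",
    "ο": "o", "π": "p", "ρ": "r", "σ": "s", "ς": "s", "τ": "t", "υ": "y",
    "φ": "f", "χ": "ch", "ψ": "ps", "ω": "o",
    "ά": "a", "έ": "e", "ή": "i", "ί": "i", "ό": "o", "ύ": "y", "ώ": "o",
}

def greeklish_slug(text):
    latin = "".join(GREEK_TO_LATIN.get(ch, ch) for ch in text.lower())
    return "-".join(latin.split())   # spaces become hyphens

print(greeklish_slug("Ελβετία"))  # elvetia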
I would be glad to hear the feedback of other experts on this subject.
Tom thank you very much!
Regards,
Lenia
-
This is a really good question, Lenia. Really, really good, in fact.
Let's break this down into a number of factors:
Having localised URLs (by that I mean URLs written in the country's language): from an SEO perspective, I do believe there is some correlation suggesting that localised URLs help pages rank higher, in the same way that having a keyword in the URL may help; having that keyword in the country's language would, by extension, work the same way. However, the SEO benefit of doing so isn't that big. I'd see it as only a little boost, so I wouldn't let the SEO side weigh too heavily on your decision.
Now, having localised URLs from a user perspective is something I think is very useful. I'd see it as a bigger plus for users than for SEO purposes. Having localised URLs shows the user that you're a part of that country, not just a larger corporation with an international presence but no real interest in the country. It also helps users recognise and anticipate what the URLs will be along their journey through the site. Also (I don't know how relevant this might be for you), localised URLs can definitely help with offline campaigns and promotion. Say you were running some newspaper or billboard ads and wanted to track how many people then visited your site as a result; you might set up a custom URL or search term for the campaign. So your newspaper advert would have "Visit www.domain.com/customURLhere/" on it. This would look infinitely better if it were written in the localised language (although I suppose you could always set up a 301 redirect for the URL).
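To illustrate that last point, the localised campaign URL can simply 301 to the real landing page. A minimal sketch, assuming a small Flask app; the paths here are made up for illustration:

# Permanently redirect a localised (Greeklish) campaign URL to the canonical landing page.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/prosfora-efimeridas/")   # hypothetical campaign URL printed in the newspaper ad
def newspaper_campaign():
    # 301 so the redirect is treated as permanent and any value the URL earns is passed on.
    return redirect("/offers/newspaper/", code=301)

if __name__ == "__main__":
    app.run()

Whatever the stack, the key detail is making the redirect permanent (301) and keeping some way to count hits on the campaign URL so the ad can still be measured.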
Ultimately, however, I think your decision should largely be influenced by the technical implications. The SEO value would be slight whichever method you choose. I would go with whatever solution is easiest for you technically; it sounds like it would be easier to accommodate the user and SEO factors than to wrestle with technical complications for a slight SEO gain.
Just my input on the issue, so I'd love to hear more from others, as I think it's a great question that could do with the input of some of the talented folk here.
Related Questions
-
Best way to link to multiple location pages
I am a magician and have multiple location pages, one for each county I cover. I currently have them linked from the menu under locations/<county> and also in the footer. However, I have heard that a link from within the page content is much stronger, so I am experimenting with removing the menu and footer links and just linking to these pages from within the content. It's not really a navigation item, and most people come in through search to the right page. Am I diluting the link by having it in the menu, the page, and the footer? I read a long time ago that Google only considers the first link to a page and ignores the rest; is that the case? Thanks, Roger https://www.rogerlapin.co.uk/
Technical SEO | Rogerperk
-
Do URLs with canonical tags get indexed by Google?
Hi, we re-branded and launched a new website in February 2016. In June we saw a steep drop in the number of URLs indexed, and there have continued to be smaller dips since. We started an account with Moz and found several thousand high priority crawl errors for duplicate pages and have since fixed those with canonical tags. However, we are still seeing the number of URLs indexed drop. Do URLs with canonical tags get indexed by Google? I can't seem to find a definitive answer on this. A good portion of our URLs have canonical tags because they are just events with different dates, but otherwise the content of the page is the same.
Technical SEO | zasite
-
Best way to noindex long dynamic URLs?
I just got a Moz crawl back and see lots of errors for overly dynamic URLs. The site is a villa rental site that gives users the ability to search by bedroom, amenities, price, etc., so I'm wondering what the best way is to keep these types of dynamically generated pages, with URLs like /property-search-page/?location=any&status=any&type=any&bedrooms=9&bathrooms=any&min-price=any&max-price=any, from being indexed. Any assistance will be greatly appreciated :)
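For example, one common approach is to serve such pages with a noindex robots header rather than blocking them; a rough sketch, assuming a Flask-style app (the route name just mirrors the /property-search-page/ URL above):

# Serve the dynamically generated search pages with a noindex (but follow) robots header.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/property-search-page/")
def property_search():
    response = make_response("search results rendered here")
    # Ask crawlers not to index this faceted page, while still letting them follow
    # links from it to the individual property pages.
    response.headers["X-Robots-Tag"] = "noindex, follow"
    return response

if __name__ == "__main__":
    app.run()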
Technical SEO | wcbuckner
-
What is the best way to refresh a webpage on a news site, SEO-wise?
Hello all, we have a client with a sports website. In fact, it is a very big website with a huge number of news items per day. This is mostly the reason why it refreshes some of its news-list pages every 420 seconds. We currently use a meta refresh. I have read here and elsewhere that meta refreshes should be avoided. But we don't use it to send users to another page or to pass any kind of page authority / juice. Is a JavaScript refresh better in this case? Is there any other better way? What do you think and suggest? Thank you!
Technical SEO | pkontopoulos
-
What is the best way to deal with an event calendar?
I have an event calendar that has multiple repeating items into the future. They are classes that typically all have the same titles but will occasionally have different information. I don't know the best way to deal with them and am open to suggestions. Currently Moz Analytics is showing multiple errors (duplicate page titles, duplicate descriptions, and overly dynamic URLs). I'm assuming it's showing duplicate elements way into the future. I thought of having the calendar nofollowed entirely, but the content for the classes seems valuable. Thanks,
Technical SEO | categorycode
-
Old URL redirect to New URL
Alright, I did something dumb a year ago and I'm still paying for it. I changed my hyphenated URL to the non-hyphenated version when I redesigned my website. I say it was dumb because I lost most of my link juice even though I did 301 redirects (via the .htaccess file) for almost all of the pages I could find in Google's index. Here's my problem: my new site took a huge hit in traffic (down 60%) when I made the change, and even though I've done thousands of redirects, my old site is still showing up in the SERPs and sends much if not most of my traffic. I don't want to take the old site down for fear it will kill all of my traffic. What should I do? Is there a better method I should explore than 301 redirects? Could the other site be affecting my current rank since it's still there? (FYI, both sites are built on the WP platform.) Any help or ideas are greatly appreciated. Thank you! Joe
Technical SEO | kaje
-
What is the best method to block a sub-domain, e.g. staging.domain.com, from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9
-
How best to redirect URLs from expired classified ads?
We have a problem because our content is classified ads. Every ad expires after one or two months and then becomes inactive; we keep its page for one more month, the same as before, but we add a notice that the ad is inactive. After that we delete the ad and its page, but we need to redirect that URL to a search-results page containing similar ads, because we don't want to lose the traffic from those pages. What is the best way to redirect the ad URL? Our thinking was to redirect internally without a 301 redirect, because the .htaccess file would become very big after a while, and we are also thinking of trying canonicalization, because we don't want the engine to think we have too much duplicate content.
Technical SEO | Donaab