What is the proper URL length in SEO?
-
I learned that having 50 to 60 characters in a URL is fine and that shorter URLs are preferred by Google.
But I'm going to include keywords in my URLs, and I'm afraid that will increase the length. Is it going to hurt me slightly?
My competitors have an 8-character domain and keyword slugs of 13 characters,
while my site has a 15-character domain and keyword slugs of the same length.
Which one will Google prefer?
-
To me, a proper URL shouldn't be more than about 50 characters long; I stay at or just below roughly 60 characters on my website Timetocare.
-
In terms of SEO (Search Engine Optimization), while there's no strict rule for the optimal URL length, it's generally recommended to keep URLs concise, descriptive, and user-friendly. Here are some guidelines and considerations:
-
Short and Descriptive:
- Aim for short and descriptive URLs that give users and search engines a clear idea of the page's content. Avoid unnecessary parameters or overly complex structures.
-
Keywords:
- Include relevant keywords in the URL, especially in the domain and the path. This can help search engines understand the topic of the page.
-
Readability:
- Keep URLs readable by using hyphens to separate words instead of underscores. For example, use "example.com/important-page" instead of "example.com/important_page."
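To illustrate the convention, here is a minimal slug-generation sketch in Python; the stop-word list, the 60-character cap, and the function name are assumptions for the example, not rules from any particular tool:

```python
import re

# A few common English stop words to drop from slugs; illustrative, not exhaustive.
STOP_WORDS = {"a", "an", "and", "but", "for", "of", "or", "the", "to"}

def slugify(title: str, max_length: int = 60) -> str:
    """Turn a page title into a short, lowercase, hyphen-separated URL slug."""
    # Lowercase and keep only letters, digits, spaces, and existing hyphens.
    cleaned = re.sub(r"[^a-z0-9\s-]", "", title.lower())
    # Drop stop words and join the remaining words with hyphens.
    words = [w for w in cleaned.split() if w not in STOP_WORDS]
    slug = "-".join(words)
    # Enforce a rough length budget, then strip any hyphen left dangling by the cut.
    return slug[:max_length].rstrip("-")

print(slugify("How Many Questions Can You Ask ChatGPT?"))
# -> how-many-questions-can-you-ask-chatgpt
```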
-
Avoid Dynamic Parameters:
- If possible, avoid using dynamic parameters in URLs (e.g., "example.com/page?id=123"). Static, keyword-rich URLs are generally more SEO-friendly.
-
Consistency:
- Maintain consistency in your URL structure across your website. This helps both users and search engines navigate and understand the organization of your content.
-
301 Redirects for Changes:
- If you need to change a URL, use 301 redirects to inform search engines that the content has permanently moved. This preserves SEO value.
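How the redirect is wired up depends on your stack. As a hedged sketch, assuming a Flask application and hypothetical old/new paths, a permanent redirect might look like this:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Both paths below are hypothetical examples used only for illustration.
@app.route("/important_page")
def old_important_page():
    # A 301 marks the move as permanent, so search engines transfer the old URL's value.
    return redirect("/important-page", code=301)

@app.route("/important-page")
def important_page():
    return "The page now lives at its hyphenated URL."
```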
-
Limit Length:
- While there's no strict character limit for URLs, it's advisable to keep them reasonably short, ideally under 100 characters. Shorter URLs are easier to remember and share.
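One way to keep an eye on length is to audit your own sitemap. This is a rough sketch; the sitemap URL and the 100-character threshold are assumptions you would adjust for your site:

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical sitemap location
MAX_LENGTH = 100                                  # character budget from the guideline above

def long_urls(sitemap_url: str, max_length: int):
    """Yield sitemap URLs that exceed the character budget."""
    with urllib.request.urlopen(sitemap_url) as response:
        tree = ET.parse(response)
    # Sitemap <loc> elements live in the sitemaps.org namespace.
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    for loc in tree.iterfind(".//sm:loc", ns):
        url = (loc.text or "").strip()
        if len(url) > max_length:
            yield url

for url in long_urls(SITEMAP_URL, MAX_LENGTH):
    print(f"{len(url):4d}  {url}")
```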
-
HTTPS:
- Use HTTPS for secure connections. Search engines tend to favor secure websites, and HTTPS is considered a ranking factor.
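The HTTP-to-HTTPS redirect is usually configured at the web-server or CDN level, but as a sketch of the same idea at the application layer (again assuming Flask), it might look like this:

```python
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_https():
    # Send any plain-HTTP request to its HTTPS equivalent with a permanent redirect.
    if not request.is_secure:
        secure_url = request.url.replace("http://", "https://", 1)
        return redirect(secure_url, code=301)
```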
Remember that the primary goal is to create URLs that are user-friendly and provide a clear indication of the content. Search engines use URLs to understand the context and relevance of a page, so optimizing them for readability and keywords can positively impact your SEO efforts. Additionally, creating a logical URL structure helps users navigate your site more easily.
-
-
The ideal URL length for SEO is typically under 60 characters. Shorter URLs are easier for search engines to crawl and for users to read and remember. Keeping URLs concise, relevant to the page content, and including keywords can positively impact SEO performance. Avoid lengthy URLs with unnecessary parameters or characters.
-
An appropriate page URL is about 75 characters long, and the maximum URL length most browsers support in the address bar is roughly 2,048 characters.
-
In SEO, there is no strict rule for an ideal URL length, but it's generally recommended to keep URLs concise, relevant, and user-friendly. Here are some guidelines to consider:
Short and Descriptive: Aim for short and descriptive URLs that give users and search engines a clear idea of the page's content. A concise URL is easier to remember and share.
Include Keywords: If possible, include relevant keywords in your URL. This can contribute to the page's SEO, but don't over-optimize by stuffing too many keywords.
Avoid Dynamic Parameters: Clean, static URLs are preferred over URLs with dynamic parameters (e.g., example.com/page?id=123&session=456). Search engines prefer URLs that are easily readable and don't contain unnecessary parameters.
Hyphens Between Words: Use hyphens (-) rather than underscores (_) to separate words in the URL. Search engines treat hyphens as spaces, while underscores are not recognized as word separators.
Avoid Stop Words: Consider omitting unnecessary stop words (e.g., "and," "or," "but") from your URLs. Focus on the main keywords that represent the page's content.
Be Consistent: Maintain a consistent URL structure across your site. Consistency makes it easier for both users and search engines to navigate and understand your website.
HTTPS: Ensure that your URLs use the secure HTTPS protocol. Google tends to favor secure websites, and HTTPS is a ranking factor.
While there's no strict character limit for URLs, it's generally advisable to keep them under 255 characters. This is because longer URLs may be truncated in search results, making them less user-friendly.
Remember that user experience is crucial, so prioritize creating URLs that are easy to read and understand. Additionally, focus on providing valuable content on your pages, as content quality is a key factor in SEO.
-
The proper URL length for SEO is generally recommended to be under 256 characters. It's important to keep your URLs concise and descriptive. Short and relevant URLs tend to perform better in search engine rankings and are easier for users to remember and share. Including relevant keywords in your URL can also help search engines and users understand the content of the page. Additionally, using hyphens to separate words in the URL is preferred over underscores or other special characters. Overall, aim for clear, concise, and keyword-rich URLs that accurately represent the content of your web pages.
-
50-60 characters in a URL is good enough and will not be considered spam by Google. The more important aspect is how you use the keywords: whether they are placed naturally or simply stuffed in. Be descriptive for the search engine, make the URL scannable, and break it down logically.
Aim for a low character count, because a shorter URL is less likely to be mistaken for spam.
-
An excessively long URL can be read as spammy, so you have to pay attention to the length.
-
The optimal length is 50-60 characters. If you're using a plugin like Rank Math or Yoast, it will also tell you what is optimal.
I'm following Rank Math's guide to URL length, and it's working well for my courier tracking website.
-
It is crucial to consistently conduct competitor analysis, paying close attention to the length of their URLs.
A common mistake that many people make is incorporating long-tail keywords into their URLs, which is not considered a good SEO practice.
Personally, I strive to limit my site article URLs to a maximum of 4-5 words. In certain cases where the search volume is relatively low, I may include additional words, but the general best practice is to keep the URL as short as possible.
Once again, I cannot emphasize enough the importance of competitor analysis in shaping your approach.
-
When it comes to URL length for SEO, there is no definitive answer. However, it's generally recommended to keep URLs concise, include relevant keywords, avoid excessive parameters and unnecessary characters, use hyphens as word separators, maintain consistency, and prioritize usability and readability. Remember, URL length is just one factor among many that affect SEO.
-
Somewhere up to 75 characters max, from what I read. Longer than that could cause some difficulties in ranking.
-
While the length of a URL can have some impact on search engine optimization (SEO), it is generally recommended to keep URLs concise and relevant to the content of the page. URLs with fewer words tend to be easier for users to read and remember, and they also tend to be more user-friendly for sharing and linking purposes.
The impact of URL length on SEO is relatively small compared to other factors such as the quality and relevance of the content on your website, backlinks, site speed, user experience, and overall website optimization.
In terms of your specific scenario, where your competitors have 8-character domain URLs and keywords with a length of 13, and your site has a 15-character domain URL and keywords of the same length, it's unlikely that the slight difference in URL length alone would significantly impact your search engine rankings.
Google's algorithms consider numerous factors when determining the relevance and ranking of a website, and URL length is just one of them. It's important to focus on creating high-quality content, using relevant keywords, and ensuring a positive user experience on your website. These factors are likely to have a more substantial impact on your search engine rankings than the length of your URL.
-
I have tried to keep URL lengths in check on my site, but in some instances long-tail keywords get in the way, and then you have no option but a longer-than-ideal URL.
-
But sometimes a long-tail keyword makes it difficult to keep the URL short, for example "how many questions can you ask chatgpt".
-
When it comes to URL length in SEO (Search Engine Optimization), there is no strict rule for the maximum or ideal length. However, it's generally recommended to keep URLs concise, descriptive, and user-friendly. Here are some guidelines to consider:
Descriptive and Relevant: A URL should give users and search engines a clear idea of what the page is about. Including relevant keywords or a brief description of the content can help improve understanding and visibility.
Concise and Readable: Aim for shorter URLs that are easy to read and remember. Long, complex URLs can be confusing and difficult to share. Use hyphens (-) to separate words within the URL, and avoid unnecessary characters, numbers, or special symbols.
Avoid Keyword Stuffing: While it's important to include relevant keywords, avoid keyword stuffing in URLs. Maintain a natural flow and readability, and prioritize clarity over excessive keyword usage.
Maintain Consistency: Consistency in URL structure can benefit both users and search engines. Use a consistent format throughout your website, which can include using lowercase letters, eliminating unnecessary parameters, and organizing URLs in a logical and hierarchical manner.
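As a sketch of that kind of consistency, a URL could be normalized before it is published or compared; the tracking-parameter list and the lowercasing rules here are illustrative assumptions, not requirements:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that don't change the page content; an illustrative, not exhaustive, list.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"}

def normalize_url(url: str) -> str:
    """Return a consistent form of a URL: lowercase host/path, no tracking params, no fragment."""
    parts = urlsplit(url)
    # Keep only query parameters that actually affect the content.
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    # Lowercasing the path assumes your server treats paths case-insensitively
    # or always serves lowercase paths; skip that step if it doesn't.
    path = parts.path.lower().rstrip("/") or "/"
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(), path, urlencode(query), ""))

print(normalize_url("HTTPS://Example.com/Important-Page/?utm_source=newsletter"))
# -> https://example.com/important-page
```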
-
@calvinkj Always analyze your competitors and look at the length of their URLs.
Many people make the mistake of adding a long-tail keyword to the URL, which isn't good SEO practice.
I always use a maximum of 4-5 words in the URL for my site's articles; for some articles where search volume is relatively low I add more words, but the best practice is to keep the URL as short as possible.
Again, competitor analysis is the key.
-
Some experience with words and hyphens in domain names:
I used a hyphenated site, www.octopus-energy-referral.co.uk, and it is not doing too well compared to my non-hyphenated name. Similarly, I have a site, www.octopuscode.co.uk, which is doing really well compared to the hyphenated name because it is short and has fewer keywords.
I know this is not a forensic comparison, but I believe a short, non-hyphenated name with fewer keywords is best if you have a choice.
-
If you haven't read this yet, please do (best practices for URLs).
So, it's a combination of things. As Devi Allen said, less is more. You want to use (and not over-use) descriptive words, separated by hyphens, "keeping URLs as simple, relevant, compelling, and accurate as possible". "To correctly render in all browsers, URLs must be shorter than 2,083 characters."
Which is better, your URL or your competitors? They sound pretty close based on your description but what matters is the actual words used in the URL, the site structure represented by that construct, whether the words truly represent what a visitor will find on the page, and whether the page content will provide visitors with the information they came looking for. URL length is but one of many factors that go into determining whether you or your competitor will rank higher.
-
You already answered it yourself: fewer words is better.