What is the proper URL length in SEO?
-
I learned that having 50 to 60 characters in a URL is fine and that fewer is preferable to Google.
But I'd like to know: since I'm going to include keywords in my URLs, I'm afraid that will increase the length. Is that going to hurt me slightly?
My competitors have an 8-character domain in their URLs with keyword slugs 13 characters long, while my site has a 15-character domain with keyword slugs of the same length.
Which one will Google prefer?
-
To me, a proper URL shouldn't be more than 50 characters in length; I stay a bit below that, or at most around 60, on my website Timetocare.
-
In terms of SEO (Search Engine Optimization), while there's no strict rule for the optimal URL length, it's generally recommended to keep URLs concise, descriptive, and user-friendly. Here are some guidelines and considerations:
-
Short and Descriptive:
- Aim for short and descriptive URLs that give users and search engines a clear idea of the page's content. Avoid unnecessary parameters or overly complex structures.
-
Keywords:
- Include relevant keywords in the URL, especially in the domain and the path. This can help search engines understand the topic of the page.
-
Readability:
- Keep URLs readable by using hyphens to separate words instead of underscores. For example, use "example.com/important-page" instead of "example.com/important_page."
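- As a rough illustration of that convention (this helper and its word cap are an assumption for the example, not something from the thread), a title-to-slug conversion might look like:

```python
# Hypothetical sketch: turn a page title into a short, lowercase,
# hyphen-separated slug, as recommended above.
import re

def slugify(title: str, max_words: int = 5) -> str:
    words = re.findall(r"[a-z0-9]+", title.lower())  # keep only letters and digits
    return "-".join(words[:max_words])                # hyphens, never underscores

print(slugify("What Is the Proper URL Length?"))  # -> "what-is-the-proper-url"
```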
-
Avoid Dynamic Parameters:
- If possible, avoid using dynamic parameters in URLs (e.g., "example.com/page?id=123"). Static, keyword-rich URLs are generally more SEO-friendly.
-
Consistency:
- Maintain consistency in your URL structure across your website. This helps both users and search engines navigate and understand the organization of your content.
-
301 Redirects for Changes:
- If you need to change a URL, use 301 redirects to inform search engines that the content has permanently moved. This preserves SEO value.
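- As a minimal sketch of how the dynamic-parameter and redirect points fit together (Flask is just one convenient choice here, and the paths and ID mapping are hypothetical), an old dynamic URL can be permanently redirected to a static, keyword-rich one:

```python
# Hypothetical sketch: 301-redirect a legacy dynamic URL (/page?id=123)
# to a static, hyphenated path so the old address passes its value on.
from flask import Flask, abort, redirect, request

app = Flask(__name__)

# Illustrative mapping of legacy numeric IDs to keyword slugs.
LEGACY_IDS = {"123": "/important-page"}

@app.route("/page")
def legacy_page():
    new_path = LEGACY_IDS.get(request.args.get("id", ""))
    if new_path is None:
        abort(404)
    return redirect(new_path, code=301)  # permanent redirect

@app.route("/important-page")
def important_page():
    return "The actual page content"
```

  In practice the same redirect is often handled in server configuration (.htaccess, nginx) rather than application code; the point is simply that the 301 status tells search engines the move is permanent.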
-
Limit Length:
- While there's no strict character limit for URLs, it's advisable to keep them reasonably short, ideally under 100 characters. Shorter URLs are easier to remember and share.
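- As a quick way to sanity-check that guidance (the 100-character threshold below is the soft guideline from this answer, not a hard limit), you could scan your URLs like this:

```python
# Hedged helper: flag URLs that exceed a soft length limit or carry
# query parameters, per the guidelines above.
from urllib.parse import urlparse

def check_url(url: str, soft_limit: int = 100) -> None:
    if len(url) > soft_limit:
        print(f"{url} is {len(url)} characters; consider shortening it")
    if urlparse(url).query:
        print(f"{url} uses query parameters; a static path may read better")

check_url("https://example.com/important-page")
check_url("https://example.com/page?id=123")
```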
-
HTTPS:
- Use HTTPS for secure connections. Search engines tend to favor secure websites, and HTTPS is considered a ranking factor.
Remember that the primary goal is to create URLs that are user-friendly and provide a clear indication of the content. Search engines use URLs to understand the context and relevance of a page, so optimizing them for readability and keywords can positively impact your SEO efforts. Additionally, creating a logical URL structure helps users navigate your site more easily.
-
-
The ideal URL length for SEO is typically under 60 characters. Shorter URLs are easier for search engines to crawl and for users to read and remember. Keeping URLs concise, relevant to the page content, and including keywords can positively impact SEO performance. Avoid lengthy URLs with unnecessary parameters or characters.
-
An appropriate page URL is about 75 characters long, and the maximum URL length the address bar accepts is 2,049 characters.
-
In SEO, there is no strict rule for an ideal URL length, but it's generally recommended to keep URLs concise, relevant, and user-friendly. Here are some guidelines to consider:
Short and Descriptive: Aim for short and descriptive URLs that give users and search engines a clear idea of the page's content. A concise URL is easier to remember and share.
Include Keywords: If possible, include relevant keywords in your URL. This can contribute to the page's SEO, but don't over-optimize by stuffing too many keywords.
Avoid Dynamic Parameters: Clean, static URLs are preferred over URLs with dynamic parameters (e.g., "example.com/page?id=123"). Search engines prefer URLs that are easily readable and don't contain unnecessary parameters.
Hyphens Between Words: Use hyphens (-) rather than underscores (_) to separate words in the URL. Search engines treat hyphens as spaces, but underscores are not recognized as separators.
Avoid Stop Words: Consider omitting unnecessary stop words (e.g., "and," "or," "but") from your URLs and focus on the main keywords that represent the page's content; a short sketch of this follows at the end of this answer.
Be Consistent: Maintain a consistent URL structure across your site. Consistency makes it easier for both users and search engines to navigate and understand your website.
HTTPS: Ensure that your URLs use the secure HTTPS protocol. Google tends to favor secure websites, and HTTPS is a ranking factor.
While there's no strict character limit for URLs, it's generally advisable to keep them under 255 characters. This is because longer URLs may be truncated in search results, making them less user-friendly.
Remember that user experience is crucial, so prioritize creating URLs that are easy to read and understand. Additionally, focus on providing valuable content on your pages, as content quality is a key factor in SEO.
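As a small follow-up to the stop-word point above (the stop-word list here is illustrative only, not a canonical set), a slug builder that drops them might look like:

```python
# Hypothetical sketch: build a slug while skipping common stop words.
import re

STOP_WORDS = {"a", "an", "and", "but", "is", "of", "or", "the", "to"}

def slug_without_stop_words(title: str) -> str:
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(w for w in words if w not in STOP_WORDS)

print(slug_without_stop_words("What Is the Proper URL Length?"))
# -> "what-proper-url-length"
```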
-
The proper URL length for SEO is generally recommended to be under 256 characters. It's important to keep your URLs concise and descriptive. Short and relevant URLs tend to perform better in search engine rankings and are easier for users to remember and share. Including relevant keywords in your URL can also help search engines and users understand the content of the page. Additionally, using hyphens to separate words in the URL is preferred over underscores or other special characters. Overall, aim for clear, concise, and keyword-rich URLs that accurately represent the content of your web pages.
-
50-60 characters in a URL is good enough and will not be considered spam by Google. The vital aspect, however, is how you use the keywords and whether they are elegantly placed or stuffed. Be descriptive for the search engine, make the URL scannable, and break it down.
Aim for a low character count, because shorter URLs are less likely to be mistaken for spam.
-
Excessive length can be detected as spam, so you have to pay attention to it.
-
The optimal length is 50-60 characters. If you're using a plugin like Rankmath or Yoast, they will also tell you which is optimum.
I'm following Rankmath's guide to URL length and it's working perfectly; I'm getting amazing results on my courier tracking website.
-
It is crucial to consistently conduct competitor analysis, paying close attention to the length of their URLs.
A common mistake that many people make is incorporating long-tail keywords into their URLs, which is not considered a good SEO practice.
Personally, I strive to limit my site article URLs to a maximum of 4-5 words. In certain cases where the search volume is relatively low, I may include additional words, but the general best practice is to keep the URL as short as possible.
Once again, I cannot emphasize enough the importance of competitor analysis in shaping your approach.
-
When it comes to URL length for SEO, there is no definitive answer. However, it's generally recommended to keep URLs concise, include relevant keywords, avoid excessive parameters and unnecessary characters, use hyphens as word separators, maintain consistency, and prioritize usability and readability. Remember, URL length is just one factor among many that affect SEO.
-
Somewhere up to 75 characters max, from what I read. Longer than that could cause some difficulties in ranking.
-
While the length of a URL can have some impact on search engine optimization (SEO), it is generally recommended to keep URLs concise and relevant to the content of the page. URLs with fewer words tend to be easier for users to read and remember, and they also tend to be more user-friendly for sharing and linking purposes.
The impact of URL length on SEO is relatively small compared to other factors such as the quality and relevance of the content on your website, backlinks, site speed, user experience, and overall website optimization.
In terms of your specific scenario, where your competitors have 8-character domain URLs and keywords with a length of 13, and your site has a 15-character domain URL and keywords of the same length, it's unlikely that the slight difference in URL length alone would significantly impact your search engine rankings.
Google's algorithms consider numerous factors when determining the relevance and ranking of a website, and URL length is just one of them. It's important to focus on creating high-quality content, using relevant keywords, and ensuring a positive user experience on your website. These factors are likely to have a more substantial impact on your search engine rankings than the length of your URL.
-
I have tried to use a proper URL length on my site, but in some instances long-tail keywords mess it up. Then you have no option but to accept a longer-than-ideal URL.
-
But sometimes the long-tail keyword makes it difficult to keep the URL short, for example "how many questions can you ask chatgpt".
-
When it comes to URL length in SEO (Search Engine Optimization), there is no strict rule for the maximum or ideal length. However, it's generally recommended to keep URLs concise, descriptive, and user-friendly. Here are some guidelines to consider:
Descriptive and Relevant: A URL should give users and search engines a clear idea of what the page is about. Including relevant keywords or a brief description of the content can help improve understanding and visibility.
Concise and Readable: Aim for shorter URLs that are easy to read and remember. Long, complex URLs can be confusing and difficult to share. Use hyphens (-) to separate words within the URL, and avoid unnecessary numbers or special characters.
Avoid Keyword Stuffing: While it's important to include relevant keywords, avoid keyword stuffing in URLs. Maintain a natural flow and readability, and prioritize clarity over excessive keyword usage.
Maintain Consistency: Consistency in URL structure can benefit both users and search engines. Use a consistent format throughout your website, which can include using lowercase letters, eliminating unnecessary parameters, and organizing URLs in a logical and hierarchical manner.
-
@calvinkj Always analyze your competitors and the length of their URLs.
Most people make the mistake of putting a long-tail keyword in the URL, which isn't good SEO practice.
I always use a maximum of 4-5 words in the URL for my site's articles; in some articles where search volume is relatively low I add more words, but the best practice is to keep the URL as short as possible.
Again, competitor analysis is the key.
-
Some experience with words and hyphens in domain names:
I used a hyphenated site, www.octopus-energy-referral.co.uk, and it is not doing too well compared to the non-hyphenated name. Similarly, I have a site, www.octopuscode.co.uk, that is doing really well compared to the hyphenated name because it is short and has fewer keywords.
I know this is not a forensic comparison, but I believe a non-hyphenated, short name with fewer keywords is best if you have a choice.
-
If you haven't read this yet, please do (best practices for URLs).
So, it's a combination of things. As Devi Allen said, less is more. You want to use (and not over-use) descriptive words, separated by hyphens, "keeping URLs as simple, relevant, compelling, and accurate as possible". "To correctly render in all browsers, URLs must be shorter than 2,083 characters."
Which is better, your URL or your competitors'? They sound pretty close based on your description, but what matters is the actual words used in the URL, the site structure represented by that construct, whether the words truly represent what a visitor will find on the page, and whether the page content will provide visitors with the information they came looking for. URL length is but one of many factors that go into determining whether you or your competitor will rank higher.
-
You already answered it yourself: fewer words and a shorter URL are better.