SEO URL best practices
-
We're revamping our site architecture and making several services pages that are accessible from one overarching service page. An example would be as follows:
Services
- Student Services
  - Essay editing
  - Essay revision
- Author Services
  - Book editing
  - Manuscript critique
We'll also be putting breadcrumbs throughout the site for easy navigation. However, is it imperative that we build the URLs that deep? For example, could we simply have www.site.com/essay-editing rather than www.site.com/services/students/essay-editing?
I prefer the simplicity of the former, but I feel the latter may be more "search robot friendly" and better for SEO.
Any advice on this is much appreciated.
-
Thanks donford, that's very helpful.
After thinking it over, I feel it's best to keep the URLs as simple as possible and use something like /s/essay-editing for them (the 's' representing services).
Thanks!
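For anyone implementing a flat scheme like this, here's a minimal sketch of the redirect side (the slugs and mapping are illustrative, not Kibin's actual pages). The key point is that any old, deeper URLs should 301 to the new short ones so existing links keep their value:

```python
# Hypothetical slugs for illustration; a real site would load this
# mapping from its CMS. Old deep URLs 301 to the new short /s/
# scheme so existing links and indexed pages keep their value.
OLD_TO_NEW = {
    "/services/students/essay-editing": "/s/essay-editing",
    "/services/students/essay-revision": "/s/essay-revision",
    "/services/authors/book-editing": "/s/book-editing",
}

def resolve(path: str):
    """Return (status_code, location) for an incoming request path."""
    if path in OLD_TO_NEW:
        return 301, OLD_TO_NEW[path]   # permanent redirect for crawlers
    if path.startswith("/s/"):
        return 200, path               # serve the short URL directly
    return 404, None
```

The same mapping could just as easily be expressed as web-server rewrite rules; the dict form simply makes the one-to-one old-to-new relationship explicit.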
-
Hi Kibin,
Based on your situation, the two URL best practices at odds with each other are:
Length vs. Content
Depending on your average overall depth, you should be perfectly fine and will likely see benefits from a strategy like "www.site.com/services/students/essay-editing", as this is only three layers deep. At some point, however, there is no benefit to long URLs other than folder organization.
If you foresee your site getting more than five levels deep, you may want to consider a different structure. Long URLs, especially those containing URL parameters, can cause crawl issues. There are two basic questions to ask about any URL: (1) can a user understand it, and (2) will crawlers be able to navigate it and index it correctly? Design for users first while keeping in mind how search engines will view it.
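As a rough illustration of those two checks, here's a small helper that flags depth and parameter issues (the five-level cutoff is just the rule of thumb from this answer, not a hard limit):

```python
from urllib.parse import urlparse, parse_qs

MAX_DEPTH = 5  # rule-of-thumb cutoff, not a hard crawler limit

def audit_url(url: str) -> dict:
    """Flag URLs that are deep enough, or parameterised enough,
    to risk crawl problems."""
    parts = urlparse(url)
    segments = [s for s in parts.path.split("/") if s]  # non-empty path pieces
    params = parse_qs(parts.query)
    return {
        "depth": len(segments),
        "too_deep": len(segments) > MAX_DEPTH,
        "has_params": bool(params),
    }
```

Running something like this over a sitemap export is a quick way to spot the handful of URLs that have drifted too deep.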
Finally, about the difference between
www.site.com/services/students/essay-editing
and
www.site.com/essay-editing
What you miss out on with the latter is long-tail keyword opportunities (e.g. "student essay editing", "student services essay editing"). Those can still be incorporated into the content of the page, and likely will be via the breadcrumbs, but they carry a bit more weight when the keyword is in the URL itself.
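Since the breadcrumbs will carry those keywords anyway, one way to make the hierarchy explicit to crawlers even with short URLs is BreadcrumbList structured data. A sketch of generating it (the trail names and paths here are made up for illustration):

```python
import json

def breadcrumb_jsonld(site: str, trail: list) -> str:
    """Build schema.org BreadcrumbList JSON-LD from a list of
    (name, path) pairs, so crawlers see the Services > Students >
    Essay Editing hierarchy even when the page URL itself is flat."""
    items = [
        {
            "@type": "ListItem",
            "position": i,          # 1-based position in the trail
            "name": name,
            "item": site + path,    # absolute URL for each crumb
        }
        for i, (name, path) in enumerate(trail, start=1)
    ]
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": items,
    })
```

The resulting JSON would be embedded in a script tag of type application/ld+json on the page, alongside the visible breadcrumb links.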
Think of the user of the site first, then the search engines, then the backend administration.
As a user I like the short URL, but from an administration and SEO perspective I prefer the longer URLs.
Hope that helps,