URL best practices: use folders or not?
-
Hi
I have a question about URLs. My client's site has every URL written directly after the domain, with only one slash in each URL. Is this best practice, or should I use categories/folders?
Thanks
-
It's a trade-off, for both SEO and users, and I don't think there's one answer that fits every situation. The category level can add information, but it also makes URLs longer, which can be bad for both bots and people. If you have short, descriptive categories that aren't repeated in the product/page names, and those categories mimic your site structure, then I think it can be positive.
My argument was mostly against people adding categories just for SEO benefit (it's probably minimal, at best) or repeating every category, sub-category, etc. to the point of absurdity, causing keyword cannibalization and massive URLs. For example:
www.bobscamerashop.com/cameras/digital-cameras/canon-cameras/eos-cameras/camera-canon-eos-rebel-t3
Of course, that's also keyword stuffed, but I'm exaggerating to prove a point. You can go too far in either direction.
In general, though, I don't think categories in the URL are necessarily bad. In some cases, as Woj said, they could be a positive for users and possibly even for SEO.
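To make the "short, descriptive, not repeated" idea concrete, here's a rough sketch (hypothetical code, just to illustrate the convention) of building a readable slug where the category lives in the folder rather than being repeated inside the page name:

```python
import re

def slugify(text):
    """Lowercase the text and replace runs of punctuation/whitespace
    with single hyphens -- one common URL slug convention."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

# Keep the category in the folder, not in the product slug,
# so keywords aren't repeated down the whole path:
path = "/" + "/".join(["cameras", "digital", slugify("Canon EOS Rebel T3")])
print(path)  # /cameras/digital/canon-eos-rebel-t3
```

Compare that to the stuffed example above: same information, a fraction of the length, and no keyword repeated four times.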
-
Think about it from the user's point of view: what would work best for them? Maybe even get some feedback from site users, if possible.
-
Will the site's categories/products grow? If so, then the slash could be used to organise the structure and prepare for the future.
In the examples you presented:
- www.example.com/accounts-titanium
- www.example.com/accounts/titanium
These are the same length and make no real difference.
When we compare these 2, however:
You can see that #1 is shorter, doesn't repeat the keyword (even though they are plural), and would be more likely to be clicked in the SERPs.
Does that help some more?
-
http://www.seomoz.org/blog/should-i-change-my-urls-for-seo
In this article, point 2 says that unstructured URLs are better, so I'm confused.
-
The site is small, about 60 pages, and the max depth level is 3.
-
I'd use folders or categories if the number of products/items is large and/or going to expand.
If it's a small, finite amount, then make the URLs as short as possible.
-
Information architecture is important from both a usability and a search engine perspective.
I'd say go for the categories divided by the /:
www.example.com/accounts/titanium
www.example.com/accounts/open-demo-account
This makes more sense and lends itself to scalability etc.
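If the site ever moves from its current flat URLs to a folder structure like this, each old path should 301 to its new home so existing links keep their value. A minimal sketch, assuming a hand-built mapping (the paths are the hypothetical examples from this thread, not a real client's):

```python
# Hypothetical old-flat-URL -> new-foldered-URL mapping.
REDIRECTS = {
    "/accounts-titanium": "/accounts/titanium",
    "/open-demo-account": "/accounts/open-demo-account",
}

def resolve(path):
    """Return (status, location) the way a 301 redirect handler might:
    known old paths redirect permanently, everything else passes through."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

print(resolve("/accounts-titanium"))  # (301, '/accounts/titanium')
```

With only ~60 pages, a mapping like this is small enough to maintain by hand.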
There are some really good articles on information architecture on SEOmoz and around the web. Hope this helps.
-
The URLs are without any category or folder:
www.example.com/accounts-titanium
www.example.com/open-demo-account
Is this right, or do I need to use:
www.example.com/accounts/titanium
-
Not quite sure what you mean exactly. Can you expand with an example?
Thanks