All page files in root? Or use directories?
-
We have thousands of pages on our website (news articles, forum topics, download pages, etc.), and at present they all reside in the root of the domain (/).
For example:
/aosta-valley-i6816.html
/flight-sim-concorde-d1101.html
/what-is-best-addon-t3360.html
We are considering moving over to a new URL system where we use directories. For example, the above URLs would be the following:
/images/aosta-valley-i6816.html
/downloads/flight-sim-concorde-d1101.html
/forums/what-is-best-addon-t3360.html
Would we gain any SEO benefit from using directories? Could our current system, with so many files in the root, be seen as spammy? Would it be even better to use the following system, which removes file extensions completely and presents each page as a directory:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
If so, which would be better: /images/aosta-valley/6816/ or /images/6816/aosta-valley/?
Just looking for some clarity on our problem!
Thank you for your help guys!
-
To my knowledge there hasn't been a definitive conclusion on this one.
The general advice, as far as I know, is that the two are equally good: pick one, and make sure the other version (the one with the slash if you choose 'without', or vice versa) 301-redirects to the chosen one, to avoid duplicate content.
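To make that concrete, here's a minimal sketch of the normalisation logic (Python, purely as an illustration; the function names are placeholders, and in practice you would express the same thing as a rewrite rule in your server config):

```python
# Minimal sketch: treat the trailing-slash form as canonical and flag
# anything else for a 301 redirect.

def canonicalise(path: str) -> str:
    """Return the canonical version of a URL path (with a trailing slash)."""
    return path if path.endswith("/") else path + "/"

def needs_redirect(requested_path: str):
    """Return (True, target) if the request should be 301-redirected."""
    canonical = canonicalise(requested_path)
    return requested_path != canonical, canonical

print(needs_redirect("/downloads/878/fsx-concorde"))   # (True, '/downloads/878/fsx-concorde/')
print(needs_redirect("/downloads/878/fsx-concorde/"))  # (False, '/downloads/878/fsx-concorde/')
```

Whichever form you pick, the important part is that only one version ever returns a 200 and the other always returns a 301 pointing at it.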
-
I would personally place the keywords at the end for clarity. It does seem unnatural to have the ID as the final part of the URL. Even if that costs you a tiny bit of 'keyword power', I would gladly sacrifice it in exchange for a more user-friendly URL.
Limiting the number of words in the URL does make it look slightly less spammy, but slightly less user-friendly as well. I guess this is just one of those 'weigh the pros and cons and decide for yourself' situations. Just make sure the URLs don't get ridiculously long.
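If you generate the slugs programmatically, capping the word count is straightforward. A rough sketch (Python, purely illustrative; the helper name and the stop-word list are placeholders, not anything Google prescribes):

```python
import re

STOP_WORDS = {"a", "an", "the", "to", "my", "does", "is", "of"}  # illustrative only

def slugify(title: str, max_words: int | None = None) -> str:
    """Lower-case and hyphenate a title, dropping stop words and capping the word count."""
    words = [w for w in re.findall(r"[a-z0-9]+", title.lower()) if w not in STOP_WORDS]
    if max_words is not None:
        words = words[:max_words]
    return "-".join(words)

print(slugify("Does adding a suffix to my URLs affect my SEO?"))
# -> adding-suffix-urls-affect-seo
print(slugify("Does adding a suffix to my URLs affect my SEO?", max_words=4))
# -> adding-suffix-urls-affect
```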
-
OK, so I have taken it upon myself to now have our URLs as follows:
/news/853/free-flight-simulator/
Anything else gets 301'd to the correct URL: /news/853/free-flight-simulator (no trailing slash) would be 301'd to /news/853/free-flight-simulator/, as would a mangled slug like /news/853/free-flight-sifsfsdfdsfmulator/, etc.
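For anyone implementing the same thing, the pattern is 'the ID is authoritative, the slug is cosmetic': look the canonical slug up by ID and 301 anything that doesn't match exactly. A minimal sketch (Python, illustrative only; the lookup table and function names are placeholders):

```python
# Illustrative lookup table: in reality this would come from the database.
CANONICAL_SLUGS = {853: "free-flight-simulator", 878: "fsx-concorde"}

def resolve(section: str, item_id: int, requested_path: str):
    """Return ('ok', path) if the URL is canonical, ('301', path) to redirect, or ('404', None)."""
    slug = CANONICAL_SLUGS.get(item_id)
    if slug is None:
        return "404", None
    canonical = f"/{section}/{item_id}/{slug}/"
    if requested_path != canonical:
        return "301", canonical
    return "ok", canonical

print(resolve("news", 853, "/news/853/free-flight-sifsfsdfdsfmulator/"))
# -> ('301', '/news/853/free-flight-simulator/')
print(resolve("news", 853, "/news/853/free-flight-simulator"))
# -> ('301', '/news/853/free-flight-simulator/')
print(resolve("news", 853, "/news/853/free-flight-simulator/"))
# -> ('ok', '/news/853/free-flight-simulator/')
```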
-
Also, trailing slash? Or no trailing slash?
Without
/downloads/878/fsx-concorde
With
/downloads/878/fsx-concorde/
-
Dear Theo,
Thank you for your response; I found your article very interesting.
So, just to clarify - in our case, the best URL method would be:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
This would remove the suffixes and also put the ID numbers at the end, placing the target keywords closer to the root of the URL, which makes a very slight difference...
EDIT: Upon thinking about it, I feel that the final keyword-targeted page would be more natural if it appeared at the end of the URL. For example: /images/6816/aosta-valley/ (like you have done on your blog).
Also, should I limit the number of hyphenated words in the URL? For example, on your blog you have /does-adding-a-suffix-to-my-urls-affect-my-seo/; perhaps it would be more concentrated and less spammy as /adding-suffix-urls-affect-seo/?
Let me know your thoughts.
Thank you for your help!
-
Matt Cutts states that the number of subfolders 'is not a major factor': http://www.youtube.com/watch?v=l_A1iRY6XTM
Furthermore, here is a blog post I wrote about removing suffixes: http://www.finishjoomla.com/blog/5/does-adding-a-suffix-to-my-urls-affect-my-seo/
Another Matt Cutts video, regarding your separate question about keyword order: http://www.youtube.com/watch?v=gRzMhlFZz9I
Having some structure (in the form of a single subfolder) would greatly add to the usability of your website, in my opinion. If you can manage to use the correct redirects (301) from your old pages to your new ones, I don't see a clear SEO-related reason not to switch.
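If it helps, the old-to-new redirect mapping can usually be done with a single pattern rather than a hand-maintained list. A rough sketch (Python, illustrative only; the letter-to-section mapping is a guess based on the examples in the question, and the ID-before-slug ordering is just one of the two options discussed in this thread):

```python
import re

# Guessing from the question's examples: i = images, d = downloads, t = forum threads.
SECTIONS = {"i": "images", "d": "downloads", "t": "forums"}
OLD_URL = re.compile(r"^/(?P<slug>[a-z0-9-]+)-(?P<type>[idt])(?P<id>\d+)\.html$")

def old_to_new(path: str):
    """Map an old flat URL onto the new directory scheme, or return None if it doesn't match."""
    m = OLD_URL.match(path)
    if not m:
        return None
    return f"/{SECTIONS[m.group('type')]}/{m.group('id')}/{m.group('slug')}/"

print(old_to_new("/aosta-valley-i6816.html"))         # -> /images/6816/aosta-valley/
print(old_to_new("/flight-sim-concorde-d1101.html"))  # -> /downloads/1101/flight-sim-concorde/
print(old_to_new("/what-is-best-addon-t3360.html"))   # -> /forums/3360/what-is-best-addon/
```

Whether you do this in application code or as rewrite rules in your server config, the key is that every old URL answers with a single 301 straight to its new home.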