Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be possible to view), we have locked both new posts and new replies.
Flat architecture or deep folders?
-
We have an e-commerce client that is launching a new site. In setting up for it, they decided they want to change the site's navigation and URL structure. All else being equal, the new site will have the appropriate 301 redirects in place, and it's built on Magento, so the product pages are all structured as website.com/product-A; the category pages, however, will now be deeper than before. What was website.com/product-category/product-sub-category will now be website.com/more-generic-category/product-category/new-subcategory/product-category. Hope that makes sense. I'm not as worried about the 301s or specific products, but I'm worried that the category pages dropping a folder level will hurt page authority. Any thoughts? Am I being overly nervous?
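For illustration, a minimal sketch of the kind of 301 rules such a category restructure needs at the web-server level, assuming Apache/.htaccess and purely hypothetical category slugs (Magento can also manage URL rewrites from its admin):

    # Hypothetical slugs -- substitute the real old and new category paths.
    RewriteEngine On

    # One-to-one rule for a single old category URL:
    # old: /product-category/product-sub-category
    # new: /more-generic-category/product-category/new-subcategory/product-category
    RewriteRule ^widgets/blue-widgets/?$ /industrial/widgets/small/blue-widgets [R=301,L]

    # A pattern rule can cover a whole sub-category at once when the mapping is uniform.
    RewriteRule ^widgets/(.+)$ /industrial/widgets/small/$1 [R=301,L]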
-
Hi Bill,
You are right to be concerned about the change in category structure; the earlier structure looks better than the new category-page URL structure (before it was website.com/product-category/product-sub-category, and it will now be website.com/more-generic-category/product-category/new-subcategory/product-category).
The goal of a good URL structure should be to keep URL length under control and to show search engines the right directory hierarchy. With the new structure you can end up with lengthy URLs and also repeat a lot of keywords within the URL.
Having said that, you should read these two articles for further help
https://moz.com/blog/15-seo-best-practices-for-structuring-urls
https://moz.com/learn/seo/url
I hope this helps. Feel free to respond and ask further.
Regards,
Vijay
Related Questions
-
Can some sort of wildcard redirect be used on a single folder path?
We have a directory with thousands of pages and we are migrating the entire site to another root URL. These folder paths will not change on the new site, but we don't want to use a wildcard to redirect EVERYTHING to the same folder path on the new site. Setting up manual 301 redirects on this particular directory would be crazy. Is there a way to isolate something like a wildcard redirect to apply only to a specific folder? Thanks!
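For what it's worth, a hedged sketch of how a redirect can be scoped to a single directory with Apache (hypothetical domain and folder names); only URLs under that one folder are matched, so the rest of the site is left alone:

    # mod_alias version:
    RedirectMatch 301 ^/old-directory/(.*)$ https://www.new-root-url.com/old-directory/$1

    # mod_rewrite equivalent in .htaccess:
    RewriteEngine On
    RewriteRule ^old-directory/(.*)$ https://www.new-root-url.com/old-directory/$1 [R=301,L]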
Intermediate & Advanced SEO | MJTrevens
-
What are best page titles for sub-folders or sub-directories? Same as website?
Hi all, We always mention "brand & keyword" in every page title along with the topic, like "Topic | vertigo tiles". Let's say there is a sub-directory with hundreds of pages... what would be the best page-title practice for mentioning "brand & keyword" across all pages of the sub-directory to benefit in terms of SEO? Can we add "vertigo tiles" to all pages of the sub-directory, or must we not repeat the same phrase? Thanks,
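For concreteness, the pattern the question describes, applied to sub-directory pages, would look something like this (hypothetical page topics; only the topic portion changes per page while the brand/keyword suffix repeats):

    <title>Ceramic Floor Tiles | vertigo tiles</title>
    <title>Porcelain Wall Tiles | vertigo tiles</title>
    <title>Tile Adhesive and Grout | vertigo tiles</title>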
Intermediate & Advanced SEO | vtmoz
-
What are the best practices for geo-targeting by sub-folders?
My domain is currently targeting the US, but I'm building out sub-folders that will need to geo-target France, England, and Spain. Each country will have its own sub-folder and will be professionally translated (domain.com/france). Other than the hreflang tags, what other best practices can I implement? Can Google Webmaster Tools geo-target by subfolder? Any suggestions would be appreciated. Thanks Justin
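As a rough sketch of the hreflang side of that setup (hypothetical example.com URLs and subfolder names), each page would list every language/region version of itself plus an x-default; in the Google Webmaster Tools / Search Console of that era, a subfolder could also be verified as its own property and geo-targeted under International Targeting:

    <link rel="alternate" hreflang="en-us" href="https://www.example.com/" />
    <link rel="alternate" hreflang="fr-fr" href="https://www.example.com/france/" />
    <link rel="alternate" hreflang="en-gb" href="https://www.example.com/england/" />
    <link rel="alternate" hreflang="es-es" href="https://www.example.com/spain/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/" />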
Intermediate & Advanced SEO | Rhythm_Agency
-
How to fully index big ecommerce websites (that have deep catalog hierarchy)?
When building very large ecommerce sites, the catalog data can have millions of product SKUs and a massive quantity of hierarchical navigation layers (say 7-10) to get to those SKUs. On such sites, it can be difficult to get them to index substantially. The issue doesn’t appear to be product page content issues. The concern is around the ‘intermediate’ pages -- the many navigation layers between the home page and the product pages that are necessary for a user to funnel down and find the desired product. There are a lot of these intermediate pages and they commonly contain just a few menu links and thin/no content. (It's tough to put fresh-unique-quality content on all the intermediate pages that serve the purpose of helping the user navigate a big catalog.) We've played with NO INDEX, FOLLOW on these pages. But structurally it seems like a site with a lot of intermediate pages containing thin content can result in issues such as shallow site indexing, weak page rank, crawl budget issues, etc. Any creative suggestions on how to tackle this?
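For reference, the NOINDEX, FOLLOW experiment mentioned above is normally just a meta robots tag on each thin intermediate page (a sketch only; whether it helps or hurts is exactly the trade-off the question raises):

    <!-- On a thin intermediate navigation page: keep it out of the index
         but let crawling and link equity flow through to deeper pages. -->
    <meta name="robots" content="noindex, follow" />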
Intermediate & Advanced SEO | AltosDigital-1
-
Dilemma about "images" folder in robots.txt
Hi, Hope you're doing well. I am sure you must be aware that Google has updated their webmaster technical guidelines, saying that users should allow access to their CSS and JavaScript files where possible. It used to be that Google would render web pages only as text; now it claims it can read the CSS and JavaScript. In their own words, not allowing access to CSS files can result in sub-optimal rankings: "Disallowing crawling of Javascript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings." http://googlewebmastercentral.blogspot.com/2014/10/updating-our-technical-webmaster.html

We have allowed access to our CSS files, and Googlebot now sees our webpages more like a normal user would (tested in GWT). Anyhow, this is my dilemma, and I am sure a lot of other users, like other e-commerce companies/websites, face the same situation: we have a lot of images. Our CSS files used to be inside our images folder, so I have allowed access to that. Here's the robots.txt: http://www.modbargains.com/robots.txt

Right now we are blocking the images folder, as it is very large, very heavy, and some of the images are very high-res. We block it because we feel Googlebot might spend almost all of its time trying to crawl that "images" folder and not have enough time to crawl other important pages, not to mention a very heavy load on Google's servers and ours. We do have good, high-quality, original pictures, and we feel we are losing potential rankings by blocking images. I was thinking of allowing ONLY the Google image bot access to it, but I still worry that Google might spend a lot of time doing that. I was wondering whether Google decides something like "spend 10 minutes for the Google image bot and 20 minutes for the Google mobile bot", or whether it has separate "time spending" allocations for each of its bot types. I want to unblock the images folder, for now only for the Google image bot, but at the same time I fear it might drastically hamper indexing of our important pages, because we have tons of images and Google already spends enough time just crawling that folder. Any advice? Recommendations? Suggestions? Technical guidance? Plan of action? Pretty sure I answered my own question, but I need confirmation from an expert that I am right in saying: allow only Google Image access to my images folder. Sincerely, Shaleen Shah
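A minimal robots.txt sketch of the "let only the image bot in" idea described above, assuming the folder is /images/. A crawler obeys only the most specific user-agent group that matches it, so Googlebot-Image would follow its own group rather than the generic block:

    # Generic crawlers stay out of the heavy images folder.
    User-agent: *
    Disallow: /images/

    # Googlebot-Image uses this group instead of the one above, so the images
    # folder becomes crawlable for image search. Note that with no Disallow
    # lines here, this group also lets Googlebot-Image crawl the rest of the site.
    User-agent: Googlebot-Image
    Allow: /images/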
Intermediate & Advanced SEO | Modbargains
-
PDFs and images in Sub folder or subdomain?
What would you recommend as best practice? Our ecommerce site has a lot of PDFs supporting the product pages. Currently they are kept on a sub-domain, as are all images. Would it be better to keep them all in a subfolder? I've read that hosting a blog in a subfolder is better than a subdomain, but what about PDFs and images? Thoughts?
Intermediate & Advanced SEO | Bio-RadAbs
-
What is better for SEO keywords in folder or in filename - also dupe filename question
Hey folks, I've got a question regarding URL structure. What is best for SEO, given that there will be millions of lawyer names and 4 pages per lawyer?

www.lawyerz.com/office-locations/dr-al-pacino
www.lawyerz.com/phone-number/dr-al-pacino
www.lawyerz.com/reviews/dr-al-pacino
www.lawyerz.com/ratings/dr-al-pacino

OR

www.lawyerz.com/office-locations-dr-al-pacino
www.lawyerz.com/phone-number-dr-al-pacino
www.lawyerz.com/reviews-dr-al-pacino
www.lawyerz.com/ratings-dr-al-pacino

OR

www.lawyerz.com/dr-al-pacino/office-locations
www.lawyerz.com/dr-al-pacino/phone-number
www.lawyerz.com/dr-al-pacino/reviews
www.lawyerz.com/dr-al-pacino/ratings

Also, concerning duplicate file names: In the first example there are 4 duplicate file names containing the lawyer's name (would this cause Google to not index some?). In the second example the file names are all unique (would this look spammy to Google or the user?). In the third example there are millions of duplicate file names; if 1 million lawyers, then 1 million files called "office-locations" etc. (could so many duplicate filenames cause ranking issues?). Should the lawyer's name (which is the main keyword target) appear in the filename or in the folder; which is better for SEO in your opinion? Thanks for your input!
Intermediate & Advanced SEO | irvingw
-
Site Architecture: Cross Linking vs. Siloing
I'm curious to know what other mozzers think about silos... Can we first all agree that a flat site architecture is best practice? Relevant pages should be grouped together. Shorter, broader and (usually) therefore higher-volume keywords should be towards the top of each category. Navigation should flow from general to specific. Agreed? As Google says on page 10 of their SEO Starter Guide, "you should think about how visitors will go from a general page (your root page) to a page containing more specific content." OK, we all agree so far, right? Great! Enter my question: Bruce Clay (among others) seems to recommend siloing as a best practice, while Richard Baxter (and many others @ SEOmoz) seem to view silos as a problem. Me? I've practiced (relevant) internal cross linking, and have intentionally avoided siloing in almost all cases. What about you? Is there a time and place to use silos? If so, when and where? If not, how do we rectify the seemingly huge differences of opinion between expert folks such as Baxter and Clay?
Intermediate & Advanced SEO | DonnieCooper