Flat vs. Silo Site Architecture: What's Better?
-
I'm in the midst of converting a fairly large website (500+ pages) into WordPress as a content management system. I know that there are two schools of thought regarding site architecture:
-
Those who believe that everything should be categorized, e.g. website.com/shoes/reebok/running
-
Those who believe that the fewer clicks it takes from the homepage, the better.
As it stands, our current site has a completely flat architecture, with landing pages added at random to the root, e.g. website.com/affordable-shoes-in-louisville-ky
I'm beginning to think there is a gray area here. I spoke to someone who says you should never place a page more than two categories/subfolders deep. But if we plan on adding a lot of content, doesn't it make sense to organize the site into categories now, so we have a solid foundation for adding massive amounts of content?
Also, will 301 redirecting to the new structure cause us to lose rankings for certain terms?
Any help here is appreciated.
-
-
We've had very good results from siloing our content. We do use drop-down menus. We rank very high (top three) for some targeted key phrases on pages that are more than two categories deep. I'm a big fan of the silo approach.
-
I am inclined to lean toward some type of siloing for a content-heavy site. There is the purest form of silo architecture, which I feel Bruce Clay presents very clearly in his articles, and you can certainly make it less rigid and it will still be an effective SEO tool.
I generally agree that MOST content should not be too many clicks from the home page, but drop-down menus can go a long way toward keeping a lot of content close without the navigation becoming unwieldy. Perhaps it will help to look at it this way: the way you structure your navigation tells Google which pages you believe are most important - if you tell them ALL your pages are equally important, you dilute the ability of your top pages to rank better than your lesser pages.
If that makes sense to you, I hope it helps.
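To sanity-check click depth on a new structure, a quick breadth-first crawl from the homepage will show which pages end up buried. Below is a minimal sketch, assuming a hypothetical example.com homepage and the requests and beautifulsoup4 packages; it only follows internal links and caps the crawl so it stays small.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # hypothetical homepage
MAX_PAGES = 500                         # cap so the crawl stays small


def click_depths(start_url, max_pages=MAX_PAGES):
    """Breadth-first crawl recording how many clicks each internal page is from the homepage."""
    domain = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])

    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip pages that fail to load

        for link in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            target = urljoin(url, link["href"]).split("#")[0]
            if urlparse(target).netloc == domain and target not in depths:
                depths[target] = depths[url] + 1  # one click deeper than the page linking to it
                queue.append(target)

    return depths


if __name__ == "__main__":
    for page, depth in sorted(click_depths(START_URL).items(), key=lambda item: item[1]):
        print(depth, page)
```

Pages that come out three or more clicks deep are the candidates worth pulling closer with drop-down menus or category hub pages.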
301 redirects are the best way to move pages while retaining the most link power. Within the site, you have nothing to worry about if your new structure is better for SEO. 301 redirects do not always pass 100% of external backlink juice, but they are still the best tool we have to keep what we have already achieved.
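For the migration itself, it helps to build an explicit old-to-new URL map and verify that every old URL really answers with a single 301 to its new silo location. A minimal sketch, assuming a hypothetical redirects.csv with old_url and new_url columns (the example paths in the comment are made up):

```python
import csv

import requests


def check_redirects(csv_path):
    """Confirm each old URL returns a 301 pointing at its mapped new URL (no chains, no 302s)."""
    with open(csv_path, newline="") as handle:
        for row in csv.DictReader(handle):
            response = requests.get(row["old_url"], allow_redirects=False, timeout=10)
            location = response.headers.get("Location", "")
            ok = response.status_code == 301 and location == row["new_url"]
            print("OK  " if ok else "FAIL", row["old_url"], "->", location or "(no redirect)")


if __name__ == "__main__":
    # e.g. a row might map /affordable-shoes-in-louisville-ky to /shoes/affordable/louisville-ky
    check_redirects("redirects.csv")
```

Running this against a list of all 500+ existing URLs before and after launch is cheap insurance against stray 404s and redirect chains.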
Related Questions
-
Is having a site map page necessary?
Hello all! So I know it is important to have an XML sitemap file and to reference it in your robots.txt file. I also know it is important to submit your XML sitemap to Google and Bing. However, I am wondering whether having a sitemap page displayed on your website is beneficial to your site's SEO, or is this just redundant if you have already done the two things above with your XML sitemap? Thanks in advance!
Web Design | Oct 5, 2019, 8:23 AM | Myles920 -
Is it against google guidelines to use third party review sites as well as have reviews on my site marked up with schema?
So, I look after a site for my family business. We have teamed up with the third-party site TrustPilot because we like the way it enables us to send out review invitations to our customers directly from our system. It's been going great and some of the reviews have been brilliant. I have used a couple of these reviews on our site and marked them up with: REVIEW CONTENT We work in the service industry, and one of the problems we have found is getting our customers to actually go online and leave a review. They normally just leave their comments on a job sheet that the workers have signed when they leave. So I have created a page on our site where we post some of the reviews the guys receive too. I have used the following: REVIEW TITLE REVIEW Written by: CUSTOMER NAME Type of Service: House Removal Date published: DATE PUBLISHED 10 / 10 stars I was told that this could be against Google's guidelines, and as I've seen a bit of a drop in our rankings in the last week or so I'm a little concerned. Is this getting me penalised? Should I stop referencing the TrustPilot reviews on my site, and should I not have my own reviews page with rich snippets?
Web Design | Apr 5, 2016, 11:40 AM | BearPaw881 -
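For reference, on-page review markup like the one described is usually expressed as schema.org Review structured data. The snippet below is only a rough sketch, generated from Python so the JSON stays valid; it reuses the question's placeholders plus a made-up date rather than the poster's real markup.

```python
import json

# Placeholders from the question above stand in for real values; the date is made up.
review = {
    "@context": "https://schema.org",
    "@type": "Review",
    "itemReviewed": {"@type": "Service", "name": "House Removal"},
    "author": {"@type": "Person", "name": "CUSTOMER NAME"},
    "datePublished": "2016-04-01",
    "reviewBody": "REVIEW CONTENT",
    "reviewRating": {"@type": "Rating", "ratingValue": "10", "bestRating": "10"},
}

# The printed JSON-LD could sit inside a <script type="application/ld+json"> tag on the reviews page.
print(json.dumps(review, ensure_ascii=False, indent=2))
```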
Anyone using CloudFlare on multiple sites?
We are considering using CloudFlare as a CDN for a large group of sites. The fees are $5 to $200 depending on many factors. We tried the free trial on one site and were impressed with the results. I am wondering if any of you have any longer term experience with this and performance metrics, etc.
Web Design | Nov 19, 2016, 3:20 PM | RobertFisher1 -
Lots of Listing Pages with Thin Content on Real Estate Web Site - Best to Set Them to No-Index?
Greetings Moz Community: As a commercial real estate broker in Manhattan I run a web site with over 600 pages. Basically the pages are organized in the following categories:
1. Neighborhoods (Example: http://www.nyc-officespace-leader.com/neighborhoods/midtown-manhattan) 25 pages, low bounce rate
2. Types of Space (Example: http://www.nyc-officespace-leader.com/commercial-space/loft-space) 15 pages, low bounce rate
3. Blog (Example: http://www.nyc-officespace-leader.com/blog/how-long-does-leasing-process-take) 30 pages, medium/high bounce rate
4. Services (Example: http://www.nyc-officespace-leader.com/brokerage-services/relocate-to-new-office-space) 3 pages, high bounce rate
5. About Us (Example: http://www.nyc-officespace-leader.com/about-us/what-we-do) 4 pages, high bounce rate
6. Listings (Example: http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf) 300 pages, high bounce rate (65%), thin content
7. Buildings (Example: http://www.nyc-officespace-leader.com/928-broadway) 300 pages, very high bounce rate (exceeding 75%)
Most of the listing pages do not have more than 100 words. My SEO firm is advising me to set them to "No-Index, Follow". They believe the thin content could be hurting me. Is this an acceptable strategy? I am concerned that when Google detects 300 pages set to "No-Index" they could interpret this as the site seeking to hide something and penalize us. Also, the building pages have a low click-thru rate. Would it make sense to set them to "No-Index" as well? Basically, would it increase authority in Google's eyes if we set pages that have thin content and/or low click-thru rates to "No-Index"? Any harm in doing this for about half the pages on the site? I might add that while I don't suffer from any manual penalty, volume has gone down substantially in the last month. We upgraded the site in early June and somehow 175 pages were submitted to Google that should not have been indexed. A removal request has been made for those pages. Prior to that we were hit by Panda in April 2012, with search volume dropping from about 7,000 per month to 3,000 per month. Volume had increased back to 4,500 by April this year only to start tanking again. It was down to 3,600 in June. About 30 toxic links were removed in late April and a disavow file was submitted with Google in late April for removal of links from 80 toxic domains. Thanks in advance for your responses!! Alan
Web Design | Jul 3, 2014, 5:09 PM | Kingalan1 -
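As a side note on mechanics, a "noindex, follow" directive is carried by the robots meta tag (or an X-Robots-Tag HTTP header), and it is easy to audit which listing pages actually have it. A minimal sketch, assuming a hypothetical urls.txt with one listing URL per line:

```python
import requests
from bs4 import BeautifulSoup


def robots_directives(url):
    """Return the robots directives a page sends via its meta tag and its HTTP header."""
    response = requests.get(url, timeout=10)
    meta = BeautifulSoup(response.text, "html.parser").find("meta", attrs={"name": "robots"})
    return {
        "meta": meta["content"] if meta and meta.has_attr("content") else "",
        "header": response.headers.get("X-Robots-Tag", ""),
    }


if __name__ == "__main__":
    with open("urls.txt") as handle:  # hypothetical list of listing-page URLs
        for line in handle:
            url = line.strip()
            if url:
                print(url, robots_directives(url))
```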
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, So far I have found this one: http://unused-css.com/ It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages... so we really need something that can handle a site larger than their limit. I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they will only do one page at a time? Thanks!
Web Design | Mar 4, 2014, 4:19 PM | danatanseo0 -
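If no off-the-shelf tool handles 8,000+ pages, a rough first pass can be scripted: pull the class selectors out of a stylesheet, collect the classes actually used on a sample of pages, and diff the two sets. The sketch below is a simplified, regex-based approximation (it ignores IDs, element selectors, and classes added by JavaScript), with hypothetical URLs standing in for the real site.

```python
import re

import requests
from bs4 import BeautifulSoup

CSS_URL = "https://www.example.com/assets/site.css"   # hypothetical stylesheet
PAGE_URLS = [                                          # hypothetical sample of pages to check
    "https://www.example.com/",
    "https://www.example.com/about/",
]


def classes_in_css(css_text):
    """Class names referenced by selectors in the stylesheet (rough regex, not a full CSS parser)."""
    return set(re.findall(r"\.([A-Za-z_-][\w-]*)", css_text))


def classes_in_pages(urls):
    """Class names that actually appear in class attributes on the sampled pages."""
    used = set()
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        for tag in soup.find_all(class_=True):
            used.update(tag.get("class", []))
    return used


if __name__ == "__main__":
    defined = classes_in_css(requests.get(CSS_URL, timeout=10).text)
    used = classes_in_pages(PAGE_URLS)
    print("Possibly unused classes:", sorted(defined - used))
```

Duplicate rules could be flagged in a similar way by counting how often each selector block appears in the stylesheet.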
Side Nav. Vs. Top Nav
I have a client that currently has a side navigation and wants to know how changing to a top nav will affect her SEO. We always recommend top nav for user experience but I am not sure if there is a direct effect on SEO. Would the change affect it? Thoughts?
Web Design | Jan 11, 2013, 12:14 AM | hwade0 -
Infinite Scrolling vs. Pagination on an eCommerce Site
My company is looking at replacing our ecommerce site's paginated browsing with a JavaScript infinite scroll function for when customers view internal search results, and possibly when they browse product categories as well. Because our internal linking structure isn't very robust, I'm concerned that removing the pagination will make it harder to get the individual product pages to rank in the SERPs. We have over 5,000 products, and most of them are internally linked to from the browsing results pages in the category structure: e.g. Blue Widgets, Widgets Under $250, etc. I'm not too worried about removing pagination from the internal search results pages, but I'm concerned that doing the same for these category pages will result in de-linking the thousands of product pages that show up later in the browsing results and therefore won't be crawlable as internal links by the Googlebot. Does anyone have any ideas on what to do here? I'm already arguing against the infinite scroll, but we're a fairly design-driven company and any ammunition or alternatives would really help. For example, would serving a different page to the Googlebot in this case be a dangerous form of cloaking? (If the only difference is the presence of the pagination links.) Or is there any way to make rel=next and rel=prev tags work with infinite scrolling?
Web Design | Oct 18, 2018, 6:06 AM | DownPour0 -
The use of foreign characters and capital letters in URLs?
Hello all, We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters - most notably: ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure. Here is an example of our URLs, EX: http://www.twago.es/expert/Diseño-Web/Diseño-Web However, when I simply copy and paste a URL that contains a special character it is automatically translated and encoded. EX: http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone (when written out longhand it appears: http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone). My first question is, seeing how the overwhelming majority of website URLs DO NOT contain special characters (and even Spanish/German characters are simply written using the standard English Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages we USE the special characters in the anchor text (as do most other competitors). Does the anchor text have to match the URL exactly? I know most web browsers can understand the special characters, especially when returning search results to users that either type the special characters within their search query (or not). But we can't help thinking: if we are doing the right thing, why does everyone else do it differently? My second question is the same, but focusing on the use of capital letters in our URL structure. NOTE: When we do a broken link check with some link tools (such as Xenu) the URLs that contain the special Spanish characters are marked as "broken". Is this a related issue? Any help anyone could give us would be greatly appreciated! Thanks, David from twago
Web Design | Oct 27, 2011, 8:40 AM | wdziedzic0
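On the encoding point, browsers and crawlers treat a URL containing ñ or ó and its percent-encoded UTF-8 form as the same address, which is why a copy-pasted URL appears "translated". A small illustration in Python, reusing the Diseño-Web path from the question:

```python
from urllib.parse import quote, unquote

# Path segment matching the example in the question above.
readable = "/expert/Diseño-Web/Diseño-Web"

# quote() produces the percent-encoded form a browser sends on the wire;
# "/" is kept as a path separator by default.
encoded = quote(readable)
print(encoded)           # /expert/Dise%C3%B1o-Web/Dise%C3%B1o-Web
print(unquote(encoded))  # back to /expert/Diseño-Web/Diseño-Web
```

That equivalence may also explain why older link checkers such as Xenu flag the readable form as broken even when the encoded form resolves.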