Silo vs breadcrumbs in 2015
-
Hi, I've heard silos mentioned in the past as a way to help with rankings. Does this still apply?
And what about breadcrumbs? Do I use them with the silo technique or instead of it? Which do you think is better? Or should I not be using these anymore with the recent Google updates?
-
Great, thanks. I'll give that a go.
-
It's been a while since I've used WP, but if you use posts (or posts and pages), you will have a major silo and duplicate content problem with blog category pages.
The way to solve this is to go to the section where you set up your post categories and set the category slug to be identical to the slug of the matching page. For example, if you have a page with the slug "blue-widgets", set the post category slug to "blue-widgets" as well. This makes the category page the parent for posts in that category.
There are also some adjustments you will need to make to your URLs, removing "/category/" from them. I've done it, and it's pretty easy. Maybe another poster could give you the specifics.
-
Great, thanks, very informative reply. I've started using WordPress for most of my sites now. Is siloing easy enough to do in WordPress?
-
Silos will always work. It's not some trick - it's how Google works. Here's a very simplified explanation as to why...
Let's say that I have an eCommerce site, and I sell lawnmowers and plywood. Let's also say that the Lawnmowers category page has a theoretical 100 points of link juice, and that the site sells two lawnmowers: the Fubar 2000 and the Toecutter 300. If the Lawnmowers category page only links to the Fubar 2000 and Toecutter 300 pages, it will push 45 points of link juice to each page (pages can pass on roughly 90% of their link juice, and 90/2 = 45).
Both pages will receive almost the full 45 point benefit because the pages are relevant to the category page.
If the Lawnmowers category page instead had only one link, pointing to the Plywood page, it would push 90 points of link juice to the Plywood page. But the Plywood page would not receive the full benefit of those 90 points, because lawnmowers and plywood don't share much relevance. In this case, Google would heavily discount the 90 points, so the Plywood page might only get the benefit of 30. Think of it as a leaky hose.
What happens to the other 60 Points of Link Juice? It gets dumped on the floor, and the site loses the ranking power of those 60 points.
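The leaky-hose arithmetic above can be sketched in a few lines of Python. The 90% pass-through rate and the relevance discounts are the illustrative numbers from this post, not anything Google has published, and the page names are made up:

```python
# Toy model of the "leaky hose" idea: a page passes ~90% of its link
# juice, split evenly across its outbound links, and each link is then
# discounted by a relevance factor (1.0 = fully relevant, lower = leaky).
# All constants here are illustrative assumptions, not Google's.

def juice_passed(page_juice, links):
    """links: list of (target, relevance) pairs. Returns a dict of
    target -> juice received, plus the juice lost to irrelevance."""
    passable = page_juice * 0.9          # ~90% can be passed on
    per_link = passable / len(links)     # split evenly across links
    received = {t: per_link * rel for t, rel in links}
    lost = passable - sum(received.values())
    return received, lost

# Relevant silo: Lawnmowers category -> two lawnmower product pages
silo, lost = juice_passed(100, [("fubar-2000", 1.0), ("toecutter-300", 1.0)])
# each product page gets 45 points; nothing is dumped on the floor

# Cross-topic link: Lawnmowers category -> Plywood page, heavy discount
cross, lost2 = juice_passed(100, [("plywood", 1 / 3)])
# plywood receives ~30 of the 90 passable points; ~60 are lost
```

The relevance factor of 1/3 is just the number that reproduces the 90-points-in, 30-points-out example above.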
Keep in mind that this is all theoretical, and that link juice comes in different flavors, like apple, orange, and prune, representing the different ranking factors (Trust, Authority, Topical Authority, Social Signals, etc.). Orange might be discounted 90% while prune might only be discounted 10%. In this case, is there really a 67% link juice hit? Damned if I know, but I had to pick a number. I do know that the link juice loss between pages that aren't relevant is dramatic. I also know that it is very possible to determine how your internal pages rank based on your internal link structure and link placement on the page.
By siloing a website, I have seen rankings jump dramatically. Most websites hemorrhage link juice. Think of it as Link Juice Reclamation. The tighter you can build your silos, the less link juice gets dumped on the floor. By reclaiming the spilled link juice and putting it in the right places, you can dramatically increase your rankings. BTW, inbound links work in a similar fashion. If the Lawnmower page was an external site and linked to the Plywood page, the same discounts would apply. That's why it pays to get niche relevant backlinks for maximum benefit.
This in no way accounts for usability, and linking between silos can make sense to benefit end users. Again, this model is probably oversimplified and doesn't take Block Level Analysis into account, but the logic is sound. You can build spreadsheet models of link juice distribution factoring in block-level analysis, discounts, and so on. They're by no means accurate, but they can give you a pretty good idea of where your link juice is going. You can base this on the old (and increasingly irrelevant) PageRank algorithm. PageRank is logarithmic, and it takes 8-9x as much link juice to move up a PR level: if it takes 100 points of link juice to become a PR1, it takes 800-900 points to become a PR2. Generally speaking, a PR2 page can, via links, create roughly 7 to 75 PR1 pages, depending on how close the PR2 is to becoming a PR3.
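The logarithmic PR math can be modeled the same way. The 8.5x step and the 100-point PR1 cost are just the made-up numbers from this post (the post says 8-9x; 8.5 splits the difference), so treat the outputs as illustrations of the shape of the curve, nothing more:

```python
# Toy logarithmic PageRank model: each PR level costs ~8.5x more juice
# than the last, and a page can pass on ~90% of its juice. All of these
# constants are illustrative assumptions from the post, not real values.

PR1_COST = 100   # juice needed to reach PR1 (assumed)
STEP = 8.5       # cost multiplier per PR level (post says 8-9x)

def juice_for_pr(pr):
    """Juice needed to reach a given toolbar PR level."""
    return PR1_COST * STEP ** (pr - 1)

def pr1_pages_creatable(juice):
    """How many PR1 pages this page's outbound links could support,
    assuming ~90% of its juice is passable."""
    return int(juice * 0.9 // PR1_COST)

fresh_pr2 = juice_for_pr(2)        # a page that just became PR2
almost_pr3 = juice_for_pr(3) - 1   # a page on the verge of PR3
# a fresh PR2 supports ~7 PR1 pages; a near-PR3 page supports ~65,
# which is roughly the 7-to-75 range quoted above
```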
-
Using both is the way to go. A silo essentially means structuring your pages so that, per topic, there is one master article and multiple supporting articles that link back to the master article. Pages within the topic only link to pages relevant to that topic, not to other sections of the site.
You can use breadcrumbs in conjunction with a silo as the structure is suitable for them.
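The master/supporting structure described above can be expressed as a small link-graph check: declare which pages belong to which silo, then flag any internal link that crosses silo boundaries. The page names here are made up for illustration:

```python
# Minimal sketch of a silo audit: map pages to silos, then flag
# internal links that cross silo boundaries (and so leak link juice
# under the model above). Page names are hypothetical.

silos = {
    "lawnmowers": {"master": "lawnmowers", "support": ["fubar-2000", "toecutter-300"]},
    "plywood":    {"master": "plywood",    "support": ["birch-ply", "marine-ply"]},
}

# Internal links as (source_page, target_page) pairs
links = [
    ("fubar-2000", "lawnmowers"),    # support -> master: good
    ("toecutter-300", "lawnmowers"), # support -> master: good
    ("birch-ply", "plywood"),        # support -> master: good
    ("fubar-2000", "plywood"),       # crosses silos: flag it
]

def silo_of(page):
    """Return the silo a page belongs to, or None if unassigned."""
    for name, s in silos.items():
        if page == s["master"] or page in s["support"]:
            return name
    return None

# Links whose endpoints sit in different silos
leaks = [(src, dst) for src, dst in links if silo_of(src) != silo_of(dst)]
```

Cross-silo links aren't forbidden (as noted above, they can help users); the point is knowing where they are so the leaks are deliberate.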