Silo vs breadcrumbs in 2015
-
Hi, I've heard silos mentioned in the past as a way to help with rankings. Does this still apply?
And what about breadcrumbs? Do I use them with the silo technique or instead of it? Which do you think is better, or should I stop using either of them after the recent Google updates?
-
Great, thanks. I'll give that a go.
-
It's been a while since I've used WP, but if you use posts (or posts and pages), you will have a major silo and duplicate content problem with blog category pages.
The way to solve this is to go to the section where you set up your post categories and set the slug to be identical to your category page. For example, if you have a page with the slug "blue-widgets", set the post category slug to "blue-widgets" as well. This makes the category page the parent for posts in that category.
There are also some adjustments you will need to make to your URLs, removing "/category/" from them. I've done it, and it's pretty easy. Maybe another poster could give you the specifics.
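Not a WordPress-specific snippet, but if you want a quick way to sanity-check the result afterwards, here's a rough Python sketch (the URLs in it are made up) that flags posts still carrying the "/category/" base or not sitting underneath a silo landing page:

```python
from urllib.parse import urlparse

# Hypothetical URLs -- swap in your own sitemap or crawl export.
SILO_PAGES = {
    "blue-widgets": "https://example.com/blue-widgets/",
}
POST_URLS = [
    "https://example.com/blue-widgets/best-blue-widget-2015/",   # good: nested under the silo page
    "https://example.com/category/blue-widgets/cheap-widgets/",  # bad: /category/ base still present
]

def check_silo_urls(silo_pages, post_urls):
    """Flag post URLs that still carry the /category/ base or that
    don't sit underneath any silo landing page."""
    for url in post_urls:
        path = urlparse(url).path
        if "/category/" in path:
            print(f"FIX: {url} still uses the /category/ base")
            continue
        if any(path.startswith(urlparse(p).path) for p in silo_pages.values()):
            print(f"OK:  {url}")
        else:
            print(f"CHECK: {url} is not nested under any silo landing page")

if __name__ == "__main__":
    check_silo_urls(SILO_PAGES, POST_URLS)
```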
-
Great, thanks, very informative reply. I've started using WordPress for most of my sites now. Is siloing easy enough to do in WordPress?
-
Silos will always work. It's not some trick - it's how Google works. Here's a very simplified explanation as to why...
Let's say that I have an eCommerce site, and I sell lawnmowers and plywood. Let's also say that the lawnmower category page has a theoretical 100 points of link juice, and that the site sells two lawnmowers: the Fubar 2000 and the Toecutter 300. If the lawnmower category page links only to the Fubar 2000 and Toecutter 300 pages, it will push 45 points of link juice to each page (pages can pass on roughly 90% of their link juice, and 90/2 = 45).
Both pages will receive almost the full 45-point benefit because they are relevant to the category page.
If the lawnmower category page instead has only one link, to the plywood page, it would push all 90 points of link juice to the plywood page. But the plywood page would not receive the full benefit of those 90 points, because lawnmowers and plywood don't share much relevance. In this case, Google would heavily discount the 90 points, so the plywood page might only get the benefit of 30 of them. Think of it as a leaky hose.
What happens to the other 60 points of link juice? They get dumped on the floor, and the site loses their ranking power.
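If numbers help, here's that toy model as a rough Python sketch. To be clear, the 90% pass-through and the relevance discount are made-up illustration figures from this post, not anything Google has published:

```python
def distribute_link_juice(source_juice, links, damping=0.90):
    """Toy model: a page passes on ~90% of its juice, split evenly across
    its outbound links; each target keeps a share scaled by its topical
    relevance (1.0 = fully relevant), and the rest is lost."""
    passed = source_juice * damping
    per_link = passed / len(links)
    received = {page: per_link * relevance for page, relevance in links.items()}
    spilled = passed - sum(received.values())  # the juice "dumped on the floor"
    return received, spilled

# Scenario 1: lawnmower category page links to its two relevant products.
received, spilled = distribute_link_juice(100, {"Fubar 2000": 1.0, "Toecutter 300": 1.0})
print({k: round(v) for k, v in received.items()}, round(spilled))  # {'Fubar 2000': 45, 'Toecutter 300': 45} 0

# Scenario 2: lawnmower category page links only to the off-topic plywood page.
received, spilled = distribute_link_juice(100, {"Plywood": 1 / 3})
print({k: round(v) for k, v in received.items()}, round(spilled))  # {'Plywood': 30} 60
```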
Keep in mind that this is all theoretical, and that link juice comes in different flavors like apple, orange, and prune, representing the different ranking factors (trust, authority, topical authority, social signals, etc.). Orange might get discounted 90% while prune might only get discounted 10%. So is there really a 67% link juice hit in this case? Damned if I know, but I had to pick a number. What I do know is that link juice loss between pages that aren't relevant is dramatic, and that it is very possible to influence how your internal pages rank through your internal link structure and link placement on the page.
After siloing a website, I have seen rankings jump dramatically. Most websites hemorrhage link juice; think of siloing as link juice reclamation. The tighter you can build your silos, the less link juice gets dumped on the floor, and by reclaiming the spilled link juice and putting it in the right places, you can dramatically increase your rankings. BTW, inbound links work in a similar fashion: if the lawnmower page were on an external site and linked to the plywood page, the same discounts would apply. That's why it pays to get niche-relevant backlinks for maximum benefit.
This in no way accounts for usability, and linking between silos can make sense to benefit end users. Again, this model is probably oversimplified and doesn't take block-level analysis into account, but the logic is sound. You can build spreadsheet models of link juice distribution that factor in block-level weighting, discounts, etc. They're by no means accurate, but they can give you a pretty good idea of where your link juice is going. You can model this on the old (and increasingly irrelevant) PageRank algorithm. PageRank is logarithmic, and it takes 8-9x as much link juice to move up a PR level: if it takes 100 points of link juice to become a PR1, it takes 800-900 points to become a PR2. Generally speaking, a PR2 page can, via links, create roughly 7 to 75 PR1 pages, depending on how close the PR2 is to becoming a PR3.
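And a back-of-the-napkin sketch of that last logarithmic bit, again with made-up numbers (the 100-point PR1 threshold, the 8.5x step between PR levels, and the 90% pass-through are all assumptions for illustration):

```python
def pr_threshold(pr, base=100, step=8.5):
    """Toy model: juice needed to reach a given toolbar PR, assuming
    each PR level costs roughly 8-9x the previous one."""
    return base * step ** (pr - 1)

def pr1_pages_creatable(juice, damping=0.90, base=100):
    """How many PR1 pages this much juice could theoretically create,
    if ~90% of it can be passed on via links."""
    return int(juice * damping // base)

low_pr2 = pr_threshold(2)        # just scraped into PR2
high_pr2 = pr_threshold(3) - 1   # right on the edge of PR3
print(pr1_pages_creatable(low_pr2))   # around 7
print(pr1_pages_creatable(high_pr2))  # around 65
```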
-
Both is the way to go. Silos essentially mean structuring your pages so that, per topic, there is one master article and multiple supporting articles that link back to it. Pages within a topic only link to other pages relevant to that topic, not to other sections of the site.
You can use breadcrumbs in conjunction with a silo, as the structure is well suited to them.
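To picture it, here's a rough sketch of the internal-link pattern for a single silo (page names reuse the lawnmower example from the other answer and are entirely made up); the breadcrumb trail simply mirrors the same hierarchy, which is why the two play nicely together:

```python
# Hypothetical silo: one master page per topic, supporting articles link
# up to the master (and optionally to each other), and nothing links
# sideways into an unrelated silo.
silo = {
    "/lawnmowers/": {                     # master / category page
        "links_to": ["/lawnmowers/fubar-2000/", "/lawnmowers/toecutter-300/"],
    },
    "/lawnmowers/fubar-2000/": {          # supporting article
        "links_to": ["/lawnmowers/", "/lawnmowers/toecutter-300/"],
        "breadcrumb": ["Home", "Lawnmowers", "Fubar 2000"],
    },
    "/lawnmowers/toecutter-300/": {       # supporting article
        "links_to": ["/lawnmowers/", "/lawnmowers/fubar-2000/"],
        "breadcrumb": ["Home", "Lawnmowers", "Toecutter 300"],
    },
}

# Quick check that no page in this silo links outside it.
pages = set(silo)
for page, meta in silo.items():
    leaks = [target for target in meta["links_to"] if target not in pages]
    print(page, "leaks outside the silo:" if leaks else "stays inside the silo", leaks or "")
```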
Related Questions
-
How To Implement Breadcrumbs
Hi, I'm looking to implement breadcrumbs for an e-commerce store so they will appear in the SERPs like the attached image. In terms of implementing them on a site, would you simply add HTML markup to each page like the Google example, which displays as: Books › Science Fiction Award Winners? Then is there anything you need to do to get this showing in the SERPs, e.g. something in Search Console, or do you just wait until Google has crawled the pages and hopefully starts showing them? Cheers. [Image: SERP results with breadcrumbs]
Intermediate & Advanced SEO | jaynamarino
-
Breadcrumbs not displaying on Google
Hello, we have set up breadcrumbs on some of our pages (example: https://www.globecar.com/en/car-rental/locations/canada/qc/montreal/airport-yul) for testing purposes, and for some reason they are still not showing up on Google: http://screencast.com/t/BSHQqkP69r6F. Yet when I test the page with Google's Structured Data Testing Tool, everything looks good: http://screencast.com/t/Fzlz3zae. Any ideas? Thanks, Karim
Intermediate & Advanced SEO | GlobeCar
-
Link cloaking in 2015. Is it a bad idea now?
Hi everyone, I run a travel-related website and work with various affiliate partners. We have thousands of pages of well-written and helpful content, and many of these pages link off to one of our affiliates for booking purposes. Years ago I followed the prevailing wisdom and cloaked those links (bouncing them into a folder that was blocked in the robots.txt file, then redirecting them off to the affiliate), basically doing as Yoast has written: https://yoast.com/cloak-affiliate-links/. However, that seems kind of spammy and manipulative these days. Doesn't Google talk about not trying to manipulate links and redirect users? Could I just "nofollow" these links instead and drop the whole redirect charade? Could cloaking actually work against you? Thoughts? Thanks.
Intermediate & Advanced SEO | TomNYC
-
Removal tool - no option to choose mobile vs desktop. Why?
Google's removal tool doesn't give a person the option to tell them which index (mobile-friendly, or desktop/laptop) the URL should be removed from. Why? I may have a fundamental misunderstanding. The way I thought it works is that when you have a dynamically generated page based on the user agent (i.e., the same URL but different formatting for smartphones than for desktop/laptop), the Google mobile bot will index the mobile-friendly version and the desktop bot will index the desktop version, so Google will have two different indexed results for the same URL. That SEEMS to be validated by the words 'mobile-friendly' appearing next to some of my mobile-friendly page descriptions on mobile devices. HOWEVER, if that's how it works, why would Google not allow a person to remove one of the URLs and keep the other? Is it because Google thinks a mobile version of a website must have all of the same pages as the desktop version? What if it doesn't? What if a website is designed so that some of the slower pages simply aren't given a mobile version? Is it possible that Google doesn't really save results for a mobile-friendly page if there is a corresponding desktop page, but only checks to see if it renders OK? That is, does it keep only one indexed copy of each URL, and basically assume the mobile title and actual content are the same and only the formatting is different? That assumption isn't always true (mobile devices lend themselves to different interactions with the user), but it certainly could save Google billions of dollars in storage. Thoughts?
Intermediate & Advanced SEO | friendoffood
-
Cross-domain rel canonical tags vs. rel canonical tags for internal webpages
Today I noticed that one of my colleagues was pointing rel canonical tags to a third-party domain on a few specific pages of a client's website, using a standard rel canonical tag. Up to this point I haven't seen too many webmasters point a rel canonical at a third-party domain. However, after doing some reading on the Google Webmaster Tools blog, I realized that cross-domain rel canonicals are indeed a viable strategy to avoid duplicate content. My question is this: should rel canonical tags be written the same way when dealing with internal duplicate content vs. external duplicate content? Would a rel=author tag be more appropriate when addressing duplicate content issues with third-party websites? Any feedback would be appreciated.
Intermediate & Advanced SEO | VanguardCommunications
-
Competitor sites vs mine - No links, lower DA, and still beating me.
Thank you for taking the time to read my question. I have a website - berneseoftherockies.com - a Bernese Mountain Dog website. My competitors are Rockymountainpuppies(dot)com and Coloradobernesemountaindog(dot)com. When using the Moz tools, I see they have no incoming links, except that one site has 5 links from its own pages. But when I type in "Bernese Mountain Dogs Colorado", I am nowhere to be found, except for a YouTube video. So what am I doing so wrong? They are basically doing nothing, and killing me in the SERPs. I have the social media stuff covered: Google+, Facebook, Twitter, Pinterest, and YouTube. They are still behind the times. So any thoughtful advice is appreciated. I mainly cater to the state of Colorado, where I live. I'm just curious if there is something off the top of your head that could be causing my issues. Could it be my hosting? Can you have a blacklisted host? I am with Hostdime. I did have a few foreign backlinks, like 10, which I removed or disavowed (I think it's called). I have used the title tag tools here to get properly sized title tags and decent keyword density. I built the site for people first, then Google, etc. I'm not sure if you are allowed to tell me, but maybe you can advise me on a decent SEO company, or give me a couple of tips that may help me out. Please no "read the Moz book" - I am reading it and trying to do what it says. But maybe something simple is keeping me from showing up while these other sites are. Thank you so much for any advice.
Intermediate & Advanced SEO | Berner
-
SEO implications of serving a different site on HTTPS vs. HTTP
I have two sites: Site A and Site B. Both sites are hosted on the same IP address and server, using IIS 7.5. Site B has an SSL cert, and Site A does not. It has recently been brought to my attention that when requesting the HTTPS version of Site A (the site without an SSL cert), IIS will serve Site B... Our server has been configured this way for roughly a year. We don't do any promotion of Site A using HTTPS URLs, though I suppose somebody could accidentally link to or type in HTTPS and get the wrong website. Until we can upgrade to IIS 8 / Windows Server 2012 to support SNI, it seems I have two reasonable options: 1) Move Site B over to its own dedicated IP, and let HTTPS requests for Site A 404. 2) Get another certificate for Site A, and have its HTTPS version 301 redirect to the HTTP/non-SSL version. Option 1 seems preferable, as we don't really need an SSL cert for Site A, and HTTPS doesn't really have any SEO benefits over HTTP for us. However, I'm concerned we may have done SEO damage to Site A by letting our configuration sit this way for so long. I could see Googlebot trying HTTPS versions of websites to test whether they exist, even if there aren't any SSL/HTTPS links for the given domain in the wild... in which case option 2 would seem to mostly reverse any damage done (if any). Though Site A seems to be indexed fine; no concerns other than my gut. Does anybody have any recommendations? Thanks!
Intermediate & Advanced SEO | dsbud
-
Sitelinks (breadcrumbs) in SERPs
Hi there, I have a .co.uk and a .ie website. Both have exactly the same content; the only differences are that the UK website sells the product in pounds while the Irish website sells in euros, and the two websites have different contact numbers. I decided to use rel canonical on the .ie site pointing to the .co.uk website, as I think it was causing an issue in the SERPs for the .co.uk website in Google.co.uk. Anyway, since doing this I am seeing strange things happening in the SERPs for my keywords. For example, if you click the link below, my website is number 2 for 'hot flushes', but if you hover over or click on 'health' or 'menopause' in the breadcrumbs in the SERPs, it takes you to the .co.uk website. Is this normal? Click here
Intermediate & Advanced SEO | Paul78