What do you think of Theme pyramids for SEO?
-
Hi,
I've just been reading up on theme pyramids. I have seen these before, but I found a good article on the subject that goes into quite some detail.
http://www.canonicalseo.com/theme-pyramids/
The word 'pyramid' does scream black hat to me, but looking at the structure, this must be the best way to handle internal linking.
Even the keyword structure looks good.
Example:
homepage - shoes
category - red shoes
sub category - size 7 red shoes
Building anchor text links for shoes, red shoes or size 7 red shoes will benefit all 3 terms.
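A minimal sketch of the keyword pyramid in the example above, using an invented nested-dict site map (the page names and helper are mine, just for illustration). The point: a link targeting the deepest page sits under every themed ancestor, which is why all three terms can benefit.

```python
# Hypothetical site map mirroring the shoes example above.
site = {
    "shoes": {                        # homepage
        "red shoes": {                # category
            "size 7 red shoes": {},   # sub-category
        },
    },
}

def theme_chain(tree, target, path=()):
    """Return the chain of themed pages from the root down to `target`."""
    for page, children in tree.items():
        if page == target:
            return path + (page,)
        found = theme_chain(children, target, path + (page,))
        if found:
            return found
    return ()

# An anchor-text link to the deepest page reinforces all three themes:
chain = theme_chain(site, "size 7 red shoes")
# chain == ("shoes", "red shoes", "size 7 red shoes")
```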
Negative/Positive comments please.
-
I think this video explains a lot:
http://www.seomoz.org/blog/whiteboard-friday-flat-site-architecture
-
OK, let me get this right so I don't have to ask again.
Example Layout:
http://www.seoconsult.com/seoblog/wp-content/uploads/2011/05/Flat-Architecture1.jpg
If pages on level 3 only linked back to their parent page on level 2, that would be a pyramid.
If every page linked to every other page, that would be flat.
Usually in e-commerce websites you can't get to an individual product in one click from the homepage; you have to travel through categories and sub-categories, so most e-commerce sites must not be flat?
-
No, flat means that every page is linked from the home page and links back to the home page; this is the optimal link structure for PageRank. But as mentioned, you need to think of your users.
If you make every page link to every other page, then you keep things pretty equal, when really you want certain pages to have more PageRank than others.
Make sure your important pages are linked from your homepage if possible; from there, I would think about what's needed for your users.
Everyone's content is different, and there is no hard-and-fast rule that fits every site.
-
Yes, every site potentially has a logical hierarchy to it (more than one, in most cases) that could make sense for both visitors and SEO. It's really the basis of all information architecture, in a sense.
In SEO, we usually refer to a "flat" architecture as an ideal where the home-page would link to every page on the site and every page would only be one step away. Of course, in practice, this can lead to unusable sites and massive dilution of internal PR. It's great for a 10-page site, but not for a 10,000-page site.
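A toy simulation of the trade-off described above: a "flat" site (home links to every page, every page links home) versus a "pyramid" (home links to categories, categories to products, children linking back to their parent). The 13-page layout and page names are invented for illustration, and this is the textbook PageRank iteration, not Google's actual algorithm.

```python
DAMPING = 0.85

def pagerank(links, n_iter=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(n_iter):
        new = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outs in links.items():
            share = DAMPING * rank[page] / len(outs)
            for out in outs:
                new[out] += share
        rank = new
    return rank

cats = [f"cat{i}" for i in range(3)]
prods = [f"prod{i}" for i in range(9)]

# Flat: home <-> everything, so all 12 inner pages end up equal.
flat = {"home": cats + prods, **{p: ["home"] for p in cats + prods}}

# Pyramid: home <-> 3 categories, each category <-> 3 products.
pyramid = {"home": list(cats)}
for i, cat in enumerate(cats):
    kids = prods[3 * i:3 * i + 3]
    pyramid[cat] = ["home"] + kids
    pyramid.update({k: [cat] for k in kids})

flat_pr, pyramid_pr = pagerank(flat), pagerank(pyramid)
# In the pyramid, categories concentrate far more PageRank; in the flat
# layout, every inner page gets an identical, diluted share.
```

Running this shows the dilution point concretely: in the flat layout every category and product page ends up with the same rank, while the pyramid pushes several times as much weight onto the category pages.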
-
Thanks for your reply,
OK Peter, so you're actually saying the 'idea' of the pyramid does make logical sense.
Every website has a hierarchy and this can be produced for bots and users by introducing some kind of pyramid linking structure?
When you say 'flat architecture' I take it you mean where every page links to every page so every page looks equal?
-
Absolutely (I actually thumbed up your comments). It's good to be aware of internal PR flow, and it IS important. It's just easy to go crazy.
-
I agree; I do not suggest anyone go out and link for the sole reason of PageRank (admittedly, I did just that when I first read the algorithm), only that it should be understood and considered when linking.
It is amazing how good a linking structure you come up with when you link naturally. In fact, I have to say that a lot of good SEO occurs when doing things naturally. I think SEO today is more about what not to do than what to do. It is hard to beat the search engines, but you can make sure you're not doing yourself harm.
-
I think this is really just an extension of site/information architecture in general - to some degree, a logical structure is good for people and bots. I also think there's no "right" answer when it comes to this kind of structure vs. a "flat" architecture. As Alan said, a flat architecture isn't usually practical on big sites, but I think it goes deeper. A flat architecture implies that all the pages on your site have equal weight. That's rarely true. Driving internal link-juice to major categories and drilling down focuses the most weight on the top.
Now, you can overdo it. I think the article you cite goes a little too far for these days, because if you apply that to every situation, you're going to end up with a ton of thin content. Post-Panda, creating hundreds of deep pages just to target 3-4 word phrases could backfire. Eventually, you're going to run out of content for those pages. So, I wouldn't create a pyramid frame and then start looking for bricks. Start with your pile of bricks and see what kind of pyramid you can make out of it. Good information architecture starts with the information you have.
I also tend to lean toward hybrid approaches. For example, you can set up a pyramid but then also link to your Top 10 Products from your home-page. That flattens your architecture for those key products and sends link-juice deep into your structure. There are a lot of useful variations on that theme.
-
No, nofollow links use up as much PageRank as any other links; it just doesn't go anywhere, so it is wasted. You should never use nofollow on internal links; it's just a waste of link juice.
Nofollows produce no gain. Their only use is when you link to a dodgy site and want to tell Google that you are not passing link juice to that site, that you do not vouch for them.
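A quick sketch of the "wasted link juice" claim above, under Google's post-2009 treatment as described by Matt Cutts: nofollow links still count in a page's outlink total, so each followed link passes a smaller share, and the nofollow share is simply lost. The function and numbers are illustrative only, not Google's actual algorithm.

```python
DAMPING = 0.85

def passed_per_followed_link(page_rank, followed, nofollow):
    """PageRank each *followed* link passes when nofollow links
    still count in the denominator."""
    return DAMPING * page_rank / (followed + nofollow)

# A page with rank 1.0 and 10 outbound links:
all_followed = passed_per_followed_link(1.0, followed=10, nofollow=0)
half_nofollow = passed_per_followed_link(1.0, followed=5, nofollow=5)

# Each followed link passes the same share either way; with 5 nofollow
# links, the 5 unused shares of passable PageRank simply evaporate.
evaporated = DAMPING * 1.0 - 5 * half_nofollow
```

In other words, marking half your internal links nofollow doesn't redirect that PageRank to the remaining links; it just throws it away.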
-
Thanks,
But can this not be controlled with the nofollow tag?
-
If you could, you would want to link to every page from your home page and link back to the home page from every page, with no other linking. But a limit on the number of links per page (http://thatsit.com.au/seo/reports/violation/the-page-contains-too-many-hyperlinks) and the fact that it may not be friendly to your users do not allow this. Still, this is the best linking structure; have a read of this page for a clearer explanation: http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank