What do you think of Theme pyramids for SEO?
-
Hi,
Just been reading up on theme pyramids. I have seen these before, but I found a good article on the subject that goes into quite some detail.
http://www.canonicalseo.com/theme-pyramids/
Using the word 'pyramid' does scream black hat to me, but looking at the structure, this must be the best approach to internal linking.
Even the keyword structure looks good.
Example:
homepage - shoes
category - red shoes
sub category - size 7 red shoes
Building anchor text links for 'shoes', 'red shoes', or 'size 7 red shoes' will benefit all three terms.
Negative/Positive comments please.
-
I think this video will explain a lot:
http://www.seomoz.org/blog/whiteboard-friday-flat-site-architecture
-
OK, let me get this right so I don't have to ask again.
Example Layout:
http://www.seoconsult.com/seoblog/wp-content/uploads/2011/05/Flat-Architecture1.jpg
If pages on level 3 only linked back to their parent page on level 2, that would be a pyramid.
If every page linked to every other page, that would be flat.
Usually in eCommerce websites you can't get to an individual product in one click from the homepage; you have to travel through categories and sub-categories, so most eCommerce sites must not be flat?
-
No, flat means that every page is linked from the home page and links back to the home page; this is the optimal link structure for PageRank. But as mentioned, you need to think of your users.
If you make every page link to every other page, then you keep things pretty equal, when really you want certain pages to have more PageRank than others.
Make sure your important pages are linked from your homepage if possible; from there, I would think about what's needed for your users.
Everyone's content is different, and there is no hard and fast rule that fits every site.
-
Yes, every site potentially has a logical hierarchy to it (more than one, in most cases) that could make sense for both visitors and SEO. It's really the basis of all information architecture, in a sense.
In SEO, we usually refer to a "flat" architecture as an ideal where the home-page would link to every page on the site and every page would only be one step away. Of course, in practice, this can lead to unusable sites and massive dilution of internal PR. It's great for a 10-page site, but not for a 10,000-page site.
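The dilution effect is easy to see in a toy model. Below is a minimal PageRank sketch comparing a flat structure with a small pyramid. The page names are hypothetical and the model is heavily simplified (a bare power iteration over internal links only); real PageRank involves far more than this.

```python
# Toy comparison of internal PageRank flow: a "flat" site (home links to
# every page, every page links home) vs. a small 3-level pyramid.

def pagerank(links, damping=0.85, iters=100):
    """Basic power-iteration PageRank over a dict {page: [outlinks]}."""
    n = len(links)
    pr = {p: 1.0 / n for p in links}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in links}
        for page, outs in links.items():
            share = damping * pr[page] / len(outs)
            for out in outs:
                new[out] += share
        pr = new
    return pr

# Flat: home <-> each of six pages, nothing else
flat = {"home": ["p1", "p2", "p3", "p4", "p5", "p6"]}
flat.update({p: ["home"] for p in flat["home"]})

# Pyramid: home -> two categories -> three products each,
# children linking back to their parent
pyramid = {
    "home": ["cat1", "cat2"],
    "cat1": ["home", "a1", "a2", "a3"],
    "cat2": ["home", "b1", "b2", "b3"],
}
pyramid.update({p: ["cat1"] for p in ["a1", "a2", "a3"]})
pyramid.update({p: ["cat2"] for p in ["b1", "b2", "b3"]})

for name, graph in [("flat", flat), ("pyramid", pyramid)]:
    pr = pagerank(graph)
    print(name, {p: round(pr[p], 3) for p in sorted(pr)})
```

In the flat version every non-home page comes out equal; in the pyramid, the category pages soak up the most PageRank, which is the "focus the weight near the top" effect discussed in this thread.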
-
Thanks for your reply,
OK Peter, so you're actually saying the 'idea' of the pyramid does make logical sense.
Every website has a hierarchy and this can be produced for bots and users by introducing some kind of pyramid linking structure?
When you say 'flat architecture' I take it you mean where every page links to every page so every page looks equal?
-
Absolutely (I actually thumbed up your comments). It's good to be aware of internal PR flow, and it IS important. It's just easy to go crazy.
-
I agree; I do not suggest anyone go out and build links for the sole reason of PageRank (admittedly, I did just that when I first read the algorithm), only that it should be understood and considered when linking.
It is amazing how good a linking structure you come up with when you link naturally. In fact, I have to say that a lot of good SEO occurs when doing things naturally. I think SEO today is more about what not to do than what to do. It is hard to beat the search engines, but you can make sure you're not doing yourself harm.
-
I think this is really just an extension of site/information architecture in general - to some degree, a logical structure is good for people and bots. I also think there's no "right" answer when it comes to this kind of structure vs. a "flat" architecture. As Alan said, a flat architecture isn't usually practical on big sites, but I think it goes deeper. A flat architecture implies that all the pages on your site have equal weight. That's rarely true. Driving internal link-juice to major categories and drilling down focuses the most weight on the top.
Now, you can overdo it. I think the article you cite goes a little too far these days, because if you apply that to any situation, you're going to end up with a ton of thin content. Post-Panda, creating hundreds of deep pages just to target 3-4 word phrases could backfire. Eventually, you're going to run out of content for those pages. So, I wouldn't create a pyramid frame and then start looking for bricks. Start with your pile of bricks and see what kind of pyramid you can make out of it. Good information architecture starts with the information you have.
I also tend to lean toward hybrid approaches. For example, you can set up a pyramid but then also link to your Top 10 Products from your home-page. That flattens your architecture for those key products and sends link-juice deep into your structure. There are a lot of useful variations on that theme.
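That hybrid effect can be checked mechanically: a quick breadth-first search (page names hypothetical) shows how one extra home-page link cuts a featured product's click depth.

```python
from collections import deque

def click_depth(links, start="home"):
    """BFS click depth of every page reachable from the homepage."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

# Pure pyramid: every product sits two clicks deep
tree = {
    "home": ["cat1", "cat2"],
    "cat1": ["prod1", "prod2"],
    "cat2": ["prod3", "prod4"],
}
print(click_depth(tree))

# Hybrid: also link one featured product straight from the home page
hybrid = dict(tree, home=tree["home"] + ["prod4"])
print(click_depth(hybrid))
```

The featured product drops from depth 2 to depth 1 while the rest of the pyramid is untouched, which is exactly the "flatten it for key products" idea.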
-
No, nofollowed links use up as much PageRank as any other links; it just doesn't go anywhere, so it is wasted. You should never use nofollow on internal links; it's just a waste of link juice.
Nofollows produce no gain. The only use is when you link to a dodgy site: you want to tell Google that you are not passing link juice to this dodgy site and do not vouch for it.
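The "wasted juice" point can be sketched with simple arithmetic. This is a simplified model of Google's publicly described post-2009 nofollow treatment (the real algorithm is unpublished): nofollowed links still count in the divisor, so their share evaporates rather than boosting the remaining links.

```python
# Simplified model: a page passes (damping * PR / total_links) through
# each followed link; the share assigned to nofollowed links is lost.
DAMPING = 0.85

def passed_per_link(page_pr, total_links, nofollowed=0):
    """Return (PR passed per followed link, total PR passed on)."""
    share = DAMPING * page_pr / total_links
    followed = total_links - nofollowed
    return share, followed * share

# Page with PR 1.0 and 10 outgoing links, none nofollowed:
print(passed_per_link(1.0, 10))

# Nofollow 5 of them: each remaining link STILL passes the same amount;
# the nofollowed half of the juice is simply lost.
print(passed_per_link(1.0, 10, nofollowed=5))
```

In other words, nofollowing internal links doesn't redirect PageRank to the links you care about; it just throws part of it away.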
-
Thanks,
But can this not be controlled by the nofollow tag?
-
If you could, you would want to link to every page from your home page and link back to the home page from every page, with no other linking. But a limit on the number of links per page (http://thatsit.com.au/seo/reports/violation/the-page-contains-too-many-hyperlinks) and the fact that it may not be friendly to your users do not allow this. Still, this is the best linking structure. Have a read of this page for a clearer explanation: http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank