What is the difference between a bunch of microsites and a link network?
-
Hello SEO community. I have started an online marketing company that focuses on a specific niche, and I have been researching how microsites can be beneficial for SEO. For example, the "Nifty" presentation mentioned that microsites are going to be key for local SEO.
However, I have also heard that link networks are increasingly risky and are penalized by the Panda update.
While we are writing good, original content for our clients, I like the microsites because:
- URLs - we can choose URLs for the main keywords
- Content Focus - we can focus on specific content
- Ranking - these sites seem to rank pretty well
- Citations - we are able to give citations for our clients from these sites
But I am worried: am I creating a link network? Even though I am putting out useful, good content, is this hurting me more than helping me? Should I give up on this strategy or continue? Help!
-
I'd add that the critical factor is "will this microsite stand on its own?" Meaning: is there enough unique content, value, and trust for this site that it deserves to rank for its own phrases?
If the microsites share most or all of their keywords or content with the main site, if the quality of a microsite is poor, or if it is only filler content purely for linking, there really is no difference between microsites and a link network.
So if you want to go the route Charles describes (one I too have used with great success many times for many clients), be willing to achieve those goals. If you think you can "get away" with artificially faking it, you may get away with it for a while. Even if you do, though, all it takes is one manual review or an algorithm update to wipe out all of that work, and possibly even flag your site in a way that would require four times as much effort to recover.
Related Questions
-
Can I safely assume that links between subsites on a subdirectory-based multisite will be treated by Google as internal links within a single site?
I am building a multisite network based on subdirectories (of the mainsite.com/site1 kind), where the main site is a company site and the subsites are focused on brands or projects of that company. There will be links back and forth between the main site and the subsites, as if the subsites were just categories or pages within the main site (they are hosted in subfolders of the main domain, after all).

Now, Google's John Mueller has said: "As far as their URL structure is concerned, subdirectories are no different from pages and subpages on your main site. Google will do its best to identify where sites are separate, but the URL structure is the same as for a single site, so for SEO purposes you should assume that the network will be treated as one site."

This sounds fine to me, except for the part "Google will do its best to identify where sites are separate", because if Google establishes that my multisite structure is actually a collection of different sites, links between the subsites and the main site would be considered backlinks between my own sites, which could therefore be considered a link wheel, that is, a kind of linking structure Google doesn't like. How can I make sure that Google understands my multisite as a single site?

P.S. The reason I chose this multisite structure, instead of hosting brands in categories of the main site, is that the subdirectory-based multisite feature will let me map a TLD domain to any of my brands (subsites) whenever I choose to give that brand a more distinct profile, as if it really were a different website.
Web Design | PabloCulebras
URL & Link Hierarchy - juice flow direction from backlinks?
Our site is very regional, so we focus all of our SEO efforts on these region landing pages. For example: domain.com/toys/us/ca/san-francisco. We added an informational page (e.g. reviews) and gave it a URL like this: domain.com/toys/us/ca/san-francisco/reviews. Question: will external backlinks to domain.com/toys/.../reviews pass any link juice value to its hierarchical parent page, domain.com/toys/us/ca/san-francisco?
Web Design | 42Floors
Looking for a developer with Network Solutions platform experience
Looking for a developer with Network Solutions platform experience. 714-744-1926. Tony Ashford
Web Design | OCFurniture
Subdomains, duplicate content and microsites
I work for a website that generates a large amount of unique, quality content. This website has had development issues with our web builder, though, and they are going to separate the site into different subdomains upon launch. It's a scholarly site, so the subdomains will be topics like history and science. Don't ask why we aren't using subdirectories, because trust me, I wish we could. So we have to use subdomains, and I'm wondering a couple of things. Will the duplication of coding, since all subdomains will have the same design and look, heavily penalize us, and is there any way around that? Also, if we generate a good amount of high-quality content on each site, could we link all those sites to our other site as a possible benefit for link building? And finally, would footer links linking all the subdomains be a good thing to put in?
Web Design | mdorville
Pin It Button, Too Many Links, & a Javascript question...
One of the sites I work for has some massive on-page link problems. We've been trying to come up with workarounds to lower the number of links without making drastic changes to the page design, while staying within SEO best practices. We had originally considered the NoFollow route a few months back, but that's not viable. We changed some image and text links so they were wrapped together as one link instead of being two links to the same place, and we're currently running tests on some pages to see how else to handle the issue.

What has me stumped now is that the Pinterest Pin It button counts as an external link, and we've added it to every image in our galleries. Originally we found that a single Pin It button on a page was pulling incorrect images and not listing every possible image on the page, so to make sure a visitor can pin the exact picture they want, we added the button to everything. We've been seeing a huge uptick in Pinterest traffic, so we're definitely happy with that and don't want to get rid of the button. But if we have 300 pictures (which are all links) on a page, plus Pin It buttons (yet more links), we then have 600+ links on the page. Here's an example page: http://www.fauxpanels.com/portfolio-regency.php

When talking with one of my coders, he suggested some form of JavaScript might be capable of making the button into an event instead of a link, and that could be a way to keep the Pin It button while lowering on-page links. I'm honestly not sure how that would work, whether Google would still count it as a link, or whether that is some form of blackhat cloaking technique we should be wary of. Do any of you have experience with similar issues/tactics that could help me here? Thanks.

TL;DR: Too many on-page links. Coder suggests JavaScript "alchemy" to turn button links into events. Would this lower links, or is it bad, i.e. a form of cloaking?
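The coder's suggestion could be sketched roughly like this: build the standard Pinterest share URL in JavaScript and open it from a click handler on a plain element, so no `<a href>` to pinterest.com is rendered in the HTML. This is a hedged sketch, not Pinterest's official widget code; `buildPinUrl` and `attachPinHandler` are hypothetical helper names, and whether Google treats this differently from a link is exactly the open question in the post.

```javascript
// Sketch: generate the Pinterest "create pin" share URL instead of
// rendering hundreds of anchor tags. The query parameters (url, media,
// description) follow the publicly documented share-URL pattern.
function buildPinUrl(pageUrl, mediaUrl, description) {
  const params = new URLSearchParams({
    url: pageUrl,
    media: mediaUrl,
    description: description,
  });
  return "https://www.pinterest.com/pin/create/button/?" + params.toString();
}

// In the browser, wire the URL to a non-anchor element (e.g. a <span>
// styled as a button), so the markup contains an event, not a link.
function attachPinHandler(el, pageUrl, mediaUrl, description) {
  el.addEventListener("click", function () {
    window.open(
      buildPinUrl(pageUrl, mediaUrl, description),
      "pin",
      "width=750,height=550"
    );
  });
}
```

Whether this counts as cloaking is debatable either way: the visitor still reaches the same Pinterest page, but the crawler no longer sees a crawlable link, which is precisely the effect being asked about.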
Web Design | MikeRoberts
Do these links count as duplicate content?
If you do a Google search for the following term, it brings up 6 results. Are these considered duplicate content by Google? Also, if so, how do I prevent this while still offering other stories to readers of other articles? Google search term: site:yakangler.com Okuma helios
Web Design | mr_w
The primary search keywords for our news release network have dropped like a rock in Google... we are not sure why.
Hi, on April 11th, a month after the Farmer update was released for U.S. users of Google, the primary keywords for ALL our sites dropped significantly in Google. I have some ideas why, but I wanted to get some second opinions as well.

First off, I did some research into whether Google did anything on April 11th. They did: they implemented the Farmer update internationally. But that does not explain why our ranks did not drop in March for U.S. Google users, unless they rolled out their update based on where the domain is registered; in our case, Canada.

The primary news release site is www.hotelnewsresource.com, but we have many running on the same server, e.g. www.restaurantnewsresource.com, www.travelindustrywire.com, and many more. We were number 1, or had top ranks, for terms like "Hotel News", "Hotel Industry", "Hotel Financing", "Hotel Jobs", "Hotels for Sale", etc., and now for most of these we have dropped in a big way. It seems that Google has issued a penalty for every internal page we link to.

A couple of obvious issues with the current template we use: too many links, which we intend to change ASAP, though it has never been a problem before. The domain hotelnewsresource.com is 10 years old and still holds a PageRank of 6.

Secondly, the way our news system works, it's possible to access an article from any domain in the network. E.g., I can read an article that was assigned to www.hotelnewsresource.com on www.restaurantnewsresource.com. We don't post links to the irrelevant domain, but it does sometimes get indexed, so we are going to implement the Google source meta tag option.

The bottom line is that I think we put too much faith in the maturity of the domain, thinking that might protect us. That is not the case, and it's now a big mess. Any insight you can offer would be greatly appreciated. Do you think it was Farmer, or possibly something else?

Thanks, Jarrett
Web Design | jarrett.mackay
How is link juice split between navigation?
Hey all, I am trying to understand link juice as it relates to duplicate navigation. Take, for example, a site whose main navigation is contained in dropdowns with 50 links (fully crawlable and indexable); that navigation is then repeated in the footer of the page, so you have a total of 100 links with the same anchor text and URLs. For simplicity's sake, will the link juice be divided among those 100 links and passed to the corresponding pages, or does the "first link rule" still apply, so that only half of the link juice is passed?

What I am getting at is this: if there were only one navigation menu and the page was passing 50 link juice units, then each of the subpages would be passed 1 link juice unit, right? But if the menu is duplicated, the available link juice is divided by 100, so only 0.5 units are passed through each link. However, because there are two links pointing to the same page, is there a net of 1 unit?

We have several sites that do this for UX reasons, but I am trying to figure out how badly this could be hurting us in page sculpting and in passing juice to our subpages. Thanks for your help! Cheers.
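The arithmetic in the question can be written out explicitly. This is a sketch of the simple even-split model the poster describes (an assumption for illustration, not a published Google formula, and it ignores the "first link rule" the poster asks about):

```javascript
// Even-split model: a page passing `totalJuice` divides it equally
// across every crawlable link on the page; a target page then receives
// that per-link share multiplied by how many links point to it.
function juiceToTarget(totalJuice, linksOnPage, linksToTarget) {
  const perLink = totalJuice / linksOnPage;
  return perLink * linksToTarget;
}

// One 50-link menu, 50 units to pass: each subpage receives 1 unit.
// Menu duplicated in the footer (100 links, 2 per subpage): each link
// carries 0.5 units, but the two links net out to the same 1 unit.
```

Under this model the duplication is a wash; the open question is whether the first-link rule discards the second link's share, which would halve the net value instead.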
Web Design | prima-253509