Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies. More details here.
Microsite on subdomain vs. subdirectory
-
Based on this post from 2009, it's recommended in most situations to set up a microsite as a subdirectory as opposed to a subdomain. http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites. The primary argument seems to be that the search engines view the subdomain as a separate entity from the domain and therefore, the subdomain doesn't benefit from any of the trust rank, quality scores, etc. Rand made a comment that seemed like the subdomain could SOMETIMES inherit some of these factors, but didn't expound on those instances.
What determines whether the search engine will view your subdomain hosted microsite as part of the main domain vs. a completely separate site? I read it has to do with the interlinking between the two.
-
I think the footer is the best way to interlink the websites in a non-obtrusive way for users. This should make your main corporate site your top linking site to each subdomain - and this is something you should be able to verify in a tool like Google Webmaster Tools. I do not have any specific examples to support this, but this is a common web practice.
This is not 100% related, but Google recently suggested using footer links as one way to associate your web content with your Google profile account:
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=1408986
So you can figure that if Google looks to footer links to associate authorship, it would likely do the same to relate sites together.
-
Hi Ryan,
Your question is quite interesting. I went through the article one more time myself. I have no facts to back up the following, but I hope it will contribute. First, I would validate the subdomains in Webmaster Tools. If they are intended to hit a certain market, I would select that geographic location. Also, I think you have little to worry about. I imagine Google won't pass certain trust to subdomains, depending on the site. If the number of subdomains is considerable, I would say they have pretty slim chances of getting a push from the main site. Take free web hosting services, for example: a subdomain could rank and have decent PageRank if people show interest in that particular subdomain, but that is highly unlikely to be caused by the authority of the main site.
I haven't seen a free hosting subdomain rank well for a long time now. On the other hand, you have student and academic accounts on university sites. They all go with subfolders and rank pretty well for highly specific topics. If I have to give a short answer, I would say it is the type of site that makes the difference for Google. If your site is considered a casual business website and you are developing a new market, then you might not have a problem. If you use subdomains for specifying products, you might be fine as well.
Google uses subdomains for all their major products. For Google Pages they used a separate domain, which now redirects to a subdomain, sites.google.com. However, they will never give out subdomains for personal use; there might be something to that. They do a 301 redirect from a subdomain on googlepages.com to sites.google.com/site/. So what they offer is a 301 redirect to a sub-subfolder located on a subdomain of Google.
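To make that pattern concrete, here is a minimal sketch of the subdomain-to-subfolder mapping behind such a 301. This is my own illustration of the idea, not Google's actual code, and the helper name is hypothetical:

```python
from urllib.parse import urlparse

def map_legacy_url(url: str) -> str:
    """Hypothetical helper: map a username.googlepages.com URL to the
    sites.google.com/site/username/... address its 301 would point at."""
    parsed = urlparse(url)
    username = parsed.hostname.split(".")[0]  # the subdomain becomes the folder name
    return f"https://sites.google.com/site/{username}{parsed.path}"

print(map_legacy_url("http://example.googlepages.com/about"))
# -> https://sites.google.com/site/example/about
```

The server-side redirect itself would just emit a 301 status with this computed URL in the Location header.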
-
Ok. That makes sense. The way our company would use it is having microsites for specific, focused topics, each large enough to warrant its own site. They are clearly part of our overall brand, unlike the Disney properties example. On each of these sites, there will almost always be a link back to the main/corporate website, usually in the footer.
Do you think having one or two links on every page pointing back to company.com would be sufficient to notify search engines that the two are associated, and ultimately give some search value to the subdomain hosted microsite from the main domain?
Are there any studies or evidence supporting any of this?
-
Interlinking is definitely a factor - but content is what matters.
Take the Disney brands that live on Go.com:
They all live on Go.com but Google surely knows they are really separate sites that cover different topics. Same for any blogspot.com, typepad.com, etc. hosted blog. The millions of blogs there cover a wide range of topics and search engines understand that they are not related just because they share the same host domain.
On the other end of the spectrum, if your site has just two subdomains (say, www.website.com and blog.website.com) that cover the same topics and link to one another, search engines are more likely to associate the two.
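A quick way to see why host matching alone can't settle this: a naive check that treats the last two host labels as the "root" lumps every Go.com or blogspot.com property together, which is exactly the grouping search engines avoid. The sketch below is an assumption-laden illustration (a real tool would consult the Public Suffix List rather than blindly taking two labels):

```python
from urllib.parse import urlparse

def same_root_domain(url_a: str, url_b: str) -> bool:
    """Naive heuristic: compare the last two labels of each hostname.
    Misfires on shared hosts (blogspot.com, go.com), which is why
    engines weigh content and interlinking, not the host alone."""
    def root(url: str) -> str:
        return ".".join(urlparse(url).hostname.split(".")[-2:])
    return root(url_a) == root(url_b)

print(same_root_domain("https://www.website.com/", "https://blog.website.com/"))  # True
print(same_root_domain("https://espn.go.com/", "https://abcnews.go.com/"))        # True, yet separate sites
```

The second call returning True for two unrelated Disney properties is the failure mode: shared hosting makes the host an unreliable signal on its own.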
-
I don't have an answer to your question, but if you're looking for some more reading about subdomains vs. TLDs, here is a presentation given at MozCon: http://www.distilled.net/blog/seo/mozcon-international-seo/. The slideshow has some info about it, and a bunch of other good stuff.