"Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
-
Hey Moz,
New client has a site that uses:
subdomains ("third-level" stuff like location.business.com) and;
"fourth-level" subdomains (location.parent.business.com)
Are these fourth-level addresses at risk of being treated differently than the other subdomains? Screaming Frog, for example, doesn't return these fourth-level addresses when crawling business.com, except in the External tab. But maybe I'm just configuring the crawls incorrectly.
These addresses rank, but I'm worried that we're losing some link juice along the way. Any thoughts would be appreciated!
-
If you check out Rand's Intro to SEO SlideShare (http://www.slideshare.net/randfish/introduction-to-seo-5003433), slides 46 and 47 talk about URL structure and specifically sub-domains.
As Rob said, you do want to use sub-folder structures and avoid sub-domains. Hopefully you are old enough to remember when websites like lycos.com were big and people could make their own websites. These were all hosted on subdomains like moz.tripod.lycos.com, and because of that structure, search engines needed to treat subdomains as separate websites. For this reason subdomains are graded separately, change the flow of link juice, and can easily count as duplicate content.
Sub-domains are best utilized for information that is genuinely distinct. In the Moz example, Rand's personal blog could theoretically sit at rand.moz.com since it's a separate theme with different content; it would just lose out on the flow of value.
Once again Rob is right about using 301 redirects to move your subdomains into folders.
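Once those redirects are live, it's worth sanity-checking them before calling the migration done. A rough sketch of how you might do that in Python (the URLs are hypothetical, made up to match the question's example, and it assumes the third-party requests library):

```python
# Rough post-migration check (hypothetical URLs): confirm each old
# subdomain URL returns a 301 pointing at its new subfolder home.
# Assumes the third-party `requests` library (pip install requests).
import requests

REDIRECT_MAP = {
    "http://location.business.com/": "https://www.business.com/location/",
    "http://location.parent.business.com/": "https://www.business.com/parent/location/",
}

for old_url, expected in REDIRECT_MAP.items():
    # allow_redirects=False so we inspect the redirect itself,
    # not the page it eventually lands on
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    status = resp.status_code
    target = resp.headers.get("Location", "")
    ok = status == 301 and target == expected
    print(f"{'OK  ' if ok else 'FAIL'} {old_url} -> {status} {target}")
```

Anything that comes back as a 302, or chains through several hops, is leaking value you meant to preserve.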
Now, moving on to the more specific nature of your question ("Are fourth-level sub-domains any worse than third-level sub-domains?"), I am going to suggest that by asking such a question you've already lost a big chunk of the SEO/inbound marketing battle.
The question you are framing is "I know it isn't good, but is it any worse?" Even if it's not any worse, you already know it's not great, and you should be taking structural steps to improve the site's accessibility, user functionality, and SEO. If you find yourself asking "Is X any worse?", "How bad is Y?", or "Can I get away with Z?", then you should immediately stop pursuing that idea and find a different method.
In this case that method is sub-folders and a 301 migration, but remember: the framing of your questions and your overall directional strategy need to change to really drive home your campaigns!
-
HAHA. Great. Thanks for the props. Going 4th and 5th levels deep with sub-domains can also impede the user experience when someone wants to reach the site directly (typing those URLs manually is a pain!).
Thanks anyway; glad I could be of some help.
-
Again, thanks a lot. I totally agree. At the next client meeting I'll stress that not only do I feel strongly about the subfolder issue, but the good people at SimplifySEO feel the same :) And they know their ish. Or something.
-
Stay away as much as possible from 4th-, 5th-, and 6th-level sub-domains, although I have never seen it go beyond 5. I would really try to emphasize the value of re-tooling the domain structure for long-term benefits and linking. Keeping sub-domains running isolates link value and doesn't benefit the entire domain, making link building a much harder challenge. You are losing link 'juice' for every level of sub-domain used, as the value drops with each section the domain extends; hence sub-folders are the way to go (as you already know)...
Good luck with the client and site. Sounds like a tough call. All the best, and I hope it works out.
-
Hey Rob,
Thanks a lot for this. This is great advice and really well-written. And you're preaching to the choir. I also prefer subfolders, but it's just not in the cards for this client for the time being. As it stands, we're stuck with subdomains.
Any other thoughts re: fourth-level vs. third-level domains, folks?
-
Hey there!
You should try to stay away from sub-domains unless they really serve a purpose for the domain; in that case, different strategies can be put into place. As I don't know if that's the route you need to take, I am going to give you an alternate option :)
1. You could always use sub-folders, which in a nutshell would allow you to build links to the domain on many fronts and have them all count.
** NOTE: any links built to sub-domains don't flow link 'juice' to the rest of the site. Those links, whatever the reason they were built, will only pass value within that specific sub-domain.
2. What I would do is replicate and migrate the structure of the sub-domains into the root domain of the site (www.site.com/subfolder1/), then 301 and rel-canonical all the sub-domain pages and structure to the new locations (see the sketch below for what the redirects might look like). That way, all the link juice, value, etc. already established is kept intact, and you simply redirect all that value, trust, and back-links to pages within the domain.
This, to me, is the best option: it relocates the content, improves the domain structure by using sub-folders instead of sub-domains, and maintains the back-link profile already built (or existing) on the site/domain URL.
Other factors might give you reasons not to pursue this option, but I have always had success with it on large enterprise sites when restructuring the way a domain handles sub-domains.
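For what it's worth, the redirect side of option 2 is usually just a few rewrite rules. A minimal sketch, assuming an Apache server with mod_rewrite and using the hypothetical hostnames from the question:

```apache
# Minimal sketch (hypothetical hostnames): 301 the old third- and
# fourth-level subdomains into subfolders on the root domain.
# Lives in the .htaccess (or vhost config) that serves the subdomains;
# assumes mod_rewrite is enabled.
RewriteEngine On

# location.business.com/anything -> www.business.com/location/anything
RewriteCond %{HTTP_HOST} ^location\.business\.com$ [NC]
RewriteRule ^(.*)$ https://www.business.com/location/$1 [R=301,L]

# location.parent.business.com/anything -> www.business.com/parent/location/anything
RewriteCond %{HTTP_HOST} ^location\.parent\.business\.com$ [NC]
RewriteRule ^(.*)$ https://www.business.com/parent/location/$1 [R=301,L]

# Each migrated page should also declare its new home in its HTML, e.g.
# <link rel="canonical" href="https://www.business.com/location/page.html">
```

You can spot-check each rule with `curl -I http://location.business.com/` and look for the 301 status plus the Location header pointing at the new subfolder URL.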
Cheers!