"Fourth-level" subdomains. Any negative impact compared with regular "third-level" subdomains?
-
Hey moz
New client has a site that uses:
subdomains ("third-level" stuff like location.business.com), and
"fourth-level" subdomains (location.parent.business.com).
Are these fourth-level addresses at risk of being treated differently than the other subdomains? Screaming Frog, for example, doesn't return these fourth-level addresses when doing a crawl for business.com except in the External tab. But maybe I'm just configuring the crawls incorrectly.
These addresses rank, but I'm worried that we're losing some link juice along the way. Any thoughts would be appreciated!
-
If you check out Rand's Intro to SEO SlideShare (http://www.slideshare.net/randfish/introduction-to-seo-5003433), slides 46 and 47 talk about URL structure and, specifically, sub-domains.
As Rob said, you do want to use sub-folder structures and avoid sub-domains. Hopefully you are old enough to remember when websites like lycos.com were big and people could make their own websites. These were all hosted on subdomains like moz.tripod.lycos.com, and because of that structure search engines needed to treat subdomains as separate websites. For this reason they are graded separately, they change the flow of link juice, and they can easily count as duplicate content.
Sub-domains are best used for content that is genuinely distinct. In the Moz example, Rand's personal blog could theoretically sit at rand.moz.com since it's a separate theme with different content, etc.; it would just lose out on the flow of value.
Once again Rob is right about using 301 redirects to move your subdomains into folders.
Now, moving on to the more specific part of your question - "Are fourth-level sub-domains any worse than third-level sub-domains?" - I am going to suggest that by asking such a question you've already lost a big chunk of the SEO/inbound marketing battle.
The question you are framing is "I know it isn't good - but is it any worse?" Well, even if it's not any worse, you already know that it's not great, and you should be taking structural steps to build on a site's accessibility, user functionality and its SEO. If you find yourself asking "Is X any worse?", "How bad is Y?" or "Can I get away with Z?" then you should immediately stop pursuing that idea and try to find a different method.
In this case that method is sub-folders and a 301 migration, but remember that the framing of your questions and your overall directional strategy need to change to really drive your campaigns home!
-
Haha, great. Thanks for the props. Going 4th and 5th levels deep with sub-domains can also impede the user experience when someone wants to reach the site directly (typing it manually is a pain!).
Thanks anyways, glad I could be of some help.
-
Again - thanks a lot. I totally agree. At the next client meeting I'll stress that not only do I feel strongly about the subfolder issue, but the good people at SimplifySEO feel the same. :) And they know their ish. Or something.
-
Stay away as much as possible from 4th-, 5th- and 6th-level sub-domains, although I have never seen it go beyond 5. I would really try to emphasize the value of re-tooling the domain structure for long-term benefits and linking. Keeping sub-domains running isolates link value and doesn't benefit the entire domain - thus making link building a much harder challenge. You are losing link 'juice' for every level of sub-domain used, as the value drops with each extra section of the domain - hence the reason sub-folders are the way to go (as you already know)...
Good luck with the client and site. Sounds like a tough call. All the best and I hope it works out.
-
Hey Rob,
Thanks a lot for this. This is great advice and really well-written. And you're preaching to the choir. I also prefer subfolders, but it's just not in the cards for this client for the time being. As it stands, we're stuck with subdomains.
Any other thoughts re: fourth-level vs. third-level subdomains, folks?
-
Hey there!
You should try to stay away from sub-domains unless they really serve a purpose for the domain - then different strategies can be put into place. As I don't know if that's the route you need to take, I'll go ahead and give you an alternative option :).
1. You could always use sub-folders, which in a nutshell would allow you to build links to the domain on many fronts and have them all count.
** NOTE: links built to sub-domains don't flow link 'juice' into the rest of the site. Those links, whatever the reason they were built, will only pass value within that specific sub-domain.
2. What I would do is replicate and migrate the structure of the sub-domains into the root domain of the site (e.g. www.site.com/subfolder1/), then 301-redirect and rel-canonical all the sub-domain pages and structure to the new locations (see the sketch below). That way, all the link juice, value, etc. already established is kept intact, and you simply redirect that value, trust and back-links to pages within the domain.
This, to me, is the best option to relocate the content, improve the domain structure by using sub-folders instead of sub-domains, and maintain the back-link profile already built up (or existing) on the site/domain.
Other factors might give you reasons not to pursue this option, but I have always had success with it on large enterprise sites when restructuring the way a domain handles sub-domains.
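To make the redirect side concrete, here is a minimal sketch - assuming an nginx front end and the hypothetical location.parent.business.com to www.business.com/parent/location/ mapping from the question; Apache rewrite rules or a CMS redirect manager can do the same job:

```nginx
# Hypothetical sketch: permanently redirect every URL on the
# fourth-level sub-domain to the matching subfolder path on the
# root domain, preserving the requested path and query string.
server {
    listen 80;
    server_name location.parent.business.com;

    # $request_uri carries the original path and query string,
    # so /some/page?x=1 lands at /parent/location/some/page?x=1.
    return 301 https://www.business.com/parent/location$request_uri;
}
```

Each sub-domain being migrated gets its own block like this, and any sub-domain page that has to stay live for a while can carry a rel-canonical tag pointing at its new subfolder URL.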
Cheers!