The Great Subdomain vs. Subfolder Debate: what is the best answer?
-
Recently one of my clients was hesitant to move their new store locator pages to a subdomain. They have some SEO knowledge and cited the Whiteboard Friday article at https://moz.com/blog/subdomains-vs-subfolders-rel-canonical-vs-301-how-to-structure-links-optimally-for-seo-whiteboard-friday.
While it is very possible that Rand Fishkin has a valid point, I felt hesitant to let this be the final verdict. John Mueller from Google Webmaster Central says that Google is indifferent to subdomains vs. subfolders:
https://www.youtube.com/watch?v=9h1t5fs5VcI#t=50
Also, this SEO disagreed with Rand Fishkin's post about using subfolders instead of subdomains. He claims that Rand Fishkin ran only 3 experiments over 2 years, while he has run multiple subdomain vs. subfolder experiments over 10 years and observed no difference:
http://www.seo-theory.com/2015/02/06/subdomains-vs-subfolders-what-are-the-facts-on-rankings/
Here is another post, from Website Magazine. They too believe there are no SEO benefits to a subdomain vs. subfolder infrastructure; proper SEO and infrastructure are what matter most.
Again, Rand might be right, but I would rather give my client a recommendation based on an authoritative source, such as someone at Google like John Mueller.
Does anybody else have any thoughts and/or insight about this?
-
I think Mueller's main point may be that if you treat your subdomains separately from your main site, Google will treat them differently as well. For example, if you have three subdomains - www, blog and cloud - but all of them have different navigation, different CSS, limited interlinking and little keyword-theme commonality, Google will treat them as separate sites and you will suffer the dreaded subdomain issue.
BUT if you integrate the three subdomains well - same nav, same look & feel and lots of good contextual anchor-text interlinking - Google will treat them as the same site and the subdomain issue becomes moot.
Has anyone done any testing with those variables?
-
Yup! All the case studies I shared above (and plenty since) have demonstrated that you can boost traffic by moving from a subdomain to a subfolder.
-
Great thread! What about a situation where a blog already sits on a subdomain (bearing in mind it hasn't been driving a significant amount of traffic, as the site is fairly new)? My recommendation would be to move it to a subfolder - would you agree?
Thank you!
-
This is my new favorite quote... "I understand that Google's representatives have the authority of working at Google going for them, but I also believe they're wrong." (Rand Fishkin)
-
Greetings All,
So the debate goes on, and I personally think the case for subfolders versus subdomains certainly makes sense, especially from a linking, age and link-juice perspective. I do notice that most articles talk about the benefits of subfolders as they relate to blogs. In your past tests and studies, have you gained any insight into how this may affect ecommerce as it relates to countries?
We currently have each country on a subdomain, which we can run through Webmaster Tools and geotarget to its country; however, we are considering switching to subfolders based on all the articles we've read. This would in turn drive many more links to each new subfolder, assuming the majority of our links point to "www". It would seem to make sense to switch to subfolders, and it would be especially helpful as new subfolders are launched.
I was just wondering if the same argument can be made when it comes to ecommerce and country-specific sites. Each site (currently a different subdomain) uses a different language and currency, and the meta data and content are different for each. We launched "www" over 15 years ago, but in the past 2 years we have introduced various subdomains (i.e. new languages). As we enter new countries, we are considering switching everything over to subfolders (obviously 301'ing the subdomains over to the new subfolders so we don't lose all our existing links).
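To illustrate the kind of 301 we have in mind, here is a rough .htaccess-style sketch - the language codes, hostnames and folder names are placeholders, not our actual setup:

    # Sketch only: 301 each language subdomain to a matching subfolder on www,
    # keeping the rest of the URL intact. "de", "fr" and "es" are placeholder
    # language codes; assumes Apache with mod_rewrite.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(de|fr|es)\.example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/%1/$1 [R=301,L]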
I'm assuming, since your studies indicate as much, that you'd think this is a good idea; however, the discussion has not been so much about countries and ecommerce. Does anyone have any light or information they can share on the topic?
Thanks
-
Hi Rosemary - thankfully, I have data, not just opinions, to back up my arguments:
- In 2014, Moz moved our Beginner's Guide to SEO from guides.moz.com to moz.com itself. Rankings rose immediately, with no other changes. We ranked higher not only for "seo guide" (outranking Google themselves) but also for "beginners guide," a very broad phrase.
- Check out https://iwantmyname.com/blog/2015/01/seo-penalties-of-moving-our-blog-to-a-subdomain.html - it goes into very clear detail about how what Google says about subdomains doesn't match up with reality
- Check out some additional great comments in this thread, including a number from site owners who moved away from subdomains and saw ranking benefits, or who moved to them and saw ranking losses: https://inbound.org/discuss/it-s-2014-what-s-the-latest-thinking-on-sub-domains-vs-sub-directories
- There's another good thread (with some more examples) here: https://inbound.org/blog/the-sub-domain-vs-sub-directory-seo-debate-explained-in-one-flow-chart
Ultimately, it's up to you. I understand that Google's representatives have the authority of working at Google going for them, but I also believe they're wrong. It could be that there's no specific element that penalizes subdomains, and maybe they're viewed the same in Google's thinking, but there are real ways in which subdomains inherit authority that stays unique to those subdomains and IS NOT passed between multiple subdomains evenly or equally. I have no horse in this race other than wanting to help you and other site owners avoid ranking losses - and we've just seen too many losses when moving to a subdomain, and too many gains when moving to a subfolder, not to be wary.
-
Hi,
I've not seen any comment from Googlers regarding this debate. I realize I'm keeping this in the Moz-sphere, which isn't quite what you're looking for, but this quote is from Moz's domain setup guide:
"Since search engines keep different metrics for domains than they do subdomains, it is recommended that webmasters place link-worthy content like blogs in subfolders rather than subdomains. (i.e. www.example.com/blog/ rather than blog.example.com) The notable exceptions to this are language-specific websites. (i.e., en.example.com for the English version of the website)."
I think that quote is pretty compelling towards the subdirectory side of this quandary. I also recommend checking out the comments on the Whiteboard Friday link you posted; there is plenty of evidence there as well.
Unfortunately, this debate will probably go on forever until we get definitive word from Google.
-
Can you share some details why you want to "move" the store locator to a subdomain? That makes me think it is already operational in a subfolder at the moment. In general, I would recommend not moving content unless there is a very good reason for it.
Related Questions
-
How to create a smooth blog migration from a subdomain to a subfolder on the main domain?
Hi mozzers, We have decided to migrate the blog subdomain to a subfolder on the main domain (blog.example.com to example.com/blog). To do this the most effective way and avoid impacting SEO negatively, I believe I have to follow this checklist:
1. Create a list of all 301 redirects from blog.example.com/post-1 to example.com/blog/post-1
2. Make sure title tags remain the same on the main domain
3. Make sure internal links remain the same
Is there something else I am missing? Any other best practices? I also would like to have all blog posts as AMPs. Any recommendations on whether this is something we should do, since we are not a media site? Any other tips on successfully implementing those types of pages? Thanks
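For item 1 of the checklist, a catch-all rule may be enough - just a sketch, assuming Apache and that every post keeps its slug (otherwise a one-to-one redirect map is needed):

    # Sketch only: 301 every URL on the old blog subdomain to the same slug
    # under /blog/ on the main domain. Assumes Apache with mod_rewrite and
    # that slugs do not change during the migration.
    <VirtualHost *:80>
        ServerName blog.example.com
        RewriteEngine On
        RewriteRule ^/?(.*)$ https://www.example.com/blog/$1 [R=301,L]
    </VirtualHost>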
Intermediate & Advanced SEO | Ty1986 -
What is the best way to find related forums in your industry?
Hi Guys, Just wondering what is the best way to find forums in your industry?
Intermediate & Advanced SEO | edward-may -
Slug best practices?
Hello, my team is trying to understand how to best construct slugs. We understand they need to be concise and easily understandable, but there seem to be vast differences between the three examples below. Are there reasons why one might be better than the others? http://www.washingtonpost.com/news/morning-mix/wp/2014/06/20/bad-boys-yum-yum-violent-criminal-or-not-this-mans-mugshot-is-heating-up-the-web/ http://hollywoodlife.com/2014/06/20/jeremy-meeks-sexy-mug-shot-felon-viral/ http://www.tmz.com/2014/06/19/mugshot-eyes-felon-sexy/
Intermediate & Advanced SEO | TheaterMania -
Subdomains vs directories on existing website with good search traffic
Hello everyone, I operate a website called Icy Veins (www.icy-veins.com), which gives gaming advice for World of Warcraft and Hearthstone, two titles from Blizzard Entertainment. Up until recently, we had articles for both games on the main subdomain (www.icy-veins.com), without a directory structure. The articles for World of Warcraft ended in -wow and those for Hearthstone ended in -hearthstone, and that was it. We are planning to cover more games from Blizzard Entertainment soon, so we hired an SEO consultant to figure out whether we should use directories (www.icy-veins.com/wow/, www.icy-veins.com/hearthstone/, etc.) or subdomains (www.icy-veins.com, wow.icy-veins.com, hearthstone.icy-veins.com). For a number of reasons, the consultant was adamant that subdomains were the way to go. So, I implemented subdomains, with 301-redirects from all the old URLs to the new ones, and after 2 weeks the amount of search traffic we get has been slowly decreasing as the new URLs get indexed. Now, we are getting about 20%-25% less search traffic. For example, the week before the subdomains went live we received 900,000 visits from search engines (11-17 May); this week, we only received 700,000 visits. All our new URLs are indexed, but they rank slightly lower than the old URLs used to, so I was wondering if this is something that was to be expected and will improve in time, or if I should just go back to directories. Thank you in advance.
Intermediate & Advanced SEO | damienthivolle -
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys, We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similar to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: this is the page where the user can use various filters to narrow the vehicle listings to find the vehicle they want.
2. Vehicle Details Pages: this is the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day.
We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time based on dealer inventory, and they don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results: Example Google query.
We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right. Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt Advantages:
- Super easy to implement
- Conserves crawl budget for large sites
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt Disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would mean nofollowing the 10-25 internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?)
Noindex Advantages:
- Does prevent vehicle details pages from being indexed
- Allows ALL pages to be crawled (advantage?)
Noindex Disadvantages:
- Difficult to implement (vehicle details pages are served via Ajax, so they have no <head> of their own in which to place a meta robots tag). The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex header based on querystring variables, similar to this stackoverflow solution (see the rough sketch at the end of this question). This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it is blocked by robots.txt.
Hash (#) URL Advantages:
- By using hash (#) URLs for the links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with Javascript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like PageRank sculpting (?)
- Does not require complex Apache stuff
Hash (#) URL Disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since they can't crawl/follow them?
Initially, we implemented robots.txt - the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.
If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO.
My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links structured like this.
Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
Intermediate & Advanced SEO | browndoginteractive
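To make the X-Robots-Tag option concrete, here is a rough Apache sketch - "vehicle_id" is just a placeholder for whatever querystring parameter the plugin actually uses, not a known name from the plugin:

    # Sketch only: add a noindex header to responses for the Ajax vehicle
    # details endpoint, identified here by a hypothetical "vehicle_id"
    # querystring parameter. Requires mod_rewrite and mod_headers.
    RewriteEngine On
    RewriteCond %{QUERY_STRING} (^|&)vehicle_id=\d+ [NC]
    RewriteRule ^ - [E=VDP_NOINDEX:1]
    Header set X-Robots-Tag "noindex, follow" env=VDP_NOINDEX
-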
How to add subdomains to webmaster tools?
Can anyone help with how I add a subdomain to Webmaster Tools? Also, do I need to create a separate sitemap for each subdomain? Any help appreciated!
Intermediate & Advanced SEO | SamCUK -
PDFs and images in Sub folder or subdomain?
What would you recommend as best practice? Our ecommerce site has a lot of PDFs supporting the product pages. Currently they are kept on a subdomain, and so are all images. Would it be better to keep them all in a subfolder? I've read about blogs hosted in a subfolder performing better than on a subdomain, but what about PDFs and images? Thoughts?
Intermediate & Advanced SEO | Bio-RadAbs -
800 Number vs. Local Phone
I have a client with multiple locations throughout the US. They are currently using different 800 numbers on their site for their different locations. As they try to optimize their local presence by submitting to local directories, we are trying to determine two things:
1. Does having a local number reroute to an 800 number devalue the significance of it being a local number? (I've never heard of this, but someone told them it did.)
2. Locality and consistency are important. Assuming they can't remove the 800 numbers from the site, are they better off keeping the 800 numbers on their site and using local numbers everywhere else online, OR just using the 800 numbers for all of their local listings?
Intermediate & Advanced SEO | Caleone