Subdomain vs Subdirectory - does the content make a difference?
-
So I've read through all of the answers suggesting that a subdirectory is the best approach - you rank more quickly and keep all of your content on one site. BUT what if you're looking to move into a totally new market that your current site/content isn't in any way relevant to?
Some examples are supermarkets such as Tesco, who seem to use a mix of methods - http://www.tesco.com/groceries/, http://www.clothingattesco.com/, and http://www.tesco.com/bank/, which links out from their main site to http://www.tescobank.com/ - and Sainsbury's (http://www.sainsburys.co.uk/), who use subdomains: their grocery offering, their bank offering, clothes, phones etc. are each split into a subdomain.
If you have a product that is totally new to your Brand and different from all the products on your current site, does this change the answer to subdirectory vs subdomain?
Would be great to hear your expert opinions on this.
Thanks
-
for the subdomain to domain issue:
From an SEO perspective a subdomain is less favorable. From a user perspective: try explaining the domain zoekmachinemarketing.stramark.nl to my father - how are you going to explain that there should not be a www. in front of it? How are you going to explain that it is not just stramark he has to go to, but that specific subdomain, because it has a different offer?
I think young people can adapt somewhat better, but they are very used to not having to think. They just search from the address bar and need the top result.
-
I agree with what John Cross said here - multiple domains mean more work. If there is a business case to justify that increase in work, then the decision is easier. If there isn't enough of a business case to justify the work, then from an SEO standpoint you should probably keep it on the same domain to get the new content ranking more quickly.
Along with SEO considerations, though, there are a few other ways to break down this question...
First, what are the user expectations? Yes, the products are different and not highly related, but are the customers different? In the Tesco example, would people who are interested in groceries also be interested in banking? Or, put another way, would people who are interested in groceries (but not in banking) be put off to see that this company also offers banking services? If the users are interconnected or are (at minimum) not put off by the variety of products, then why not have everything on one domain? That way you get the strong SEO benefit of using sub-directories. This isn't always a cheap investment, though, as it requires a strong architecture to keep the directories and content types/voices distinct, but it is totally doable and a good solution from an SEO standpoint.
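To make the structural difference concrete, here's a minimal sketch (using the hypothetical example.com rather than any real retailer's URLs) of why sub-directories consolidate everything onto one host, while subdomains split the same sections across distinct hosts that search engines can treat separately:

```python
from urllib.parse import urlparse

# Hypothetical URLs for the two structures discussed above.
subdirectory_urls = [
    "https://www.example.com/groceries/",
    "https://www.example.com/bank/",
]
subdomain_urls = [
    "https://groceries.example.com/",
    "https://bank.example.com/",
]

# Sub-directories share one host, so links and authority accrue to a single domain.
hosts = {urlparse(u).netloc for u in subdirectory_urls}
print(hosts)  # a single host

# Subdomains each resolve to a different host.
subdomain_hosts = {urlparse(u).netloc for u in subdomain_urls}
print(subdomain_hosts)  # two distinct hosts
```

The same pages, organised two ways: one host accumulating all the link equity versus two hosts each earning their own.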
Second, I'd look at this from a brand perspective. Is it all the same company delivering these goods? Is it all Tesco or Sainsbury's? If it is the same brand name, then why not have everything live on one authoritative domain name (assuming you aren't going to chase away customers by showing the breadth of products offered)? Google is an example of this - look at the wide variety of services they offer: mail, analytics, Drive, G+, search, etc. It is all Google, even though they offer a wide range of products to a diverse range of customers. Now, if New Product A is a different brand and a really different thing from anything else the company does (in Google's case, Android), then that likely justifies a separate domain and a larger business investment (not just for SEO, but for design and other types of marketing too).
Finally, I think you do need to look at this technically. Chances are that Tesco Bank has to live on a different domain simply because of security considerations. Sometimes the technology limitations have to dictate what we do with SEO. If those are great enough, then we may have to do the work to create two distinct domains and get both of them earning rankings/traffic. In that case, the business/technical needs justify the work required.
Hope that helps!
-
To optimize SEO outcomes, the short answer would be to use your current domain.
However, a counter-argument could be that you own a domain exactly matching your keywords, which may push you toward a new URL. A big marketing budget, or maybe you just want a clean start because of Pigeon or Panda issues plaguing the current site.
That said, using Tesco and Sainsbury's as examples, both have one thing in common: big wallets. They would have planned multi-million dollar marketing campaigns around the new products/URLs, so they can drive backlinks. If the company is a monster with a massive marketing spend for the launch, you may well decide a new brand and URL are in order.
I am old school - starting from scratch on a brand new domain, with no history and no backlinks, is a far harder task, though certainly not unachievable. I would steer clear of it. Personally I believe you should try to limit new domains, as in practice it doubles your required SEO output in this case. You have to review two lots of GA and Webmaster Tools data each day... so just to keep level you need to work extra hours each week with a new domain.
These are my views, but there is plenty of info on Moz heading the other way.