Subdomain vs Subdirectory - does the content make a difference?
-
So I've read through all of the answers that suggest using a subdirectory is the best way to approach this - you rank more quickly and have all of your content on one site. BUT what if you're looking to move into a totally new market that your current site/content isn't in any way relevant to?
Some examples: supermarkets such as Tesco seem to use a mix of methods. http://www.tesco.com/groceries/ and http://www.tesco.com/bank/ are subdirectories, http://www.clothingattesco.com/ is a separate domain, and the bank pages link out from the main site to http://www.tescobank.com/. Sainsbury's (http://www.sainsburys.co.uk/), by contrast, uses subdomains, with their grocery offering, bank, clothes, phones and so on each split into its own subdomain.
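To make that "mix of methods" concrete, here is a small sketch (my own helper, nothing official) that classifies how a product URL relates to a brand's root domain, using the Tesco examples above:

```python
# Hypothetical helper: classify how a product URL relates to a brand's
# root domain -- subdirectory, subdomain, or a separate domain entirely.
# The function name and logic are an illustrative sketch, not a standard.
from urllib.parse import urlparse

def classify(url: str, root: str) -> str:
    parsed = urlparse(url)
    host = parsed.hostname or ""
    if host == root or host == "www." + root:
        # Same host: a non-trivial path means a subdirectory section.
        return "subdirectory" if parsed.path.strip("/") else "root"
    if host.endswith("." + root):
        return "subdomain"
    return "separate domain"

print(classify("http://www.tesco.com/groceries/", "tesco.com"))  # subdirectory
print(classify("http://www.clothingattesco.com/", "tesco.com"))  # separate domain
print(classify("http://bank.example.com/", "example.com"))       # subdomain
```

This is only a rough heuristic (it treats any non-www label as a subdomain), but it captures the three options the thread is weighing.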
If you have a product that is totally new to your brand and different from everything on your current site, does that change the answer to subdirectory vs subdomain?
Would be great to hear your expert opinions on this.
Thanks
-
On the subdomain vs domain issue:
From an SEO perspective a subdomain is less favorable. From a user perspective: try explaining the domain zoekmachinemarketing.stramark.nl to my father. How do you explain that there should not be a www. in front of it? How do you explain that it is not just stramark.nl he has to go to, but this particular subdomain, because it carries a different offer?
I think young people can adapt somewhat better, but they are very used to not having to think: they just search from the address bar and click the top result.
-
I agree with what John Cross said here: multiple domains mean more work. If there is a business case to justify that extra work, the decision is easier. If there isn't, then from an SEO standpoint you should probably keep the new content on the same domain so it starts ranking more quickly.
Along with SEO considerations, though, there are a few other ways to break down this question...
First, what are the user expectations? Yes, the products are different and not highly related, but are the customers different? In the Tesco example, would people who are interested in groceries also be interested in banking? Or, put another way, would people who are interested in groceries (but not in banking) be offended to see that this company also offers banking services? If the users are interconnected, or at minimum not put off by the variety of products, then why not have everything on one domain? That way you get the strong SEO benefit of using sub-directories. This isn't always a cheap investment, though, as it requires a strong architecture to keep the directories and content types/voices distinct, but it is totally doable and a good solution from an SEO standpoint.
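On that "strong architecture" point: one common technical pattern for keeping distinct products as subdirectories of a single domain is to reverse-proxy each product's application into its own path. A minimal sketch, assuming nginx; the server name and backend addresses are hypothetical, not taken from the thread:

```nginx
# Hypothetical sketch: one public domain, each product served by its
# own backend application but exposed as a subdirectory.
server {
    listen 80;
    server_name www.example-brand.com;

    # Main site
    location / {
        proxy_pass http://127.0.0.1:8000;
    }

    # New product line lives at /newproduct/ on the same domain,
    # even though it runs as a separate application internally.
    location /newproduct/ {
        proxy_pass http://127.0.0.1:8100/;
        proxy_set_header Host $host;
    }
}
```

This kind of setup is what lets a "new market" product inherit the main domain's authority while remaining a separate system behind the scenes.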
Second, I'd look at this from a brand perspective. Is this all the same company delivering these goods? Is it all Tesco or Sainsbury's? If it is the same brand name, then why not have everything live on one authoritative domain name (assuming you aren't going to chase away customers by showing the breadth of products offered)? Google is an example of this: look at the wide variety of services they offer (mail, analytics, Drive, G+, search, etc.). It is all Google, even though they offer a wide range of products to a diverse range of customers. Now, if New Product A is a different brand and a really different thing from anything else being done by the company (in Google's case, Android), then that likely justifies a separate domain and a larger business investment (not just for SEO, but for design and other types of marketing too).
Finally, I think you also need to look at this technically. Chances are that Tesco Bank has to live on a different domain simply because of security considerations. Sometimes technology limitations have to dictate what we do with SEO. If those constraints are great enough, then we may have to do the work to create two distinct domains and get both earning rankings and traffic. In that case, the business/technical needs justify the work required.
Hope that helps!
-
To optimize SEO outcomes, the short answer is: use your current domain.
However, a counter-argument could be that you own an exact-match domain for your keywords, which might push you toward a new URL. Or you have a big marketing budget, or you just want a clean start because of Pigeon or Panda issues plaguing the current site.
That said, the Tesco and Sainsbury's examples have one thing in common: big wallets. They would have planned multi-million-dollar marketing campaigns around the new products/URLs, so they can drive backlinks. If the company is a monster with a massive marketing spend for the launch, you may decide a new brand and URL are in order.
I am old school. A brand new domain starting from scratch, with no history and no backlinks, is a far harder task, but certainly not unachievable; still, I would steer clear of it. Personally I believe you should limit new domains because, practically, a second domain roughly doubles your required SEO output: you have to review two lots of GA and Webmaster Tools data each day, so just to stay level you need to work extra hours each week.
These are my views, but there is plenty of info on Moz heading the other way.