Subdomain cannibalization
-
Hi,
I am doing the SEO for a webshop, which has a lot of linking and related websites on the same root domain. So the structure is for example:
Root domain: example.com
Shop: shop.example.com
Linking websites to shop: courses.example.com, software.example.com, ...
Do I have to check which keywords these linking websites already rank for and choose different keywords for the category and product pages on the webshop? The problem is that the main keywords for the webshop's category pages are largely the same as those for the other subdomains.
The intention is for some visitors to land directly on the webshop instead of reaching it via the linking websites.
Thanks.
-
Hello Mat,
I don't think I'm seeing the same SERPs as you. Is there any way you could give me an example of one of these subdomains?
And yes, you're absolutely right that the same problem of keyword cannibalization would apply to subdirectories as well.
If it's the woltersk....lu domain I am getting non-secure warnings from Firefox when I try to access it.
How many different subdomains are there / will there be? Is it just shop.domain.lu and www.domain.lu, or are there others? I didn't see any for "courses." or "software." in the SERP example you provided with the link.
If it's just one, I think that's manageable. For example, maybe www. could focus on informational queries (e.g. "JavaScript course") and shop. could focus on transactional ones (e.g. "buy Acme JavaScript course"). Or one could focus on reviews, comparisons, and long-tail queries while the other focuses on short-tail queries. Without knowing more about the domains and your business, it's difficult for me to say.
If you have three or four subdomains all going after the same keywords, that's definitely a problem, and I don't think you can avoid cannibalization. At that point it would be best to choose the strongest domain or subdomain and focus on ranking it, rather than spreading your efforts across several.
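If it helps, auditing that overlap can be scripted. A minimal sketch (hypothetical keywords and hosts; assumes you can export keyword/ranking-URL pairs from your rank tracker or Search Console):

```python
from collections import defaultdict

# Hypothetical ranking export: (keyword, host of the ranking URL) pairs,
# e.g. pulled from a rank tracker or a Search Console export.
rankings = [
    ("javascript course", "www.example.com"),
    ("javascript course", "shop.example.com"),
    ("buy acme javascript course", "shop.example.com"),
    ("python course", "courses.example.com"),
    ("python course", "shop.example.com"),
]

def find_cannibalized_keywords(rows):
    """Return keywords for which more than one subdomain ranks."""
    hosts_by_keyword = defaultdict(set)
    for keyword, host in rows:
        hosts_by_keyword[keyword].add(host)
    return {kw: sorted(hosts)
            for kw, hosts in hosts_by_keyword.items()
            if len(hosts) > 1}

for kw, hosts in find_cannibalized_keywords(rankings).items():
    print(f"{kw}: {', '.join(hosts)}")
```

Any keyword that comes back with two or more hosts is a candidate for consolidation or re-targeting.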
-
Thanks for your answer Everett.
The structure was indeed created some years ago, when ranking with different subdomains wasn't really a problem. There is naturally overlap between the webshop subdomain and the other subdomains: the subdomains dive deeper into specific parts of the business (tax, legal, formations, ...), but the webshop sells all of the products from those subdomains.
However, for some search terms, some of the subdomains all rank on the first page. For example: https://www.google.com/search?q=successierekenaar&oq=successierekenaar&aqs=chrome.0.69i59j0.3257j0j7&sourceid=chrome&ie=UTF-8
As you can see, the root domain, two subdomains, and a link to an app take the first four positions in the SERP.
The key question is: if there is a viable search term to rank for, but one of the subdomains already ranks for it, can it still be targeted? Otherwise it won't be easy to find a unique search term with high enough search volume for each product, since this is a market with very specific products.
On the other hand, if subdirectories were used, it would come down to the same rule: never try to rank two pages for the same search term.
-
Also, don't forget to use Google Search Console's "Property Set" feature. That said, it looks like they're about to start auto-creating property sets by aggregating subdomains anyway: https://www.seroundtable.com/google-search-console-domain-property-26645.html
-
The short answer to your question is: Yes, you should know what keywords each of your subdomains rank for and should adjust strategy accordingly.
The long answer is that I want to see this website, because it doesn't sound like something I'd recommend doing in the first place. It used to be that subdomains were treated completely separately from the parent domain, and you could, theoretically, take up the entire first page of results with your subdomains. Content mills like About.com took this to the extreme, and Google responded, so you don't tend to see that happen much anymore. As I understand it, Google also attempts to determine whether subdomains belong to the same "site" or to multiple unrelated sites (such as site.blogspot.com subdomains) and treats them accordingly.
These days, the general consensus is that you should be using subdirectories/folders instead of subdomains for a variety of reasons, unless the subdomain is for a different site, or something you don't really need to have indexed, like a closed app.
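If you do end up consolidating subdomains into subdirectories, every old URL needs a 301 redirect to its new location. A minimal sketch of generating that redirect map (hypothetical domains; assumes each subdomain maps to a same-named folder on the root host):

```python
from urllib.parse import urlsplit, urlunsplit

ROOT_HOST = "www.example.com"  # hypothetical consolidated host

def subdomain_to_subfolder(url):
    """Map e.g. shop.example.com/cart -> www.example.com/shop/cart,
    the target for a 301 redirect when folding subdomains into folders."""
    parts = urlsplit(url)
    label = parts.netloc.split(".")[0]      # "shop" from "shop.example.com"
    new_path = f"/{label}{parts.path}"
    return urlunsplit((parts.scheme, ROOT_HOST, new_path,
                       parts.query, parts.fragment))

print(subdomain_to_subfolder("https://shop.example.com/cart?id=1"))
# -> https://www.example.com/shop/cart?id=1
```

A mapping like this, run over a full URL export, would feed the server's 301 rules so existing links keep passing value to the new locations.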
Related Questions
-
No Control Over Subdomains - What Will the Effect Be?
Hello all, I work for a university, and my small team is responsible for the digital marketing, website, etc. We recently had a big initiative on SEO and generating traffic to our website. The issue I am having is that my department only "owns" the www subdomain. There are lots of other subdomains out there. For example, a specific department can have its own subdomain at department.domain.com, and students can have their own webpages at students.domain.com, etc. I know the possibilities of domain cannibalization, but has anyone run into long-term problems with a similar situation, or had success in altering the views of a large organization? If I do get the opportunity to help some of these other subdomains, what is best for our overall domain authority? Should the focus be on removing content similar to the www subdomain, or on cleaning up errors? Some of these subdomains have hundreds of 4XX errors.
Intermediate & Advanced SEO | Jeff_Bender
-
How Can A Subdomain Outperform A Root Domain?
Hi guys! I have a rather strange SEO question. It may not be that strange at all, actually. If a site has a shopping cart on a subdomain through a third-party shopping cart provider, can the third party transfer value to the subdomain, causing it to have greater domain authority than the main site or root domain? Another question: this subdomain, up until yesterday, blocked Google from crawling it with robots.txt, yet it has a much higher domain authority than the root domain. The root domain has a really low domain authority, despite not blocking Google from crawling it. How is this possible? I hope these questions make sense. I am a little stumped and trying to figure out why the subdomain is outperforming the main site despite being hidden from search, if that's even the case. Please let me know if I have it all wrong.
Intermediate & Advanced SEO | Prae
-
SERP cannibalization
Hi Moz Community, recently I've been seeing multiple pages from my eCommerce site pop up in the SERPs for a couple of queries. Usually I would count this as a good thing, but since the two pages that pop up are so similar, I'm starting to wonder if we would rank better with just one page. My example is the query "birthday gifts". Both of the URLs below show up in the search results, one after the other, on the first page. The URL on top is our family page and the one below it is our subcat page; you can find both in the top nav of our site.
www.uncommongoods.com/gifts/birthday-gifts/birthday-gifts (family)
www.uncommongoods.com/gifts/birthday-gifts (subcat)
Both of these pages have different PAs, and the subcat page that currently lives in our site nav is actually www.uncommongoods.com/gifts/birthday-gifts?view=all. That URL doesn't show up in the SERPs and is rel=canonicaled to the subcat page without the parameter. We use it in the nav because we think it's a better user experience than the actual subcat page. If we were to condense all three pages into one, would we rank higher? Any thoughts here would be appreciated. Thanks
Intermediate & Advanced SEO | znotes
-
Which one is better, a brand new subdomain or a second-level directory with PR 4
Hey, all SEOers! May I ask you a question about subdomains and second-level directories? Our website is about software, so we write many posts about how to use the software to solve problems, and then use these posts to rank (we don't rank the software product pages themselves). All the posts are listed under a second-level directory, like www.xxx.com/support/. But now our boss wants to move all the posts to a subdomain, like support.xxx.com. By the way, the second-level directory is a page with PR 4, and the subdomain is brand new; it doesn't even exist yet. So here is my question: should we move all the posts to support.xxx.com? If we do, will this affect the speed of Google indexing, and will we need more time to build links for both xxx.com and support.xxx.com? Any answer will be appreciated, and thank you in advance!
Intermediate & Advanced SEO | Vicky2885
-
Best method for blocking a subdomain with duplicated content
Hello Moz Community, hoping somebody can assist. We have a subdomain, used by our CMS, which is being indexed by Google:
http://www.naturalworldsafaris.com/
https://admin.naturalworldsafaris.com/
The pages are identical, so we can't add a noindex or nofollow. I have both set up as separate properties in Webmaster Tools. I understand the best method would be to update robots.txt with a disallow for the subdomain, but robots.txt is only accessible on the main domain: http://www.naturalworldsafaris.com/robots.txt. Will this work if we add the subdomain exclusion to that file? It won't be accessible at https://admin.naturalworldsafaris.com/robots.txt (where we can't create a file), and therefore won't be seen within that specific Webmaster Tools property. I've also asked the developer to add password protection to the subdomain, but this does not look possible. What approach would you recommend?
Intermediate & Advanced SEO | KateWaite
-
Blog On Subdomain - Do backlinks to the blog posts on Subdomain count as links for main site?
I want to put a blog on my site. The IT department is asking that I use a subdomain (myblog.mysite.com) instead of a subfolder (mysite.com/myblog). I am worried because it was my understanding that any links I get to blog posts on a subdomain will not count toward the main site (search engines would treat it almost as a separate website). The main purpose of this blog is to attract backlinks, which is why I prefer the subfolder location. Can anyone tell me if I am thinking about this right? Another solution I am being offered is to use a reverse proxy. Thoughts? Thank you for your time.
Intermediate & Advanced SEO | ecerbone
-
[Need advice!] A particular question about a subdomain to subfolder switch
Hello Moz Community! I was really hoping to get your help on an issue that has been bothering me for a while now. I know there is a lot out there on this topic, but I couldn't find a good answer to my particular question. We are running several web applications that are similar but also differ from each other. Right now each one has its own subdomain (mainly for technical reasons), like this: webapp1.rootdomain.com, webapp2.rootdomain.com, etc. Our root domain currently 301s to webapp1.rootdomain.com. Now we are thinking about making two changes: (1) switching to subfolders, like rootdomain.com/webapp1, rootdomain.com/webapp2, etc.; and (2) turning the root domain into a landing page (listing all the apps) and removing the 301 to webapp1. We want to make these changes mainly for SEO reasons. I know the advantages of subdomains vs. subfolders are not clear-cut, but we think this could be the right way to strengthen the root domain and pass more juice to the different apps. The problem is that we had a bad experience when we first moved our first web app from rootdomain.com to a subdomain (webapp1.rootdomain.com) to make it consistent with the other apps. Our traffic dropped a lot, and it took us six weeks to get back to the same level as before. Maybe it was the 301 not passing full juice, or maybe it was the switch to the subdomain; we are not sure. So I guess my questions are: is moving web apps to subfolders the right way to pass more juice from the root to them? Will it again bring a huge drop in traffic once we make the change? Is it worth that risk or initial drop because it will pay off in the future? Thanks a lot in advance! Your answers would help me a lot.
Intermediate & Advanced SEO | ummaterial
-
Create new subdomain or new site for new Niche Product?
We have an existing large site with strong, relevant traffic, including excellent SEO traffic. The company wants to launch a new business offering, specifically targeted at the "small business" segment. Because the "small business" customer is substantially different from the traditional "large corporation" customer, the company has decided to create a completely independent microsite for the "small business" market. Purely from a Marketing and Communications standpoint, this makes sense. From an SEO perspective, we have 2 options: Create the new "small business" microsite on a subdomain of the existing site, and benefit from the strong domain authority and trust of the existing site. Build the microsite on a separate domain with exact primary keyword match in the domain name. My sense is that option #1 is by far the better option in the short and long run. Am I correct? Thanks in advance!
Intermediate & Advanced SEO | axelk