Is our Third Party Subdomain hurting our SERPs?
-
Hello!
Our Moz report for the root domain godelta.com displays 696 high-priority issues that we cannot control, all caused by a third-party subdomain: promotionalproducts.godelta.com
We don't have any control over the SEO on the third-party website. Our blog posts link to the third-party subdomain from our blog subdomain, blog.godelta.com
Is the third-party subdomain affecting our SERPs, and should we replace it with its own domain name?
Hopefully we can clear this up and end the debate with our internal team and our HubSpot account manager.
David
-
Thank you for the answer. We have moved forward with removing the subdomain and will keep you posted on our SERP progress.
Thanks
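When consolidating a subdomain like this, it helps to plan exactly which subfolder URL each old subdomain URL should 301 to before flipping the switch. A minimal sketch of that mapping (the helper name is hypothetical, and the domains are the ones discussed in this thread):

```python
from urllib.parse import urlparse, urlunparse

def subdomain_to_subfolder(url, subdomain="blog", root="godelta.com"):
    """Map a subdomain URL to the equivalent subfolder URL on the root domain.

    Illustrative helper for building a redirect map when moving
    subdomain content under the main domain.
    """
    parts = urlparse(url)
    if parts.netloc != f"{subdomain}.{root}":
        return url  # not on the subdomain being moved; leave unchanged
    new_path = f"/{subdomain}" + (parts.path if parts.path else "/")
    return urlunparse((parts.scheme, root, new_path,
                       parts.params, parts.query, parts.fragment))

print(subdomain_to_subfolder("https://blog.godelta.com/post?x=1"))
```

Feeding a crawl export of the old subdomain through a helper like this gives a one-to-one redirect map, which keeps existing backlinks pointing at live URLs after the move.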
-
Hi David,
Yes, it could be impacting your SERPs, but without seeing what the issues are it's very hard to give a full answer.
If you're building a new site on HubSpot, it may be a good idea to move everything onto the main domain.
It's better to control everything that is linked to your domain.
-
Hi David,
Probably not the answer you want, but... it could be.
It's almost impossible to say for sure without doing a full audit of the site to see exactly what the issues are with the subdomain.
Danny
-
Aaron,
The subdomain promotionalproducts.godelta.com is a third-party shopping cart website for our promotional products business that requires its own domain or a subdomain.
When we began using HubSpot, we created a subdomain for our blog because it is hosted on the HubSpot COS while our website is WordPress. We are in the process of developing our website on the HubSpot COS.
The question I am looking to answer is: is the third-party subdomain hurting our SERPs, given the Moz report that shows 696 high-priority issues?
-
Hi David,
Can I ask why you have a third-party subdomain?
I always believe having everything clean is the best way to go, so try moving everything under one domain that you control. This way you can get the best outcomes when it comes to SERPs. For our part, we always stay away from subdomains and use subfolders for blogs and so on.
Here is a good link from Moz which talks about best practice.
https://moz.com/learn/seo/domain
Hope this helps.
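As a concrete sketch of the "move to subfolders" advice above: if the old blog subdomain stays on a server you control, a permanent redirect preserves link equity during the move. This assumes an Apache host with mod_rewrite enabled; the hostnames are the ones mentioned in this thread, and the exact rules would depend on your hosting setup:

```apacheconf
# Hypothetical .htaccess for the server answering blog.godelta.com
# 301-redirects every blog-subdomain URL to the same path under godelta.com/blog/
RewriteEngine On
RewriteCond %{HTTP_HOST} ^blog\.godelta\.com$ [NC]
RewriteRule ^(.*)$ https://godelta.com/blog/$1 [R=301,L]
```

A 301 (permanent) redirect tells search engines the content has moved for good, so rankings and backlinks consolidate on the new URLs rather than splitting across two hosts.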