Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Old subdomains - what to do SEO-wise?
-
Hello,
I wanted the community's advice on how to handle old subdomains.
We have https://www.yoursite.org. We also have two subdomains directly related to the main website: https://www.archive.yoursite.org and https://www.blog.yoursite.org.
As these pages are not actively updated, they trigger a large number of errors in the site crawl (missing meta descriptions and much more). We have no particular intention of keeping them up to date in terms of SEO. What do you think is the best way of handling these?
I considered de-indexing, but the content of these pages is still relevant and may be useful - yet it is not up to date and never will be again.
Many thanks in advance.
-
Thanks for replying Will.
You have mentioned a few ways to deal with this, and they all seem to point to the fact that this should not really be a high-priority issue for us at the moment - especially if sub-domains do not have a major effect on the main site. (I'm not sure it's even worth us deindexing, to be honest, as the content may be relevant to some people, and we can just allow Google to continue indexing it as it is.)
All of these considerations point to the same conclusion: we won't be doing any SEO-related work on these pages.
So how do I set up Moz to ignore these two sub-domains and only show crawl errors related to the main site? We just don't want these pages to be crawled by Moz at all, given that we won't be doing any work on them.
Thanks
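One low-effort way to do this, assuming each subdomain serves its own robots.txt file, is to block Moz's Site Crawl bot, rogerbot, at the subdomain level - rogerbot respects robots.txt directives. A minimal sketch:

    # robots.txt on archive.yoursite.org and blog.yoursite.org
    # Blocks Moz's crawler only; Google and other bots are unaffected
    User-agent: rogerbot
    Disallow: /

Google will continue to crawl and index the subdomains as before, since the rule names rogerbot alone.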
-
Hi there. Sorry for the slow follow-up on this - there was an issue that meant I didn't get the email alert when it was assigned to me.
There is increasing evidence that culling old or poorly performing content from your site can have a positive effect, though I wouldn't be particularly confident that this would transfer across sub-domains to benefit the main site.
In general, I suspect that most effort expended here would be better placed elsewhere, so I would angle towards the least-effort option.
I think the "rightest" long-term answer, though, would be to move the best content to the main domain (with accompanying 301 redirects) and remove the remainder with 410 status codes. This should enable you to focus on the most valuable content and get the most benefit from it, while avoiding having to keep expending effort on the stuff that is no longer useful. The harder this is, though, the less inclined I'd be to do it - and the more likely I'd be to just deindex the lowest-quality stuff and keep whatever benefit remains from the better content for as long as it is a net positive, with an eye to eventually removing it all.
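A rough sketch of what both options could look like in Apache .htaccess terms - the paths here are hypothetical, purely for illustration:

    # .htaccess on archive.yoursite.org - illustrative paths only

    # Long-term option: move the best content, remove the rest
    # Permanently redirect pages worth keeping to the main domain
    Redirect 301 /best-guide/ https://www.yoursite.org/best-guide/
    # Return 410 Gone for pages being removed outright
    Redirect gone /outdated-page/

    # Lighter-touch option: deindex a low-quality page without removing it
    # (requires mod_headers)
    <Files "old-report.html">
        Header set X-Robots-Tag "noindex"
    </Files>

The 301s pass most of the old pages' equity to their new homes, the 410 tells Google the removal is deliberate (which tends to drop pages from the index faster than a 404), and the X-Robots-Tag header deindexes a page while leaving it live for visitors who still have the link.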
Hope that helps - I don't think it's a super clear-cut situation unfortunately.
Related Questions
-
Sitewide nav linking from subdomain to main domain
I'm working on a site that was heavily impacted by the September core update. You can see in the attached image the overall downturn in organic traffic in 2019, with a larger hit in September bringing Google organic traffic down around 50%. There are many concerning incoming links, ranging from 50-100 obviously spammy porn-related websites to just plain old unnatural links. There was no effort to purchase any links, so it's unclear how these were created. There are also thousands of incoming external links (most without nofollow and with similar or identical anchor text) from yellowpages.com. I'm trying to get this fixed with them and have added it to the disavow file in the meantime. I'm focusing on internal links as well, with a more specific question: if I have a sitewide header on a blog located at blog.domain.com that links to various sections on domain.com without nofollow tags, is this a possible source of the traffic drops and algorithm impact? The header with these links is on every page of the blog on the previously mentioned subdomain. More generally, any advice as to how to turn this around? The website is in the travel vertical.
White Hat / Black Hat SEO | ShawnW
-
Opinion on Gotch SEO methods & services
I would love to get your take on Gotch SEO. I am gearing up to do link building for a site in the next several months and have been reading up from sources other than Moz in preparation. (I need to re-read Moz's guide too, but I already read it last year.) I'm reading Gotch SEO's main link-building method articles right now and am wondering what you all think. Do you think they have a good approach and are generally reliable? Likewise, has anyone used their service for getting a link? What was your experience? Or if you haven't used the service, any quick takes on it?
White Hat / Black Hat SEO | scienceisrad
-
Bad for SEO to have two very similar websites on the same server?
Is it bad for SEO to have two very similar sites on the same server? What's the best way to set this up?
White Hat / Black Hat SEO | WebServiceConsulting.com
-
Why do expired domains still work for SEO?
Hi everyone. I've been running an experiment for more than a year to see whether it's possible to make use of expired domains. I know it's considered black hat, but like I said, I wanted to experiment - that is what SEO is about. What I did was buy domains that had just expired, immediately add content on a WP setup, fill it with content relevant to the expired domain, and then start building links from these domains to other relevant sites. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains) This is nothing new, and SEOs have been doing it for a long time. There are a lot of rumors around the SEO world that domains become worthless after they expire. But after trying it out for more than a year and with about 50 different expired domains, I can conclude that it DOES work, 100% of the time. Some of the domains are of course better than others, but I cannot see any signs that the expired domains or the sites I link to have been punished by Google. The sites I'm linking to rank great ONLY with those links 🙂 So to the question: WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links to that domain after the expiry date? Google is well aware of this problem, so what is stopping them? Is there anyone here who knows how this works technically?
White Hat / Black Hat SEO | Sir
-
Can I 301 redirect old URLs to staging URLs (ex. staging.newdomain.com) for testing?
I will temporarily remove a few pages from my old website and redirect them to a new domain, but on a staging subdomain. Once the redirection is confirmed to work, I will remove the redirection rules from my .htaccess and bring the removed pages back live. Thanks in advance!
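For a temporary test like this, a 302 (temporary) redirect is a safer signal than a 301, since it tells crawlers not to treat the move as permanent. A minimal .htaccess sketch, with made-up paths:

    # Temporary redirect to staging while testing; remove the rule when done
    Redirect temp /old-page/ https://staging.newdomain.com/old-page/

Using 301s here risks search engines treating the staging URLs as the pages' permanent new home.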
White Hat / Black Hat SEO | esiow2013
-
Asynchronous loading of product prices bad for SEO?
We are currently looking into improving our TTFB on our e-commerce site. A huge improvement would be to asynchronously load the product prices on the product list pages. The product detail page - on which the product is ordered - will be left untouched. The idea is that all content like product data, images, and other static content is sent to the browser first (first byte). The product prices depend on a set of user variables like delivery location, VAT inclusive/exclusive, etc., so they would be requested via an Ajax call to reduce the TTFB. My question is whether Google considers this black hat SEO or not.
White Hat / Black Hat SEO | jef2220
-
Separate domain name for a subdomain?
I just created a subdomain to help our main TLD website. I was wondering if it's smart to create a separate TLD for this subdomain, set up a forward, and build links to it. The reason I'm considering it is that it would be easier for people to remember than typing in subdomain.maindomain.com. But I don't want the main website to suffer, since the purpose of creating this subdomain and its content is to help the main domain. Any input on this? Thank you.
White Hat / Black Hat SEO | FinanceSite
-
How Is Your Approach Towards Adult SEO?
I would like to know how SEOmoz community members approach adult SEO. How do you approach a project when you get one (if you do that kind of work, that is)? If you don't do adult SEO, why not? Is it because it's much more difficult than normal SEO, or because you don't want to associate yourself with that industry?
White Hat / Black Hat SEO | ConversionChamp