301 vs 410 for a subdirectory that was moved to a new domain, 2 years later
-
Hi all,
I've read a lot about 301s vs 404s and 410s, but our case is fairly unusual, so I decided to get some feedback from you.
Both websites are travel-related, but we had one destination as a subdirectory of the other (two neighboring countries, where more than 90% of the business related to the 'main' destination and the rest to the 'satellite'). This was obviously bad practice, so we decided to move the satellite destination to its own domain. Everything was done 2 years ago, and we opted for 301s to the new domain since we had some good links pointing to the satellite content. (All of the moved content is destination-specific and still relevant.)
A few weeks back we noticed that Google still shows our subdirectory in a specific 'site:' search, and looking further into it, we realized we still get traffic for the satellite destination through the main website via links acquired before the move. Not a lot of hits, but they still occur sporadically. A decision was made (rather hastily) to 410 the pages and see if that would stop the satellite subdirectory pages from showing in Google searches. So, 3 weeks in, 410 errors are climbing in GWMT, but the satellite subdirectory still shows in Google searches. One part of the team is pushing to put the 301s back in place. The other part is concerned about the 'health' of the main website, as those pages are not relevant to it, and wants them gone.
What would you do?
-
Google is adding and removing URLs from its index fairly slowly right now, and it's not uncommon for changes to take several weeks to work their way through the index, especially for site: searches. This is very annoying (even more so for people who are trying to launch brand-new sites), but not a huge deal since, to Laura's point, these URLs are most likely not showing up for any real searches; they just haven't filtered out of the index yet. I would give it another week or two and see what happens. You may also want to do a Fetch + Submit in Search Console for a few of the subdirectory URLs, to make sure that Google revisits them and registers that they are 410s now. If they've been redirecting for 2 years, Google may just not be crawling them that frequently.
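Before waiting on Google, it's also worth confirming the origin (and any CDN in front of it) really does serve 410s now, since a stale cache can quietly keep answering 200 or the old 301. A minimal sketch, assuming Python and a hypothetical URL; the status check needs network access, but the mapping from status code to expected crawler behaviour stands on its own:

```python
import http.client
from urllib.parse import urlparse

def expected_effect(status: int) -> str:
    """Rough guide to how Googlebot treats common status codes."""
    if status in (301, 308):
        return "permanent redirect: ranking signals consolidate on the target URL"
    if status in (302, 307):
        return "temporary redirect: the old URL may stay indexed longer"
    if status == 404:
        return "not found: dropped from the index after repeated recrawls"
    if status == 410:
        return "gone: dropped from the index, usually a bit faster than a 404"
    return "no removal signal"

def status_of(url: str, timeout: float = 10.0) -> int:
    """Fetch the raw status code WITHOUT following redirects,
    so a lingering 301 is reported as 301, not as its target's code."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=timeout)
    try:
        conn.request("HEAD", parts.path or "/")
        return conn.getresponse().status
    finally:
        conn.close()
```

Running `expected_effect(status_of("https://example.com/satellite/page-1"))` (a made-up URL) against a handful of the old subdirectory URLs tells you whether what Google sees on its next crawl matches what the team intended.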
-
Hi Laura,
Thank you for your feedback. This was a problem for the first couple of months, but we don't have the issue of the main domain showing up in SERPs anymore.
What is interesting is that although we applied 410s to these pages three weeks ago, they still show.
-
Just because Google shows the sub URL in a "site:sub.example.com" search does not necessarily mean it shows the sub URL for natural searches. Do you have evidence that Google shows the sub page for normal searches? You may be making an issue out of something that isn't much of an issue.
I suspect you had it right the first time, with the 301 redirects from the main site's pages to the satellite pages.
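If the team does go back to 301s, the subdirectory-to-new-domain rule is a one-liner in Apache. A minimal sketch, assuming Apache with mod_rewrite enabled and hypothetical domain names (main-site.example holding the old /satellite/ subdirectory, satellite-site.example as its new home):

```apache
# In main-site.example's .htaccess (per-directory context, so no leading
# slash in the pattern). The names below are hypothetical stand-ins.
RewriteEngine On
# Map every old subdirectory URL to the same path on the new domain,
# preserving the rest of the path, as a permanent (301) redirect.
RewriteRule ^satellite/(.*)$ https://satellite-site.example/$1 [R=301,L]
```

With this in place, a request for https://main-site.example/satellite/hotels/ would 301 to https://satellite-site.example/hotels/; the `R=301` flag makes the redirect permanent and `L` stops further rule processing.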
Related Questions
-
1 site on 2 domains (interesting situation, expert advice needed)
Dear all, I have read many posts about having the same content on 2 different domains and how to combine the two to avoid duplicate content. However, the history of my two domains makes this question really difficult.

Domain 1: chillispot.org ( http://www.opensiteexplorer.org/links?site=chillispot.org ). The original site was on this domain, started 9 years ago. At that time the owner of the domain was not me. The site was very popular, with lots of links to it. Then, after 5 years of operation, the site closed. I managed to save the content to:

Domain 2: chillispot.info ( http://www.opensiteexplorer.org/links?site=chillispot.info ). The content I put there was basically the same. Many links on external sites were changed to chillispot.info when people noticed the change, but lots of links are still unchanged and pointing to the .org domain. The .info domain is doing well in search engines (for example for the keyword 'chillispot').

Now I have managed to buy the original chillispot.org domain. As you can see, the domain authority of the .org domain is still higher than that of the .info one, and it has more valuable links. The question is: what would be the best approach to offer content on both domains without being penalized by Google for duplicate content? Which domain should we keep the content on: the original .org one, which is still the stronger domain but has not been active for several years, or the .info one, which has hosted the content for several years and is doing well in search engines? And then, after we decide this, what would be the best approach to send users to the real content? Thanks for the answers!
Intermediate & Advanced SEO | Fudge
-
Mass 301 redirect from a sub-domain - using Joomla or htaccess
What is the best way to mass-redirect old URLs: listing them in .htaccess? We are looking to use Joomla as a CMS, transferring a blog from a sub-domain to the main site, and we want to 301 all the sub-domain blog posts. Any ideas?
Intermediate & Advanced SEO | JohnW-UK
-
How to handle the 301 of a complete domain at the URL level
We will be shutting down an old website with many (good) links, since the site no longer has strategic relevance. We do have many other sites, but none of them has exactly the same content/topic. Nonetheless, I would like to keep the juice and redirect the site to another, newer project. However, I want to redirect certain URLs of the old site to different domains, depending on which content matches best with the alternative newer site. Does this make sense? Or would you just redirect the whole domain to one other domain, although they don't really have the same topic? And how would you handle the URL redirects if the old site has more than 50k URLs? Because that is the case. Thanks for any advice.
Intermediate & Advanced SEO | Windex
-
Should I buy a .co domain if my preferred .com and .co.uk domains are taken by other companies?
I'm looking to boost my website's ranking and drive more traffic to it using a keyword-rich domain name. I want to have my nearest city followed by the keyword "seo" in the domain name, but the .co.uk and .com have already been taken. Should I take the plunge and buy the .co at a higher price? What options do I have? Also, while we're on domains and URLs: is it best to separate keywords in URLs with an underscore (_) or a hyphen (-)? Many thanks for any help with this matter. Alex
Intermediate & Advanced SEO | SeoSheikh
-
I run an (unusual) clothing company. And I'm about to set up a version of our existing site for kids. Should I use a different domain? Or keep the current root domain?
Hello. I have a burning question which I have been trying to answer for a while. I keep getting conflicting answers and I could really do with your help. I currently run an animal fancy dress (onesie) company in the UK called Kigu through the domain www.kigu.co.uk. We're the exclusive distributor for a supplier of Japanese animal costumes, and we've been selling directly through this domain for about 3 years. We rank well across most of our keywords and get about 2,000 hits each day.

We're about to start selling a kids' range: miniature versions of the same costumes. We're planning on doing this through a different domain, www.kigu-kids.co.uk, which has been live for about 3-4 weeks. The idea behind keeping them on separate domains is that it is a different target market, and we could promote the kids' site separately without having to bring people through the adult site. We want to keep the adult site (or at least the homepage) relatively free from anything kiddy, as we promote fancy dress events in nightclubs and at festivals for over-18s (don't worry, nothing kinky) and we wouldn't want to confuse that message.

I've since been advised by an expert in the field that we should set up a redirect from www.kigu-kids.co.uk and house the kids' website under www.kigu.co.uk/kids, as this will be better from an SEO perspective, and that if we don't, we'll only be competing with ourselves. Are we making a big mistake by not using the same root domain for both, and thus not getting the most out of the link juice for the kids' site? And if we do decide to switch to www.kigu.co.uk/kids, is it a mistake to still promote www.kigu-kids.co.uk (redirecting) as our domain online? Would these be wasted links? Or would we still see the benefit? Is it better to combine, or are two websites better than one? Any help and advice would be much appreciated. Tom.
Intermediate & Advanced SEO | KIGUCREW
-
7-year-old domain sandboxed for 8 months: wait or make a domain change?
Hello folks. The question is: if a 7-year-old domain is being sandboxed due to a "notice of unnatural links to website", does it make sense to make a domain change (a 301 permanent redirect plus a "domain change" under Google Webmaster Tools) to another, aged(!) domain name? The website has been sandboxed for over 8 months already, and there is no way to do anything about those "unnatural" links. Any suggestions?
Intermediate & Advanced SEO | Ferray
-
301 - do I change old links once the 301 is in place?
Hey all, I'm about to set up a 301 on a website that has pretty good SEO rankings, and I have the ability to change all the old inbound links so that they point to the new site instead of the old one. Should I leave them pointing to the old site that has the 301 on it, or change all the old inbound links to the new domain name? Which has better SEO value? Thanks for helping, Anthony
Intermediate & Advanced SEO | Grenadi
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a number of robots.txt restrictions in place to prevent Google from crawling pagination links and other parameter-based variants of our search results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed up the rate at which Google could get our millions of pages back in the index, by focusing attention/resources on the right pages.

The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us:
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that (a) we're blocking the flow of link juice, and that (b) by preventing Google from crawling the full depth of our search results (i.e. pages > 1), we may be making our site wrongfully look 'thin'. With respect to (b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'. Thoughts? Kurus
Intermediate & Advanced SEO | kurus
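For context, the kind of robots.txt restrictions described in that last question typically look something like this. A hypothetical sketch: the paths and parameter names are invented, not the site's actual rules (note that Googlebot supports the `*` wildcard shown here, though it is not part of the original robots.txt standard):

```
User-agent: *
# Keep crawlers out of paginated search results (page 2 and beyond)
Disallow: /search/*?page=
# ...and out of sort-order / filter variants of the same result sets
Disallow: /search/*?sort=
Disallow: /search/*?order=
```

The trade-off the question raises is real: a URL blocked this way can still end up indexed from external links, but Google never fetches it, so it cannot pass link equity onward through its outbound links.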