What to do about old URLs that don't logically 301 redirect to anything on the current site?
-
Mozzers,
I have changed my site's URL structure several times.
As a result, I now have a lot of old URLs that don't logically redirect to anything on the current site.
I started out 404-ing them, but it seemed like Google was penalizing my crawl rate, AND it wasn't removing them from the index even after crawling them several times. There are way too many (>100k) to use the URL removal tool, even at a directory level.
So instead I took some advice and changed them to 200, but with a "noindex" meta tag, and set them to not render any content. I get fewer errors, but I now have a lot of pages that do this.
Should I (a) just 404 them and wait for Google to remove them, (b) keep the 200 + noindex, or (c) is there something else I can do? A 410, maybe?
Thanks!
-
"So instead I took some advice and changed them to 200, but with a 'noindex' meta tag, and set them to not render any content. I get fewer errors, but I now have a lot of pages that do this."
I would not recommend keeping it that way. You could mass redirect them to the sitemap page if they are passing PR and/or some traffic and there is no other logical place to point them.
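For illustration only, here's a minimal sketch of what a pattern-based mass redirect could look like in a Python/Flask app. The /old-catalog/ path and the /sitemap target are invented placeholders, not your actual URLs, so adapt them to whatever your retired URL structure really is:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical: every URL under a retired /old-catalog/ tree gets one
# permanent redirect rule instead of 100k individual mappings.
@app.route("/old-catalog/<path:slug>")
def legacy_redirect(slug):
    return redirect("/sitemap", code=301)  # 301 = moved permanently

if __name__ == "__main__":
    app.run()
```

In practice you'd usually do this with a rewrite rule in the web server config rather than in application code, but the principle is the same.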
404s are not really something that can hurt you, provided they are coming from external sources and you aren't linking from pages on your own site to dead pages on your own site. If you are, fix those internal links at the source.
-
I don't think 404 errors hurt your site. With that many pages, Google is most likely crawling your site a lot anyway. Have you set your crawl frequency in your sitemap? On bigger sites that get frequent updates, we set the crawl frequency to daily rather than weekly.
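As a rough sketch of the sitemap side (assuming Python, and a urls list you supply yourself): changefreq is just a per-URL hint in the sitemaps protocol, so it can be stamped onto every entry when the sitemap is generated. Keep in mind search engines treat it as a hint at best:

```python
from xml.sax.saxutils import escape

def sitemap_xml(urls, changefreq="daily"):
    # Emit a minimal sitemap with a crawl-frequency hint on each URL.
    entries = "".join(
        f"<url><loc>{escape(u)}</loc>"
        f"<changefreq>{changefreq}</changefreq></url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        + entries + "</urlset>"
    )

print(sitemap_xml(["https://www.example.com/"], changefreq="daily"))
```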
If possible, see if there are any top-level directories you can submit a URL removal request for. Hopefully that can speed up the process of getting the URLs removed, because it can take Google a long time: after changing websites, we still saw 404 errors six months later, even after submitting the URL removal request.
Another option is to have the pages return a 410 rather than a 404. A 410 tells the search engine that the page is gone and will not be coming back. If you are using some kind of cart system or CMS, there may be a way to apply the code to a large number of pages at once rather than hand-coding 100k pages (see the sketch at the end of this post). From the HTTP spec:
"410 Gone
The requested resource is no longer available at the server and no forwarding address is known. This condition is expected to be considered permanent. Clients with link editing capabilities SHOULD delete references to the Request-URI after user approval. If the server does not know, or has no facility to determine, whether or not the condition is permanent, the status code 404 (Not Found) SHOULD be used instead of 410 (Gone). This response is cacheable unless indicated otherwise."

Worst case scenario, you could set them to noindex or just leave them be. Even if they don't lead anywhere logically, they could still bring you traffic. Or redirect them to the closest equivalent on the current site.
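To make the 410 option concrete, here's a minimal, hypothetical sketch (again Python/Flask; the /old-products/ pattern is a made-up stand-in for whatever the dead URL structure actually looks like, and most carts/CMSes have an equivalent hook or rewrite rule):

```python
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical: anything under a retired /old-products/ tree answers
# 410 Gone, so crawlers treat it as permanently removed.
@app.route("/old-products/<path:slug>")
def gone(slug):
    abort(410)  # Flask/werkzeug turns this into a "410 Gone" response

if __name__ == "__main__":
    app.run()
```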
-
JC,
When you say you "...started out 404-ing them... seemed like Google was penalizing my crawl rate...", I have not seen evidence that Google even algorithmically has any real issue with 404s. If your site has 500K pages and 100K are 404'd, I do not think it would be a problem for Google per se. (You might have a searcher problem if these were pages that were bookmarked, had lots of links, etc.) My caution would be that if a lot of pages on your site still link to the 404'd pages, you could run into UX issues.
For me, I would go with the 404s. I think they will get removed over time.
Best
-
When necessary, redirect relevant pages to closely related URLs. Category pages are better than a general homepage.
If the page is no longer relevant, receives little traffic, and a better page does not exist, it's often perfectly okay to serve a 404 or 410 status code.
-
You could redirect them to something even remotely relevant, even if it's the homepage at the end of the day. Whatever you do is going to take time and give you some sort of headache.
What would best suit a user who might land on an old link or somehow get to the page? That would be the best way to find a solution. A good soft 404 or redirect tends to help here.
Best of luck though.