Are 17000+ Not Found (404) Pages OK?
-
Very soon, our website will undergo a rapid change that will result in us removing 95% or more of our old pages. (Right now, our site has around 18,000 pages indexed.)
It's changing into something different (B2B from B2C), and hence our site design, content, etc. will change.
Even our blog section will have more than 90% of its content removed.
What would the ideal scenario be?
- Remove all pages and let those links be 404 pages
- Remove all pages and 301 redirect them to the home page
- Remove all unwanted pages and 301 redirect them to a separate page explaining the change (although it wouldn't be that relevant, since our audience has completely changed). I doubt this would be ideal, since at some point we'd need to remove this page as well and do another round of redirects.
-
Mohit,
Tom's advice will help you determine which pages are worth redirecting and which should just go to a 404 page (which should be customized instead of the browser/host default, and should also return a 404 response code in the HTTP header!). My guess is that pages with links only from scraper sites aren't going to pass the tests laid out by Tom and thus would just go to a 404 page. However, any that have decent external links would fit the criteria and would be candidates for a 301 redirect.
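One quick way to sanity-check that response-code point: crawl each removed URL and make sure the server isn't serving a customized error page with a 200 status (a "soft 404"). A minimal sketch, where the status and body are whatever your crawl returned and the marker phrases are purely illustrative:

```python
# A removed page should return a real 404 status. A customized error page
# served with status 200 is a "soft 404"; the marker phrases below are
# illustrative examples, not an exhaustive list.
def is_soft_404(status_code: int, body: str) -> bool:
    looks_missing = any(
        phrase in body.lower()
        for phrase in ("not found", "page does not exist", "no longer available")
    )
    return status_code == 200 and looks_missing
```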
-
Just to add a little to this great reply...
Here is how I would determine if it was worth my time to keep some of the old pages.
If the industry is the same but the end user is different, I would make EVERY attempt to keep those old pages. AuthorRank will matter in the future; if you can tie that content to a particular rel=publisher, then I think it will be totally worth the time.
If, however, the information has nothing to do with the industry, then I wouldn't even consider taking the time to figure all of this out. I would have a kick ass 404 page to help people find your new stuff though.
Remember too that when you 301 redirect, you do in fact lose some "link juice" (I really hate that phrase). So if the incoming links are of little to no value, a 301 will pass along even less.
-
Hi Tom, thank you for your advice.
The thing is, we don't want to retain those users. They are not going to serve our cause anymore. (We used to spend thousands of dollars every month on server costs just to keep up with the load; now we are cutting that down, so unwanted users are not really something we want, as they would increase the load.)
I'll surely follow your advice on OSE. The thing is, we have a lot of links to these pages from scraper sites. I am not sure they're worth keeping, though.
-
Hi there
17,000 is quite a lot. I would look at maybe redirecting some of the URLs and I would do this based on certain criteria.
First of all, it helps to have a complete list of your current URLs. Screaming Frog is a great tool for this and is free.
Once you have your URLs, go into your analytics data and see which pages are attracting users. Take a sample size of about 2-3 months. If you're using Google Analytics, click on Traffic Sources -> Sources -> All Traffic on the left-hand side.
When the dashboard loads, next to "Primary Dimension" click Other, and from the drop-down menu choose Traffic Sources, then Landing Page.
Any page with more than 5 or 10 visitors could be one worth redirecting. If these are pages that visitors might frequently use to get to your site, redirecting them helps avoid interrupting their user journey; a 404 might put them off and send them elsewhere.
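That threshold step can be sketched like this, assuming you've exported the landing-page report as (URL, visits) rows; the 5-visit cutoff is the same rough rule of thumb, not a hard rule:

```python
# Sum visits per landing page and keep pages above a visit threshold
# as 301-redirect candidates.
def redirect_candidates(rows, min_visits=5):
    totals = {}
    for url, visits in rows:
        totals[url] = totals.get(url, 0) + visits
    return sorted(url for url, total in totals.items() if total > min_visits)
```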
Next, I'd look at which pages you might want to save to keep your SEO "strength". Put your URL into Open Site Explorer and, once it's done, click on "Top Pages". We're interested in the "Inbound Links" column here. Export the file to CSV, then sort the URL list in Excel by the inbound link total. You can filter out the pages with fewer links here; for instance, you could remove the pages with 3 inbound links or fewer. It's a general way of doing things and isn't foolproof, but you will be left with a list of pages that could be getting decent PageRank/link equity. Manually check those pages and their backlinks, and if you think they're acceptable, make sure you put in a 301 redirect.
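The export-and-filter step can also be scripted instead of done in Excel. A sketch, assuming hypothetical "URL" and "Inbound Links" column headers (the real export's headers may differ):

```python
import csv
import io

# Parse a top-pages CSV export and return URLs at or above a link cutoff,
# most-linked first. Column names here are assumptions about the export.
def pages_worth_checking(csv_text, min_links=4):
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = [(row["URL"], int(row["Inbound Links"])) for row in reader]
    return [url for url, links in sorted(rows, key=lambda t: -t[1]) if links >= min_links]
```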
Anything that doesn't match either of these criteria I would leave as a 404. You may be left with a lot, but Google knows that 404s are par for the course and won't penalise you for them. Check out this Webmasters blog link.
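Putting both criteria together, the per-URL decision might look like this sketch; the visit and link cutoffs are the illustrative numbers from above, not fixed rules:

```python
# Decide per URL: 301 if it draws traffic OR has decent inbound links,
# otherwise let it 404 (with a helpful custom 404 page).
def plan_for(url, visits, inbound_links, min_visits=5, min_links=4):
    if visits > min_visits or inbound_links >= min_links:
        return "301"
    return "404"
```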
Hope this helps with your decision making!