Increase in 404s
-
Hi,
We recently did a website upgrade and, as a result, the URL structure changed. There are now 140 404s showing in Google Webmaster Tools. Just wondering about best practice.
-
What should be done with pages that are definitely not being used any more? For example, we used to have .com/why-choose-us* and this section is gone for good. Should I redirect to the closest page on the site, such as .com/about, or just let it drop out of the index altogether?
-
We also had lots of pages on similar topics that have now been condensed into one or two pages on the new site. Is it OK to 301 redirect 10+ 404s to one page on the new site? These would only be redirected to closely related topics, so from a user's point of view the content they land on will be fine. (Is this bad from a search engine point of view, i.e. pointing lots of old 404s to one particular page on the new site?)
Your input is welcomed.
Thanks,
-
-
OK, thanks guys for your input.
As per your suggestions, yes, I could redirect multiple 404s to one page on the new site, as the content would be relevant to the end user. As for pages that have no relevance any more, I think I will just allow them to drop from the index.
Cheers for the link also.
Regards,
Glen
-
A one-to-one redirect is usually best, but as long as the page you redirect to is the most relevant alternative, you are fine. If the original page that is now 404'd was receiving essentially no traffic and wasn't ranking, you can likely let the 404 stick and it will drop from the index (assuming it was there in the first place). If one new page is relevant for 2, 3, 4+ older pages and there's nowhere more relevant to redirect them to, then it is perfectly fine to 301 all of those to the same place. What you don't want to do is blindly bulk redirect everything one level up, or to your homepage, without doing your research first. You want to make sure that the new URL you're pointing to serves your customers/visitors as well as it possibly can.
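To make that concrete, here is a minimal .htaccess sketch (assuming an Apache server that allows .htaccess overrides; the paths below are placeholders based on the examples in this thread, so swap in your real old and new URLs):

    # One-to-one: an old page with a direct equivalent on the new site
    Redirect 301 /old-services-page /services

    # Many-to-one: several old topic pages condensed into a single new page
    Redirect 301 /widget-guide-part-1 /widget-guide
    Redirect 301 /widget-guide-part-2 /widget-guide
    Redirect 301 /widget-guide-faq /widget-guide

    # A page that is gone for good with no relevant replacement:
    # add no rule at all (it keeps returning 404) or explicitly return 410 Gone
    Redirect gone /retired-page-with-no-replacement

The 301s send visitors and most of the link equity through to the new pages, while the 410 is generally read as a deliberate "removed on purpose" signal, so those URLs tend to drop out of the index at least as quickly as a plain 404.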
And as Simon said in his response, Cyrus' post on redirects is a great resource for answers on what you should, or shouldn't, be doing.
-
Redirecting multiple pages to one page is OK so long as there is relevance; after all, you want to send your users somewhere that is of use to them. What you don't want to do is just point all 404s to one page, such as your home page.
I think the whole redirect issue was expertly covered by Cyrus recently in this great blog post.
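One addition: if a whole retired section (like the old /why-choose-us* pages mentioned above) really does have a single closest match on the new site, a pattern-based rule saves writing dozens of individual lines. A rough sketch, again assuming Apache with mod_rewrite enabled and using /about purely as an illustrative target:

    RewriteEngine On
    # Anything in or under the retired why-choose-us section goes to the closest relevant page
    RewriteRule ^why-choose-us(/.*)?$ /about [R=301,L]

The same caveat applies here: only do this where the target page genuinely covers what those old pages did; a catch-all rule pointing everything at the home page is exactly the pattern to avoid.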