How can I avoid 404 errors when taking pages off a site?
-
So...
We are running a blog that was supposed to have great content.
After working on its SEO for a while, I discovered there is too much keyword stuffing and some spammy WordPress SEO tricks that were supposed to make it rank better.
In fact, that worked, but I'm not willing to risk getting slapped by Google's Panda.
So we decided to restart our blog from zero and make a better attempt.
The problem: every page was already ranking in Google.
SEOmoz hasn't crawled the site yet, but I'm fairly sure the crawl will report a lot of 404 errors.
My question is: can I avoid these errors with some tool in Google Webmaster Tools (sitemaps, for example), or should I set up rel=canonical tags or 301 redirects?
Will Google penalize me for this? It seems obvious to me that the answer is YES.
Please help.
-
Thanks for your help.
-
There won't be any penalty if the 404 pages are redirected to other relevant pages. At the same time, Google prefers to see a 404 for pages that are genuinely no longer available.
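If you do redirect removed posts to their closest replacements, a 301 rule in the site's .htaccess is the usual route on a WordPress/Apache setup. A minimal sketch, assuming Apache with mod_alias enabled; the paths and domain are placeholders, not real URLs from this site:

```
Redirect 301 /old-post/ https://www.example.com/new-post/
# Pages that are gone for good, with no close replacement, can simply be left
# to return 404 rather than being redirected somewhere irrelevant.
```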
A couple of steps may reduce 404s:
1. Check on-page/internal links to confirm they point to working URLs, and update any that don't.
2. If your pages are linked from other sites, check the URLs used there.
3. If you use free or paid directories, paid campaigns, etc., check that the URLs submitted are still valid.
Related Questions
-
What's worse: 404 errors or a huge .htaccess file?
We have changed our site architecture pretty significantly and now have many fewer pages (albeit with more robust content and focused linking). My question is: what should I do about all the 404 errors (keep in mind, I am only finding these in Bing Webmaster Tools, not Moz or GWT)? Is it worse to have all those 404 errors (hundreds), or to have a massive .htaccess file for pages that are only getting hits from the Bing crawler? Any insight would be great. Thanks
Technical SEO | CleanEdisonInc
-
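One way to keep hundreds of redirects out of .htaccess itself is a rewrite map stored in a separate file. A hedged sketch for the question above, assuming Apache with mod_rewrite; note that RewriteMap only works in the server or virtual-host config, not in .htaccess, and the file path and URLs are placeholders:

```
RewriteEngine On
# redirect-map.txt holds one "old-path new-url" pair per line, e.g.:
#   /old-article/   https://www.example.com/new-article/
RewriteMap redirects txt:/etc/apache2/redirect-map.txt

# Only rewrite when the requested path has an entry in the map;
# everything else is left alone (and can 404 as normal).
RewriteCond ${redirects:$1} !=""
RewriteRule ^(/.*)$ ${redirects:$1} [R=301,L]
```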
404 errors: WordPress adds the domain automatically to the end of the pages. Why?
Hello guys, I'm using WordPress and Yoast to help me improve my SEO. Everything went well until today, when Moz found 404 errors while crawling the website: the domain of my website is appended to the end of 12 URLs. For example:
www.domain.com/service-1/www.domain.com
www.domain.com/contact-page/www.domain.com
Do you have any idea where that comes from? Thanks, Alex
Technical SEO | abonnisseau
-
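A common cause of URLs like www.domain.com/contact-page/www.domain.com is an internal link written without the http:// or https:// scheme (e.g. href="www.domain.com/..."), which browsers and crawlers resolve as a path relative to the current page. That diagnosis is an assumption for this question, not something confirmed in the thread. A rough Python sketch to hunt for such links on a page; the page URL is a placeholder and the third-party requests library is assumed to be installed:

```python
"""Scan one page for scheme-less links such as href="www.domain.com/...",
which get resolved as relative paths and can produce doubled-domain URLs."""
from html.parser import HTMLParser
import requests

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.suspect_hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            # A href starting with "www." has no scheme and no leading "/",
            # so it is treated as a path relative to the current page.
            if name == "href" and value and value.startswith("www."):
                self.suspect_hrefs.append(value)

page_url = "https://www.example.com/service-1/"  # placeholder URL
parser = LinkCollector()
parser.feed(requests.get(page_url, timeout=10).text)
for href in parser.suspect_hrefs:
    print("Scheme-less link found:", href)
```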
Can you redirect from a 410 server error? I see many 410s that should be redirected to an existing page.
We have 150,000 410 server errors. Many of them should be redirected to an existing URL. This is a result of a complete website redesign, including new navigation and a new web platform. I believe IT may have inadvertently marked many 404s as 410s. Can I fix this, or is a 410 error permanent? Thank you for your help.
Technical SEO | sxsoule
-
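On the 410 question above: a 410 is not permanent in the sense of being unfixable; it is just the status the server currently returns, so changing whatever rule emits it (or replacing it with a redirect) changes the response. A minimal .htaccess-style sketch, assuming Apache with mod_alias and using placeholder paths:

```
# A rule like this is one common way a URL ends up returning 410 Gone:
#   Redirect gone /old-product/
# Replacing it with a 301 sends visitors and link equity to the new URL instead:
Redirect 301 /old-product/ https://www.example.com/new-product/
```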
Mass 404 Checker?
Hi all, I'm currently looking after a collection of old newspaper sites that have had various redevelopments during their time. The problem is there are so many 404 pages all over the place, and the sites are bleeding link juice everywhere, so I'm looking for a tool where I can check a lot of URLs at once. For example, from an OSE report I did a random sampling of the target URLs and some of them 404 (eek!), but there are too many to check manually to know which ones are still live and which have 404'd or are redirecting. Is there a tool anyone uses for this, or can one of the SEOmoz tools do it? I've also asked a few people how to check this and they suggested Xenu, but Xenu won't work, as it only checks the current site navigation. Thanks in advance!
Technical SEO | thisisOllie
-
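For the bulk-checking question above, a short script is often enough when no tool fits. A rough Python sketch (not a Moz/SEOmoz tool): it reads one URL per line from a text file and prints each URL's status code and redirect target, if any; the third-party requests library is assumed to be installed.

```python
"""Bulk status-code checker: report which URLs 404, redirect, or error out."""
import sys
import requests

def check_urls(path):
    with open(path) as f:
        urls = [line.strip() for line in f if line.strip()]
    for url in urls:
        try:
            # HEAD is lighter than GET; allow_redirects=False so we see whether
            # the URL itself returns 301/302 rather than its final destination.
            resp = requests.head(url, allow_redirects=False, timeout=10)
            status = resp.status_code
            target = resp.headers.get("Location", "")
        except requests.RequestException as exc:
            status, target = "ERROR", str(exc)
        print(f"{status}\t{url}\t{target}")

if __name__ == "__main__":
    check_urls(sys.argv[1])
```

Run it as, say, `python check_urls.py urls.txt`, where urls.txt is the list of target URLs pulled from the OSE export.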
Duplicates on the page
Hello SEOmoz, I have one big question about a project. We have a page http://eb5info.com/eb5-attorneys and a lot of other similar pages, and we got a big list of errors and warnings saying that we have duplicate pages. But in reality not all of them are the same; they have small differences. For example, you select "State" in the left sidebar and you see a list on the right; the list in the right panel changes depending on what you select on the left. Yet in the report those pages are marked as duplicates. Can you give some advice on how to improve the quality of the pages and make the SEO better? Thanks, Igor
Technical SEO | usadvisors
-
PageRank and rankings down
Hi, I blog at Technostarry. Some 3 months back, during a PageRank update, my PageRank went down from 3 to 2. I don't know the reason behind this, and now my traffic and rankings are also down. I am not involved in any bad SEO practices; I don't copy and paste, and I write original content. I am very confused about why this has happened to my site. If someone could analyze my blog and point out my weak points, that would be great. I would appreciate any suggestions on getting my rankings, and my PageRank, back. Thanks.
Technical SEO | technotech
-
Secondary Pages Indexed over Primary Page
I have 4 pages for a single product. Each of the pages links to the main page for that product. Google is indexing the secondary pages above my preferred landing page. How do I fix this?
Technical SEO | Bucky
-
Does page speed affect what pages are in the index?
We have around 1.3m total pages; Google currently crawls on average 87k a day, and our average page load is 1.7 seconds. Out of those 1.3m pages (1.2m being "spun up"), Google has only indexed around 368k, and our SEO person is telling us that if we speed up the pages they will be crawled more and thus more of them will be indexed. I personally don't believe this. At 87k pages a day, Google can crawl our entire site in about 2 weeks, so they should have all of our pages in their DB by now; I think the pages are not indexed because they are poorly generated, and it has nothing to do with page speed. Am I correct? Would speeding up the pages make Google crawl them faster and thus get more pages indexed?
Technical SEO | upper2bits