What are the best ways to fix 404 errors?
-
I recently changed the URL of my main blog and now have about 100 404 errors. I set up a redirect from the old URL to the new one; however, I still have errors.
1. Should I do a 301 redirect from each old blog post URL to the new blog post URL?
2. Should I just delete the old blog post (URL) and rewrite the blog post?
I'm not concerned about links to the old posts, as most of them do not have many links.
-
Thanks Andy. I made this change: domain.com/blue-blog to domain.com/blog, using a RewriteRule. It seemed to work.
-
I changed the structure from domain.com/blue-blog to domain.com/blog, so I added a RewriteRule in the .htaccess file. That fixed a lot of things; however, there are still 100 or so 404s. They are old blog posts and not really that important.
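A directory rename like that is usually handled with a single rule in .htaccess. A minimal sketch, assuming the blog lives at the domain root and mod_rewrite is enabled:

```apache
# Permanently redirect everything under /blue-blog/ to the same path under /blog/
RewriteEngine On
RewriteRule ^blue-blog/(.*)$ /blog/$1 [R=301,L]
```

With [R=301,L] in place, domain.com/blue-blog/my-post would 301 to domain.com/blog/my-post, so any remaining 404s are likely posts whose slugs changed as well as their directory.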
-
As Tom says, 404 errors are not the end of the world.
If you are concerned, then as long as the relative URLs have remained the same and the root directory is all that has changed, a bulk 301 should work. If you've changed categories or something, though, it may not work so well as a single rule, and 100 individual redirects would be the way to go.
Something you should do, if you haven't already, is tell Google within Webmaster Tools that you've changed your URL (Configuration > Change of Address) - it also has a mini guide on the steps you should take, including registering your new domain in Webmaster Tools.
But again, as Tom says, if it's not destroying the user experience and isn't a huge annoyance for visitors, don't worry too much about it.
--
Just for your reference, a full URL redirect (i.e. changing, say, abc.com to abc.net - moving all directories and URLs in one go) would look like:
RedirectMatch 301 ^(.*)$ http://www.abc.net$1
-
Are we talking about a structural change (i.e. domain.com/blog to domain.com/myblog) or a TLD change (domain.com to domain2.com)? If you kept the same blog structure otherwise, I would write a .htaccess file to make sure you just blanket redirect all URLs. It's easy to do that way but not everyone has access to that.
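For the domain-change case, a blanket .htaccess redirect that preserves the path might look like this (a sketch only; domain.com and domain2.com are the placeholder names from the question):

```apache
# Send every request for the old host to the same path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain2.com/$1 [R=301,L]
```

The RewriteCond matters here: without it, the rule would also fire on the new domain and loop.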
I recommend 301s just because they avoid the sloppiness problem. I mean, you wrote the content for people to find, right? If they hit a 404, it just frustrates them. It doesn't matter whether or not you need the SEO; I like it when a 301 takes me where I really need to go. It shows someone cared enough to make sure I could get to what they had done. It's a pride-of-authorship thing.
-
Hi Nathan
If you're not concerned about passing the links/link equity of the old posts to a new page, or if you don't think there are any users visiting the URL directly, then I would simply leave the page as a 404 error.
404s are a natural part of the web and Google recognises this - check out this Webmaster blog post. 100 404s isn't an awful lot, so I wouldn't worry about them unless they're interrupting a user journey (which you'll be able to check in analytics).
If you really want to get rid of them, then a 301 would be the way to go in my opinion. 100 301s will not slow down your .htaccess file by any noticeable margin. But overall, I'd let the 404s be 404s.
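If you do go the one-redirect-per-post route, each line in .htaccess is just a simple mod_alias Redirect directive (the slugs below are hypothetical, for illustration):

```apache
# One redirect per old post URL, mapping old slug to new slug
Redirect 301 /blue-blog/old-post-slug /blog/new-post-slug
Redirect 301 /blue-blog/another-old-post /blog/another-new-post
```

Repeat one line for each of the ~100 old URLs; Apache reads these top to bottom, and a list this size has no measurable performance cost.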