Easy Fix for 404 Errors for a Newbie
-
Hey there,
According to Moz, I have two 404 errors at these links, which to my knowledge do not exist on my domain:
http://educateathletes.com/post/23804085842/educateathletes-ushl-gm-head-coach-jim
http://educateathletes.com/products
I'm really not sure what to do. The first is an old Tumblr blog post. The second is a page that was created on my site, but its URL was changed to
http://educateathletes.com/enroll
Any advice on eliminating these errors is appreciated.
Sean
-
Since you're using WordPress, Sean, I strongly recommend the Redirection plugin. Once it's installed and configured, it will catch things like the URL you changed and automatically redirect to the new URL you created. (You'll need to set this up by hand in the plugin for that particular URL, though, as you changed it before the plugin could catch it.)
In addition, it provides an easy interface to create redirects for situations like your other one where an incoming link needs to be easily repointed to a different page. It will even inform you right in your WP dashboard when a new 404 error has been discovered and allow you to easily fix/redirect it right there and then.
Far easier than having to dive into your .htaccess file every time you need to make a minor change.
It should be noted, though: if you have a huge number of redirects to create, that should be done in .htaccess, as it processes much faster and uses fewer server resources than a large redirect list in the plugin.
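For example, a batch of one-off redirects in .htaccess might look like the lines below. These paths are hypothetical, shown only to illustrate the pattern:

```apache
# One-to-one redirects (hypothetical old/new paths)
Redirect 301 /old-page /new-page
Redirect 301 /old-blog/some-post /blog/some-post

# A single regex rule can replace dozens of individual lines,
# e.g. moving an entire retired section to its new home
RedirectMatch 301 ^/old-blog/(.*)$ /blog/$1
```

Note that Redirect matches by URL-path prefix, while RedirectMatch lets you anchor a regular expression when you need an exact match.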
Paul
-
SEOmoz probably missed it because the inbound link is no longer active. Did you pull this error from Google Webmaster Tools? Check the date; it may be a very old error. Either way, implementing a 301 is your best bet.
-
You could do a 301 redirect in your .htaccess file to route to the page you want: Redirect 301 /oldpage.htm http://www.newpage.com/.
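Applied to Sean's two URLs, the entries might look something like this. This is a sketch that assumes an Apache server, and assumes /enroll and the homepage are the appropriate destinations:

```apache
# /products was renamed to /enroll
Redirect 301 /products /enroll

# The old Tumblr post has no obvious equivalent; pointing it at the
# homepage here is an assumption - any closely related page is better
Redirect 301 /post/23804085842/educateathletes-ushl-gm-head-coach-jim /
```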
Related Questions
-
Question about 404 Errors
About two months ago, we deleted some unnecessary pages on our website that were no longer relevant. However, Moz is still saying that these deleted pages are returning 404 errors when a crawl test is done. The page is no longer there, at least as far as I can see. What is the best solution for this? I have a page that is similar to the old page, so is it a good choice to just redirect the bad page to my good page? If so, what's the best way to do this? I found some useful information searching, but none of it truly pertained to me. I went around my site to make sure there were no old links that directed traffic to the nonexistent page, and there are none.
Technical SEO | Meier0 -
How can I fix this home page crawl error ?
My website shows this crawl error => 612 : Home page banned by error response for robots.txt. I also did not get any page data in my account for this website... I did get keyword rankings and traffic data, I am guessing from the analytics account. url = www.mississaugakids.com Not really sure what to do with this! Any help is greatly appreciated.
Technical SEO | jlane90 -
Pages appear fine in browser but 404 error when crawled?
I am working on an eCommerce website that has been written in WordPress with the shop pages in E commerce Plus PHP v6.2.7. All the shop product pages appear to work fine in a browser, but 404 errors are returned when the pages are crawled. WMT also returns a 404 error when ‘fetch as Google’ is used. Here is a typical page: http://www.flyingjacket.com/proddetail.php?prod=Hepburn-Jacket Why is this page returning a 404 error when crawled? Please help!
Technical SEO | Web-Incite0 -
GWT Error for RSS Feed
Hello there! I have a new RSS feed that I submitted to GWT. The feed validates no problemo on http://validator.w3.org/feed/ and also when I test the feed in GWT it comes back OK, finding all the content with "No errors found". I recently got an issue with GWT not being able to read the RSS feed, an error on line 697: "We were unable to read your Sitemap. It may contain an entry we are unable to recognize. Please validate your Sitemap before resubmitting." I am assuming this is an intermittent issue; possibly we had a server issue on the site last night, etc. I am checking with my developer this morning. Wanted to see if anyone else had this issue, if it resolved itself, etc. Thanks!
Technical SEO | CleverPhD0 -
Missing meta description on 404 page
Hi, My 404 page doesn't have a meta description. Is that an error? I ran a report and SEOmoz flagged it as a problem. Thanks!
Technical SEO | JohnHuynh0 -
Htaccess - multiple matches by error
Hi all, I stumbled upon an issue on my site. We have a video section: www.holdnyt.dk/video
The htaccess rule:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^video index.php?area=video [L,QSA]
The problem is that these URLs give the same content:
www.holdnyt.dk/anystring/video
www.holdnyt.dk/whatsoever/video
Anyone with a take on what's wrong with the htaccess line? -Rasmus
Technical SEO | rasmusbang0 -
Odd URL errors upon crawl
Hi, I see this in Google Webmasters, and am now also seeing it here...when a crawl is performed on my site, I get many 500 server error codes for URLs that I don't believe exist. It's as if it sees a normal URL but adds this to it: %3Cdiv%20id= It's like this for hundreds of URLs. Good URL that actually exists http://www.ffr-dsi.com/food-retailing/supplies/ URL that causes error and I have no idea why http://www.ffr-dsi.com/food-retailing/supplies/%3Cdiv%20id= Thanks!
Technical SEO | Matt10
Nginx 403 and 503 errors
I have a client with a website that is hosted on a shared webserver running on an Nginx server. When I started working on the website a few months ago I found the server was throwing 100s of 403s and 503s, and at one point googlebot couldn't access robots.txt. Needless to say this didn't help rankings! Now the web hosting company has partially resolved the errors by switching to a new server, and I'm now just seeing intermittent spikes in Webmaster Tools of 30 to 70 403 and 503 errors. My questions: Am I right in saying there should (pretty much) be no such errors (for pages that we make public and crawlable)? Having already asked the web hosting company to look into this, any advice on specifically what I should be asking them to look at on the server? If this doesn't work out, does anyone have a recommendation for a reliable web hosting company in the U.S. for a lead generation website with over 20,000 pages and currently 500 to 1000 visits per day? Thanks for the help Mozzers 🙂
Technical SEO | MatShepSEO0