Dealing with Not Found errors
-
Hi,
I have a problem with Google Webmaster Tools: it reports that I have many 404 Not Found errors.
I found that the broken links were coming from my own site, fixed them, and submitted a new sitemap to Google. Then I waited a few days, and Google still reports the same 404 Not Found errors as before. I don't know why. Please help!
-
That is why I 301 redirect all of my 404s, simply because Google has already indexed and archived them, as I mentioned. So you need to 301 redirect those links. I hope this helps. Sincerely, Thomas
-
I have marked them as fixed many times, and the errors still occur. When I check my sitemap, I do not see the not-found links that the Linked from tab says come from the sitemap.
-
Considering that your website has both indexing and archiving turned on, and Google has obviously indexed and archived it, I would strongly suggest you eliminate this as an obstacle going forward and simply 301 redirect the broken path:
http://www.vietnamvisacorp.com/tips/6.html
If you are using Nginx, as I think you are, you can implement the equivalent of an .htaccess redirect directly in your Nginx configuration:
location /tips/6.html {
    rewrite ^(.*)$ http://www.vietnamvisacorp.com/tips.html permanent;
}
I used this tool to generate that: http://winginx.com/htaccess - you can do the exact same thing with it, and it will help if you already understand how 301 redirects work in Apache.
There is also a converter here: http://www.anilcetin.com/convert-apache-htaccess-to-nginx/
If you want to be certain the code is right (which I would), use an Apache generator first, then put the result into the Nginx converter, and you will end up with a proper 301 redirect for your 404.
Simply use the tool below to create the redirect, then copy and paste it into one of the two tools I have given you above.
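For reference, here is roughly what the conversion should come out to. The Apache line is shown as a comment above the Nginx version, and return 301 is just a more compact way of writing a permanent redirect; treat this as a sketch and double-check it against your own configuration:
# Apache (.htaccess): Redirect 301 /tips/6.html http://www.vietnamvisacorp.com/tips.html
# Nginx equivalent: an exact-match location that answers with a permanent (301) redirect
location = /tips/6.html {
    return 301 http://www.vietnamvisacorp.com/tips.html;
}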
-
Hello John & Dennis,
If it is completely gone and there is no 404, you can check this by going to the Google Webmaster Tools panel and, instead of selecting ignore or anything like that, hitting Fetch as Googlebot. This will tell you whether the link is still a real link or not. You do not have to wait a set amount of time to find out whether a given link is in fact okay.
I would be hard-pressed to imagine that it is worth anything, meaning that it has backlinks pointing to it; however, if it does, you will want to use Open Site Explorer to check the link before ignoring it or marking it as fixed.
The rule I use is: if it shows up as a 404, I 301 redirect it, so you never lose backlinks. It is a little more complicated with Nginx, but in my opinion the improved speed and the ability to handle an incredible amount of traffic make it worth the extra complication.
What Dennis has stated is true if the link is worthless and, of course, does not exist anywhere on a webpage.
I hope this has been of help.
Thomas
-
Here is the information I retrieved regarding how your website is built:
https://builtwith.com/?http%3A%2F%2Fwww.vietnamvisacorp.com%2Ftips.html
http://www.vietnamvisacorp.com/tips.html
http://wiki.nginx.org/LikeApache-htaccess
So let's say you wanted to redirect it back to the original URL I have above.
To fix it in Apache:
# Permanent URL redirect - generated by www.rapidtables.com
Redirect 301 /tips/6.html http://www.vietnamvisacorp.com/tips.html
Now, to fix it in Nginx, use the configuration below, or run the Apache rule through the converter at http://www.anilcetin.com/convert-apache-htaccess-to-nginx/:
location /tips/6.html {
    rewrite ^(.*)$ http://www.vietnamvisacorp.com/tips.html permanent;
}
There is more background on converting .htaccess 301 redirects to Nginx here: http://www.vbseo.com/f77/converting-htaccess-301-redirect-nginx-49834/
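If it helps, this is roughly where that rule sits; the listen and server_name lines below are only placeholders, so keep whatever your existing server block already uses:
server {
    listen 80;                            # placeholder - keep your existing value
    server_name www.vietnamvisacorp.com;  # placeholder - keep your existing value
    # the 301 rule goes inside the server block, alongside your other location blocks
    location /tips/6.html {
        rewrite ^(.*)$ http://www.vietnamvisacorp.com/tips.html permanent;
    }
    # ... the rest of your existing configuration stays as it is ...
}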
-
Hi John
Mark it as fixed in Webmaster Tools.
After that, just leave it alone for a week or so. Once your site is recrawled, you'll see that it's not there anymore (as long as it's really fixed).
-
This is not a problem. You simply have to take the links you are finding as 404s and 301 redirect them to relevant pages on your website. What I mean is: do not simply point them all at the home page; spread them around if they have no direct replacement. I have noticed you are using Nginx along with Apache, where you might be using just Nginx. In order to make the 301 redirects correctly with Nginx, I am going to need to grab a couple of links for you. I will be right back.
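In the meantime, the general idea looks like this in Nginx: each dead URL gets its own 301 to the most relevant live page. The second block below is a made-up example just to show the pattern, so swap in your own URLs:
# send each dead URL to the most relevant live page, not everything to the home page
location = /tips/6.html {
    return 301 http://www.vietnamvisacorp.com/tips.html;
}
# hypothetical second dead URL - replace both paths with your own
location = /old-guide.html {
    return 301 http://www.vietnamvisacorp.com/guides.html;
}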
sincerely,
Thomas
-
For an example of this problem:
Google Webmaster Tools reports the Not Found link www.vietnamvisa...com/tips/6.html (this is the link from before the update),
and the Linked from tab shows http://www.vietnamvisacorp.com/tips.html, but that page does not include a link to /tips/6.html.