Rogerbot does not catch all existing 4XX Errors
-
Hi, I've noticed that after each new crawl, Rogerbot presents me with new 4XX errors. Why doesn't it report them all at once?
I have a small static site. Nine crawls ago it had ten 4XX errors, and I tried to fix them all.
On the next crawl Rogerbot still found five errors, so I assumed I hadn't fixed them all... but this has now happened many times, so before the latest crawl I double-checked that I had really fixed every single error. Today, although I genuinely corrected those five errors, Rogerbot dug up two "new" errors. So does Rogerbot not catch all the errors that have been on my site for many weeks?
Please see the screenshot showing how I was chasing the errors.
-
I understand.
I am not using a CMS and the site is not very big, so I wondered why Rogerbot did not find all the 404 errors the first time, since they have been there for many months.
Holger
-
Hey Holger,
Our crawler will catch as many errors as it can. It's possible that these errors were not present, or simply were not found, at the time of the crawl. I'm running a crawl test to see if there's any discrepancy between your current campaign crawl and mine, just to double-check.
In general, Kyle is correct that sometimes those errors just crop up, especially if you're using any sort of CMS.
I hope that helps. I'll update here after my crawl test is done.
Cheers,
Joel
-
Hi Holger,
4XX errors can be quite common depending on your site setup, so don't be surprised that Roger keeps returning errors for you to fix.
I would advise checking this data against Google Webmaster Tools' own crawl error data, which you can find under Health > Crawl Errors.
I hope that helps,
K
Related Questions
-
Rookie question re Moz Crawl errors after deleting a property from console.
Hi all, I stupidly removed the "http" URL of one of my websites a few days back (it is one of three, the other two being the https versions), then re-added it around a day later. While Google Search Console isn't reporting any errors, the Moz crawl is now going to town on it with one critical "4xx" issue, plus canonical and various other content issues that I addressed days previously. The last Moz crawl was performed an hour ago; the URL was deleted and re-added two days ago. I have resubmitted a sitemap. Will this smooth itself out, or shall I go and make changes? Many thanks in advance.
Moz Pro | UkPod
-
Error Code 902 & 403
Several thousand of these popped up on my crawl report, and the links appear to be searches, e.g. below: 902: http://thespacecollective.com/index.php?route=product/search&tag=nasa+ma-1+jacket%2F%2F%2F%2F%2F%2F%2F%2F%2F%2F%2F 403: http://thespacecollective.com/index.php?route=product/search&tag=periodic+table+tshirt%2F%2F%2F%2F%2F%2F%2F%2F%2F%2F%2F I don't want Moz, let alone Google, finding this kind of nonsensical link, but I don't exactly know what the problem is or how to fix it. Am I right in thinking these are pages people have searched for? Can anyone shed light on this, please?
Moz Pro | moon-boots
-
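One common way to keep crawlers out of parameter-driven search pages like the ones above is to disallow the search route in robots.txt. A sketch, assuming all of these URLs share the `route=product/search` parameter shown in the examples (well-behaved crawlers such as Googlebot and rogerbot match Disallow rules as URL prefixes, including the query string):

```text
User-agent: *
Disallow: /index.php?route=product/search
```

This only stops compliant crawlers from fetching the pages; it does not remove already-indexed URLs, and it does not explain where the tag values came from, so it is worth also checking whether something on the site is generating those links.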
Issues with Moz producing 404 Errors from sitemap.xml files recently.
My last campaign crawl produced over 4k 404 errors resulting from Moz not being able to read some of the URLs in our sitemap.xml file. This is the first time we've seen this error and we've been running campaigns for almost 2 months now -- no changes were made to the sitemap.xml file. The file isn't UTF-8 encoded, but rather Content-Type:text/xml; charset=iso-8859-1 (which is what Moveable Type uses). Just wondering if anyone has had a similar issue?
Moz Pro | BriceSMG
-
404 Errors generating in WP
Our crawl reports are generating back several 404 errors for pages with URLs that look like: /category/consulting/page/5/ The tag changes, the page number changes, but the result is always the same: a big glaring 404. Our sites are built on WordPress Multisite, and I am fairly certain this issue is on the WP end, but I can't figure out why it is generating pages out to infinity, essentially, from the tags and categories. It is worse on some sites than others, but is happening across the board (my initial concern was that it might be a theme issue, but that does not seem to be the case). If anyone has run into this issue and knows a fix, your insight would be greatly appreciated. Thanks!
Moz Pro | SIXSEO
-
Is there a way to get SEOmoz to not throw an error if I'm using the rel=canonical tag?
There are so many errors (~1500) that I can't find the pages with duplicate content among the ones that are correctly tagged.
Moz Pro | seospeedwagon
-
Crawl Errors from URL Parameter
Hello, I am having this issue within SEOmoz's Crawl Diagnosis report. There are a lot of crawl errors happening with pages associated with /login. I will see site.com/login?r=http://... and have several duplicate content issues associated with those URLs. Seeing this, I checked WMT to see if the Google crawler was showing this error as well. It wasn't. So what I ended up doing was going to the robots.txt and disallowing rogerbot. It looks like this: User-agent: rogerbot Disallow:/login However, SEOmoz has crawled again and it's still picking up on those URLs. Any ideas on how to fix this? Thanks!
Moz Pro | WrightIMC
-
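For reference, robots.txt directives go one per line, with a space after the colon. Written out, the block quoted above would normally look like this:

```text
User-agent: rogerbot
Disallow: /login
```

Note also that a robots.txt change only takes effect the next time a crawler reads the file, so URLs discovered before the change can still appear in a report for a crawl or two afterward.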
How can i locate the links on my site that are causing 404 errors?
In the 404 error report, there is no way to find which page on my site contains the broken link. How can I find them? Help! Matt
Moz Pro | seo4anyone
-
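One way to locate the source pages yourself is a small script that fetches a page, extracts its links, and reports which ones answer 404. A minimal sketch using only the Python standard library (the start URL would be your own page; this checks a single page, not a whole site):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen
from urllib.error import HTTPError


class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def extract_links(html, base_url):
    """Return absolute URLs for every link found in the HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]


def find_broken_links(page_url):
    """Fetch one page and return the links on it that answer 404."""
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    broken = []
    for link in extract_links(html, page_url):
        try:
            # HEAD avoids downloading the body just to read the status code.
            urlopen(Request(link, method="HEAD"))
        except HTTPError as err:
            if err.code == 404:
                broken.append(link)
    return broken
```

Running `find_broken_links` over each URL in your sitemap would then tell you, for every 404 in the report, which of your pages links to it.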
Broken Links and Duplicate Content Errors?
Hello everybody, I'm new to SEOmoz and I have a few quick questions regarding my error reports:
1. In the past, I have used IIS as a tool to uncover broken links, and it has revealed a large number of varying types of "broken links" on our sites. For example, some of them were links on my site that went to external sites that are no longer available; others were missing images in my CSS and JS files. According to my campaign in SEOmoz, however, my site has zero broken links (4XX). Can anyone tell me why the IIS errors don't show up in my SEOmoz report, and which of these two reports I should really be concerned about (for SEO purposes)?
2. Also in the "errors" section, I have many duplicate page title and duplicate page content errors. Many of these "duplicate" content reports are actually showing the same page more than once. For example, the report says that "http://www.cylc.org/" has the same content as "http://www.cylc.org/index.cfm" and that, of course, is because they are the same page. What is the best practice for handling these duplicate errors? Can anyone recommend an easy fix for this?
Moz Pro | EnvisionEMI
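For the duplicate reported between "http://www.cylc.org/" and "http://www.cylc.org/index.cfm", the usual fix is a rel=canonical link in the head of the page, telling crawlers which of the two URLs is the preferred one. A sketch, assuming the root URL is the version to keep:

```html
<!-- Served in the <head> of both / and /index.cfm -->
<link rel="canonical" href="http://www.cylc.org/" />
```

Since both URLs serve the same file here, adding the tag once in that file covers both; alternatively, a server-side 301 redirect from /index.cfm to / removes the duplicate URL entirely.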