Site Crawl Status code 430
-
Hello,
In the site crawl report we have a few pages that are status 430 - but that's not a valid HTTP status code. What does this mean / refer to?
https://en.wikipedia.org/wiki/List_of_HTTP_status_codes#4xx_Client_errors
If I visit the URL from the report I get a 404 response code, so is this a bug in the site crawl report?
Thanks,
Ian.
-
Which, of course, you can't do in Shopify.
Maybe we should just collectively get on Shopify to implement this by default.
-
It's all in this help document:
https://moz.com/help/moz-procedures/crawlers/rogerbot
"Crawl Delay To Slow Down Rogerbot
We want to crawl your site as fast as we can, so we can complete a crawl in good time, without causing issues for your human visitors.
If you want to slow rogerbot down, you can use the Crawl Delay directive. The following directive would only allow rogerbot to access your site once every 10 seconds:
User-agent: rogerbot
Crawl-delay: 10"
So you'd add that rule to your robots.txt file.
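If you want to sanity-check the rule before relying on it, Python's standard library can parse robots.txt content and report the crawl delay it would apply to a given user agent. A quick sketch, using just the directive quoted from the help doc above:

```python
from urllib.robotparser import RobotFileParser

# The rule from the Moz help doc, as it would appear in robots.txt.
robots_txt = """\
User-agent: rogerbot
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# crawl_delay() returns the delay for a matching user agent,
# or None if no entry applies.
print(rp.crawl_delay("rogerbot"))  # -> 10
print(rp.crawl_delay("some-other-bot"))  # -> None (no matching entry)
```

Note that this only confirms the file parses as intended; whether a given crawler actually honors Crawl-delay is up to that crawler (rogerbot does, per the help doc).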
-
This is happening to a client of mine too. Is there a way to set my regular Moz Pro account to crawl the site slower?
-
This is a common issue with Shopify hosted stores, see this post:
It seems to be related to crawling speed. If a bot crawls your site too fast, you'll get 430s.
It may also be related to the proposed 'additional' status code 430 documented here:
"430 Request Header Fields Too Large
This status code indicates that the server is unwilling to process the request because its header fields are too large. The request MAY be resubmitted after reducing the size of the request header fields."
I'd probably look at that Shopify thread and see if anything sounds familiar.
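Since the 430s look like rate limiting, one pragmatic workaround on the crawling side is to back off and retry when they appear. A minimal sketch of that idea; the function, the retryable-code set, and the backoff schedule are my own assumptions, not anything Moz or Shopify document:

```python
import time

# 430 is Shopify's unofficial rate-limit response; 429/503 are the
# standard "slow down" codes. This set is an assumption, not a spec.
RETRYABLE = {430, 429, 503}

def fetch_with_backoff(fetch, url, max_retries=4, base_delay=1.0,
                       sleep=time.sleep):
    """Call fetch(url) -> (status, body); on a rate-limit style status,
    wait with exponential backoff and retry, up to max_retries times."""
    for attempt in range(max_retries + 1):
        status, body = fetch(url)
        if status not in RETRYABLE:
            return status, body
        if attempt < max_retries:
            sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, 8s, ...
    return status, body
```

Here `fetch` is whatever HTTP call you're using, injected so the retry logic stays testable; in a real crawler you'd also want a per-host delay between successful requests, which is exactly what the Crawl-delay directive asks crawlers to do.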
-
@Angler - yeah, thought the same - but why not log it as a 403 in the report? The site is hosted on Shopify, so we don't get access to the logs, unfortunately.
Was wondering if it was related to rate limiting, as in a few cases it's a false positive and the page loads fine.
Have emailed Eli - thanks,
Best.
Ian.
-
Hey Ian,
Thanks for reaching out to us!
Would you be able to contact us at help@moz.com so that we can take a closer look at your Campaign?
Looking forward to hearing from you,
Eli