Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
429 Errors?
-
I have over 500,000 429 errors in webmaster tools. Do I need to be concerned about these errors?
-
I am getting the same 429 errors as well. The only change I have made is that I switched my GoDaddy account from Deluxe hosting to a Managed WordPress hosting account.
-
I highly doubt this error would have anything to do with that. I would also recommend cross-checking those rankings with a third-party tool like Authority Labs, or you can look at your average position in Google Webmaster Tools. Moz runs rankings once a week, and it can sometimes pick up a temporary fluctuation. So I'd confirm the ranking drop before deciding what to do next.
-
I have the same question. Also, these are the only errors on my site. Moz shows that the main keyword for the site just dropped from #1 to #21 on Google. Could this error have anything to do with that?
-
It sounds like the same issue wcbuckner describes - if it's a problem in any way, I would contact GoDaddy about it and see what they have to say.
-
This error shows up only in Moz Analytics. Everything is fine everywhere else.
-
I just got a lot of these on my Moz report, and I also host with GoDaddy. My question is: how do we know whether this also happens when Google crawls our site? I am trying to get the site ranked, and I hope this is not stopping it from ranking.
-
What exactly is happening, the same 429 errors? Does wcbuckner's response explain it for you?
-
I'm facing the same problem with GoDaddy. Can anybody say how to resolve it?
-
I'm having the same issue and just spoke with GoDaddy, who said it's not a concern. What's happening is that Moz's software is pinging the client's server too many times within a given time period, so GoDaddy's system temporarily blocks Moz's IP, which causes the error. According to this rep, they never block Google's services when they hit the server.
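For what it's worth, here is a rough sketch of what that kind of per-IP throttling might look like behind the scenes. The limit and window below are made-up numbers for illustration, not GoDaddy's actual settings:

import time
from collections import defaultdict, deque

MAX_REQUESTS = 120      # hypothetical: requests allowed per client IP...
WINDOW_SECONDS = 60     # ...within this rolling window

_recent = defaultdict(deque)  # client IP -> timestamps of its recent requests

def check_request(client_ip):
    """Return (status code, extra headers) for a new request from client_ip."""
    now = time.time()
    history = _recent[client_ip]

    # Drop timestamps that have fallen out of the rolling window.
    while history and now - history[0] > WINDOW_SECONDS:
        history.popleft()

    if len(history) >= MAX_REQUESTS:
        # Over the limit: answer 429 and say how long to wait (Retry-After).
        retry_after = int(WINDOW_SECONDS - (now - history[0])) + 1
        return 429, {"Retry-After": str(retry_after)}

    history.append(now)
    return 200, {}

# A crawler requesting faster than the limit starts seeing 429s:
for _ in range(125):
    status, headers = check_request("203.0.113.10")
print(status, headers)  # 429 plus a Retry-After header once the limit is hit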
-
Interesting. I had not come across a 429 error before either.
I crawled your site once with Screaming Frog at normal speed and got some 429 errors. Those pages are indexed and cached - so there does not seem to be a dire emergency.
I did a second, slower crawl with Screaming Frog and still got a few 429 errors, but not nearly as many. The thing is, even though pages are getting indexed and cached, some pages will throw the 429 error on one crawl and then maybe not on the next. So enough requests are getting through, but it would be better not to have the errors at all.
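If you want to see whether Googlebot itself is ever on the receiving end of those 429s (rather than just Moz or Screaming Frog), checking the raw access logs is more reliable than any third-party crawl. Here's a rough sketch, assuming an Apache-style combined log format and a placeholder log path - adjust both for your own host:

import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder - point this at your host's real access log
# In the combined log format the status code is the field right after the quoted request.
LINE_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3})')

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:  # only look at lines claiming to be Googlebot
            continue
        match = LINE_RE.search(line)
        if match and match.group("status") == "429":
            hits[match.group("path")] += 1

# URLs where Googlebot was rate-limited, most frequent first
for path, count in hits.most_common(20):
    print(count, path)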
From what I can tell, it seems this code is set at the server level - so perhaps you should contact your host to inquire about it. Are you on a normal hosting setup or are you going through something like WP Engine? The number of requests allowed needs to be increased. Or as Mike said, this could be an included API call that's causing it.
Hope that helps!
-Dan
-
I found this from the Internet Engineering Task Force (IETF):
"429 Too Many Requests
The 429 status code indicates that the user has sent too many requests in a given amount of time ("rate limiting").
The response representations SHOULD include details explaining the condition, and MAY include a Retry-After header indicating how long to wait before making a new request.
For example:
HTTP/1.1 429 Too Many Requests
Content-Type: text/html
Retry-After: 3600

<html>
   <head>
      <title>Too Many Requests</title>
   </head>
   <body>
      <h1>Too Many Requests</h1>
      <p>I only allow 50 requests per hour to this Web site per
         logged in user. Try again soon.</p>
   </body>
</html>
Note that this specification does not define how the origin server identifies the user, nor how it counts requests. For example, an origin server that is limiting request rates can do so based upon counts of requests on a per-resource basis, across the entire server, or even among a set of servers. Likewise, it might identify the user by its authentication credentials, or a stateful cookie.
Responses with the 429 status code MUST NOT be stored by a cache."
From a quick read, it looks like this error would be thrown when API requests are called too quickly... so... yeah?
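If it helps, here's a rough sketch of what a well-behaved client does with that Retry-After header. The URL is a placeholder, and it assumes Retry-After is given in seconds (the spec also allows an HTTP date):

import time
import requests  # third-party HTTP client: pip install requests

def fetch_politely(url, max_attempts=5, default_wait=30):
    """Fetch a URL, backing off whenever the server answers 429."""
    response = None
    for attempt in range(max_attempts):
        response = requests.get(url)
        if response.status_code != 429:
            return response
        if attempt == max_attempts - 1:
            break  # give up; still rate-limited
        # Honour Retry-After if the server sent one, otherwise fall back to a default delay.
        wait = int(response.headers.get("Retry-After", default_wait))
        print("Got 429, waiting %d seconds before retrying" % wait)
        time.sleep(wait)
    return response

# Placeholder URL:
# fetch_politely("https://www.example.com/some-page")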
Sorry I can't be any more helpful.
Mike
-
Here is a screenshot of webmaster errors: http://prntscr.com/zc5p2.
-
Hmmm...
I have never heard of a 429 error. And that error isn't listed by Google Webmaster Tools or W3.org either.
If you mean a 409 error, that means Conflict, "The server encountered a conflict fulfilling the request. The server must include information about the conflict in the response. The server might return this code in response to a PUT request that conflicts with an earlier request, along with a list of differences between the requests."
If you do mean 429, can you provide a screenshot of it?
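Besides a screenshot, another quick check is to request one of the flagged URLs directly and see what status code actually comes back. A minimal sketch with a placeholder URL:

import requests  # pip install requests

# Placeholder - substitute one of the URLs Webmaster Tools is flagging.
response = requests.get("https://www.example.com/some-flagged-page")
print(response.status_code)                   # e.g. 200, 409, or 429
print(response.headers.get("Retry-After"))    # sometimes sent along with a 429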
Thanks,
Mike