429 Errors?
-
I have over 500,000 429 errors in webmaster tools. Do I need to be concerned about these errors?
-
I am getting the same 429 errors. The only change I have made is that I switched my account on GoDaddy from Deluxe hosting to a managed WordPress hosting account.
-
I highly doubt this error would have anything to do with that. I would also recommend cross-checking those rankings with a third-party tool like Authority Labs - or you can look at your average position in Google Webmaster Tools. Moz runs rankings once a week, and it can sometimes pick up a temporary fluctuation. So I'd confirm the ranking drop before deciding what to do next.
-
I have the same question. Also, these are the only errors on my site. Moz shows that the main keyword for the site just dropped from #1 to #21 on Google. Does this error have anything to do with it?
-
Sounds like the same issue wcbuckner describes. If it's a problem in any way, I would contact GoDaddy about it and see what they have to say.
-
This error shows up only in Moz Analytics. Everything is okay everywhere except here.
-
I just got a lot of these on my Moz report, and I also host with GoDaddy. My question is: how do we know whether this also happens when Google crawls our site? I am trying to get my pages ranked, and I hope this is not holding my site back.
-
What exactly is happening, the same 429 errors? Does wcbuckner's response explain it for you?
-
I'm facing the same problem with GoDaddy. Can anybody say how to resolve it, please?
-
Having the same issue. I just spoke with GoDaddy, who said it's not a concern: what's happening is that Moz's software is pinging the client's server too many times within a given time period, so GoDaddy's system temporarily blocks Moz's IP, which causes the error. They do not ever, according to this rep, block Google's services that hit the server.
-
Interesting. I had not come across a 429 error before either.
I crawled your site once with Screaming Frog at normal speed and got some 429 errors. Those pages are indexed and cached - so there does not seem to be a dire emergency.
I did a second crawl, slower, with Screaming Frog - and still got a few 429 errors but not nearly as many. Thing is though, even though pages are getting indexed and cached, some pages will throw the 429 error on some crawls, and then maybe not the next crawl. So it's enough to get through, but would be better to not have them.
From what I can tell, it seems this code is set at the server level - so perhaps you should contact your host to inquire about it. Are you on a normal hosting setup or are you going through something like WP Engine? The number of requests allowed needs to be increased. Or as Mike said, this could be an included API call that's causing it.
Hope that helps!
-Dan
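Since the limit is typically set at the server level, here is a toy sketch of the kind of per-client counting a host might use to decide when to start answering 429. This is purely illustrative (the class name and numbers are mine, not GoDaddy's actual setup), but it shows why a fast crawl trips the limit while a slower one mostly slips under it:

```python
import time

class TokenBucket:
    """Toy rate limiter: the bucket refills `rate` tokens per second up
    to `capacity`; each request spends one token, and an empty bucket is
    the moment a server would answer 429 Too Many Requests."""

    def __init__(self, rate, capacity, clock=time.monotonic):
        self.rate = float(rate)
        self.capacity = float(capacity)
        self.tokens = float(capacity)
        self.clock = clock
        self.last = clock()

    def allow(self):
        now = self.clock()
        # Refill in proportion to elapsed time, never past capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True   # serve the request normally
        return False      # respond 429 instead

# A burst of 7 requests against a 5-token bucket (clock frozen for
# determinism): the first 5 pass, the last 2 would get a 429.
bucket = TokenBucket(rate=1, capacity=5, clock=lambda: 0.0)
print([bucket.allow() for _ in range(7)])
# [True, True, True, True, True, False, False]
```

A slow crawl gives the bucket time to refill between requests, which matches what I saw with the second, slower Screaming Frog pass.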
-
I found this from the Internet Engineering Task Force (IETF):
"429 Too Many Requests
The 429 status code indicates that the user has sent too many requests in a given amount of time ("rate limiting").
The response representations SHOULD include details explaining the condition, and MAY include a Retry-After header indicating how long to wait before making a new request.
For example:
HTTP/1.1 429 Too Many Requests
Content-Type: text/html
Retry-After: 3600

<html>
  <head><title>Too Many Requests</title></head>
  <body>
    <h1>Too Many Requests</h1>
    <p>I only allow 50 requests per hour to this Web site per logged in user. Try again soon.</p>
  </body>
</html>
Note that this specification does not define how the origin server identifies the user, nor how it counts requests. For example, an origin server that is limiting request rates can do so based upon counts of requests on a per-resource basis, across the entire server, or even among a set of servers. Likewise, it might identify the user by its authentication credentials, or a stateful cookie.
Responses with the 429 status code MUST NOT be stored by a cache."
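That Retry-After header is the part a polite client can act on: per the quoted spec it may be either an integer number of seconds or an HTTP-date. A rough sketch of interpreting it, assuming a standalone helper (the function name is mine, not from any particular library):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def retry_after_seconds(header_value, now=None):
    """How long to wait before retrying after a 429.

    Retry-After can be either an integer number of seconds or an
    HTTP-date; handle both, and never return a negative wait for a
    date that is already in the past.
    """
    now = now or datetime.now(timezone.utc)
    try:
        return max(0, int(header_value))
    except ValueError:
        when = parsedate_to_datetime(header_value)
        return max(0.0, (when - now).total_seconds())

print(retry_after_seconds("3600"))  # 3600
```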
From doing a quick read, it looks like this error would be thrown when API requests are called too quickly... so... yeah?
Sorry I can't be any more helpful.
Mike
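If requests being made too quickly is the cause, the usual client-side fix is to retry with exponential backoff rather than hammering the server. A generic sketch, not tied to any real HTTP library (`fetch` here is any callable returning an object with a `status` attribute):

```python
import time

def fetch_with_backoff(fetch, url, max_tries=5, base_delay=1.0, sleep=time.sleep):
    """Retry `fetch(url)` whenever it comes back 429, doubling the wait
    each time. Illustrative only: `fetch` and `sleep` are injected so
    the logic is not tied to a particular HTTP library."""
    delay = base_delay
    for _ in range(max_tries):
        response = fetch(url)
        if response.status != 429:
            return response
        sleep(delay)   # back off before the next attempt
        delay *= 2     # 1s, 2s, 4s, ...
    raise RuntimeError(f"still rate-limited after {max_tries} tries")
```

A crawler that does this (or simply honors Retry-After) should stop seeing most of these errors.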
-
Here is a screenshot of webmaster errors: http://prntscr.com/zc5p2.
-
Hmmm...
I have never heard of a 429 error. And that error isn't listed by Google Webmaster Tools or W3.org either.
If you mean a 409 error, that means Conflict, "The server encountered a conflict fulfilling the request. The server must include information about the conflict in the response. The server might return this code in response to a PUT request that conflicts with an earlier request, along with a list of differences between the requests."
If you do mean 429, can you provide a screenshot of it?
Thanks,
Mike