Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
What's the best way to eliminate "429 : Received HTTP status 429" errors?
-
My company website is built on WordPress. It receives very few crawl errors, but it does regularly receive a few (typically 1-2 per crawl) "429 : Received HTTP status 429" errors through Moz.
Based on my research, my understanding is that my server is essentially telling Moz to cool it with the requests. That means it could be doing the same for search engines' bots and even visitors, right? This creates two questions for me, which I would greatly appreciate your help with:
-
Are "429 : Received HTTP status 429" errors harmful for my SEO? I imagine the answer is "yes" because Moz flags them as high priority issues in my crawl report.
-
What can I do to eliminate "429 : Received HTTP status 429" errors?
Any insight you can offer is greatly appreciated!
Thanks,
Ryan -
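One quick way to sanity-check whether the server is throttling ordinary traffic (and not just Moz's crawler) is to request the site yourself and watch for 429s. Below is a minimal Python sketch using the `requests` library; the URL, burst size, and pacing are placeholder assumptions, not what Moz's crawler actually does:

```python
import time
import requests

# Placeholder: replace with a page on your own site.
URL = "https://www.example.com/"

# Send a short burst of requests and watch for 429 responses.
# The burst size and pacing here are illustrative only.
for i in range(20):
    resp = requests.get(URL, headers={"User-Agent": "rate-limit-check/0.1"})
    if resp.status_code == 429:
        # Well-behaved servers include Retry-After to say how long to back off.
        retry_after = resp.headers.get("Retry-After", "not provided")
        print(f"Request {i}: got 429, Retry-After: {retry_after}")
    else:
        print(f"Request {i}: {resp.status_code}")
    time.sleep(0.1)  # tighten or loosen this to probe the server's threshold
```

If a modest, browser-like request rate already triggers 429s, the host's limit is tight enough to hit real visitors and search engine bots; if only rapid bursts do, it is likely just throttling aggressive crawlers.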
-
I have a customer who is using GoDaddy website hosting (at least according to BuiltWith) and I'm experiencing this same issue.
Any updates on this experiment from user rsigg? I'd love to know if I can remove this from my customer's robots.txt file...
FWIW, Netrepid is a hosting provider for colocation, infrastructure and applications (website hosting being considered an application), and we would never force a crawl delay on a WordPress install!
Not hating on the other hosting service providers... #justsayin
-
I am also on the same hosting and they have not been able to help with the 429. I have now started getting 429 errors when I attempt to log in. Definitely something wrong with the premium WordPress hosting.
-
Interesting. I look forward to hearing your results, as my robots.txt file is also set to:
Crawl-delay: 1
-
We host on Media Temple's Premium WordPress hosting (which I do not recommend, but that's another post for another place), and the techs there told me that it could be an issue with the robots.txt file:
"The issue may be with the settings in the robots.txt file. It looks fine to me but the "Crawl-delay" line might be causing issues. I understand. For the most part, crawlers tend to use robots.txt to determine how to crawl your site, so you may want to see if Moz requires some special settings in there to work correctly."
Ours is set to:
Crawl-delay: 1
I haven't tried changing these values yet in our file, but may experiment with this very soon. If I get results, I'll post back here as well as start a new forum thread.
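On the Crawl-delay question: you can verify what each crawler is actually being told by parsing the live robots.txt. Here is a small sketch using Python's standard `urllib.robotparser` module; the site URL is a placeholder, and rogerbot is Moz's crawler user-agent:

```python
from urllib import robotparser

# Placeholder: point this at your own site's robots.txt.
rp = robotparser.RobotFileParser("https://www.example.com/robots.txt")
rp.read()

# rogerbot is Moz's crawler; the others are listed for comparison.
for agent in ("rogerbot", "Googlebot", "*"):
    print(agent, "crawl delay:", rp.crawl_delay(agent))
    print(agent, "can fetch /:", rp.can_fetch(agent, "https://www.example.com/"))
```

Bear in mind that Crawl-delay only asks crawlers to slow down; the 429 responses themselves come from server-side rate limiting, so editing robots.txt alone may not make them go away.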
-
Chase,
They ran a bunch of internal diagnostic tools on my site, and were unable to replicate the 429 errors. They ended up telling me exactly what they told you. I haven't noticed any issues with my site's rankings, or any flags in Webmaster Tools, so it looks like they are right so far. I just hate logging into Moz and seeing all those crawl errors!
-
What'd they say, Ryan? Having the same issue and just contacted GoDaddy, who told me that basically Moz's software is pinging my client's server too frequently, so GoDaddy is temporarily blocking their IP. They said it's not a concern, though, as they would never block Google from pinging/indexing the site.
-
Many thanks - I will contact them now!
-
Contact your host and let them know about the errors. More than likely they have mod_security enabled to limit request rates. Ask them to raise the limit, and explain that you are getting 429 errors from crawlers and you do not want them.
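If the host will not raise the limit, the other half of the picture is how clients are expected to behave: a well-behaved crawler should honour the Retry-After header and back off before retrying. A rough sketch of that pattern follows; the function name, retry counts, and URL are illustrative, not part of any Moz or hosting tooling:

```python
import time
import requests

def fetch_with_backoff(url, max_retries=5):
    """Fetch a URL, backing off whenever the server answers 429."""
    delay = 1.0
    for attempt in range(max_retries):
        resp = requests.get(url)
        if resp.status_code != 429:
            return resp
        # Prefer the server's Retry-After hint; otherwise back off exponentially.
        retry_after = resp.headers.get("Retry-After")
        wait = float(retry_after) if retry_after and retry_after.isdigit() else delay
        time.sleep(wait)
        delay *= 2
    return resp

# Example: probe your own page and see whether the limit clears after a pause.
print(fetch_with_backoff("https://www.example.com/").status_code)
```

Seeing the request eventually succeed after a pause is a good sign the 429s are ordinary rate limiting rather than an outright block on the crawler's IP.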