What's the best way to eliminate "429 : Received HTTP status 429" errors?
-
My company website is built on WordPress. It receives very few crawl errors, but it does regularly receive a few (typically 1-2 per crawl) "429 : Received HTTP status 429" errors through Moz.
Based on my research, my understanding is that my server is essentially telling Moz to cool it with the requests. That means it could be doing the same for search engines' bots and even visitors, right? This creates two questions for me, which I would greatly appreciate your help with:
-
Are "429 : Received HTTP status 429" errors harmful for my SEO? I imagine the answer is "yes" because Moz flags them as high priority issues in my crawl report.
-
What can I do to eliminate "429 : Received HTTP status 429" errors?
Any insight you can offer is greatly appreciated!
Thanks,
Ryan -
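For background, a 429 is the standard "Too Many Requests" rate-limit response, and servers often include a Retry-After header telling the client how long to back off, either as a number of seconds or as an HTTP-date. A minimal Python sketch (the function name is illustrative) of parsing that header the way a polite crawler would:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def retry_after_seconds(value, now=None):
    """Parse a Retry-After header value: either delta-seconds ("120")
    or an HTTP-date ("Wed, 21 Oct 2015 07:28:00 GMT")."""
    try:
        return float(value)  # delta-seconds form
    except ValueError:
        when = parsedate_to_datetime(value)  # HTTP-date form
        if now is None:
            now = datetime.now(timezone.utc)
        return max(0.0, (when - now).total_seconds())

# A polite client sleeps for this long before retrying:
print(retry_after_seconds("120"))  # → 120.0
```

Whether a given crawler honors Retry-After is up to that crawler; the header only tells well-behaved clients what the server wants.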
-
I have a customer that is using GoDaddy website hosting (at least according to BuiltWith) and I'm experiencing this same issue.
Any updates on this experiment from user rsigg? I'd love to know if I can remove this from my customer's robots file...
FWIW, Netrepid is a hosting provider for colocation, infrastructure, and applications (website hosting being considered an application), and we would never force a crawl delay on a WordPress install!
Not hating on the other hosting service providers... #justsayin
-
I am also on the same hosting and they have not been able to help with the 429. I have now started getting 429 errors when I attempt to log in. Definitely something wrong with WP premium hosting.
-
Interesting. I look forward to hearing your results, as my robots.txt file is also set to:
Crawl-delay: 1
-
We host on Media Temple's Premium WordPress hosting (which I do not recommend, but that's another post for another place), and the techs there told me that it could be an issue with the robots.txt file:
"The issue may be with the settings in the robots.txt file. It looks fine to me, but the "Crawl-delay" line might be causing issues. For the most part, crawlers tend to use robots.txt to determine how to crawl your site, so you may want to see if Moz requires some special settings in there to work correctly."
Ours is set to:
Crawl-delay: 1
I haven't tried changing these values yet in our file, but may experiment with this very soon. If I get results, I'll post back here as well as start a new forum thread.
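For anyone running the same experiment: Crawl-delay can be scoped to a specific user agent instead of applied globally, so one variant worth testing is dropping the blanket delay and keeping it only for the bots you actually want slowed. A sketch ("SomeBot" is a placeholder; support for Crawl-delay varies by crawler, and Google ignores the directive entirely):

```
# Allow everyone, no global crawl delay
User-agent: *
Disallow:

# Keep a delay only for a specific bot, if desired
User-agent: SomeBot
Crawl-delay: 10
```

Note that whether this changes the host's 429 behavior is untested here; robots.txt only instructs crawlers, while the 429s come from server-side rate limiting.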
-
Chase,
They ran a bunch of internal diagnostic tools on my site, and were unable to replicate the 429 errors. They ended up telling me exactly what they told you. I haven't noticed any issues with my site's rankings, or any flags in Webmaster Tools, so it looks like they are right so far. I just hate logging into Moz and seeing all those crawl errors!
-
What'd they say, Ryan? Having the same issue and just contacted GoDaddy, who told me that basically Moz's software is pinging my client's server too frequently, so GoDaddy is temporarily blocking their IP. They said it's not a concern, though, as they would never block Google from pinging/indexing the site.
-
Many thanks - I will contact them now!
-
Contact your host and let them know about the errors. More than likely they have mod_security enabled to limit request rates. Ask them to raise the limit, explaining that crawlers are getting 429 errors and you don't want them blocked.
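If the host confirms mod_security is the culprit and you have config access (on managed hosting you usually don't, which is why asking the host is the right move), whitelisting the crawler's user agent is one option. A sketch in ModSecurity v2 syntax — the rule ID is arbitrary, and "rogerbot" (Moz's crawler) is the assumed user agent:

```apache
# Hypothetical example: skip ModSecurity processing for Moz's crawler.
# Rule ID 1000001 is arbitrary; pick one outside your existing ranges.
SecRule REQUEST_HEADERS:User-Agent "@contains rogerbot" \
    "id:1000001,phase:1,pass,nolog,ctl:ruleEngine=Off"
```

Matching on User-Agent is spoofable, so some hosts prefer to whitelist the crawler's published IP ranges instead.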