Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Unsolved: The Moz.com bot is overloading my server
-
How can I solve it?
-
maybe crawl delay will help.
-
@paulavervo
Hi,
We do! The best way to chat with us is via our contact form or direct email. We also have chat within Moz Pro.
Please contact us via help@moz.com or https://moz.com/help/contact
We will be happy to help.
Cheers,
Kerry.
-
very nice brother i like it very good keep it up !
-
very nice !
-
Does the Moz team even monitor this forum?
-
If the Moz.com bot is overloading your server, there are several steps you can take to manage and mitigate the issue. First, you can adjust the crawl rate in your robots.txt file by specifying a crawl delay for the Moz bots: under User-agent: rogerbot and User-agent: dotbot, add Crawl-delay: 10 to make the bots wait 10 seconds between requests. If this does not suffice, you can temporarily block the bots by disallowing them in your robots.txt file.
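For illustration, a minimal robots.txt sketch along those lines might look as follows; the 10-second delay is only an example value to tune for your server, and the commented-out lines show the temporary block:

    User-agent: rogerbot
    User-agent: dotbot
    Crawl-delay: 10

    # Temporary block instead of a delay (uncomment if needed):
    # User-agent: rogerbot
    # Disallow: /
    # User-agent: dotbot
    # Disallow: /

Rogerbot and dotbot are the user agent names Moz uses for its crawlers, so these groups only affect Moz's crawling and leave other search engines untouched.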
Additionally, it's a good idea to contact Moz's support team to explain the issue, as they may be able to adjust the crawl rate for your site. Implementing server-side rate limiting is another effective strategy: on Apache servers you can add rules to your .htaccess file that return a 429 Too Many Requests status code to the Moz bots, and on Nginx servers you can set up rate limiting in your configuration file to control the number of requests per second from a single user agent or IP address (see the sketches below). Monitoring your server's performance and log files can help you identify specific patterns or peak times so you can fine-tune these settings. Finally, a Content Delivery Network (CDN) can help distribute the load by caching content and serving it from multiple locations, reducing the direct impact crawlers have on your server. Taken together, these steps let you manage the load from the Moz.com bot while keeping your server stable and responsive.
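As a rough illustration only (these are generic Apache mod_rewrite and Nginx limit_req sketches, not configuration Moz provides; the zone name, rate, and burst values are assumptions to tune for your own traffic), the server-side measures described above could look something like this.

Apache (.htaccess), answering Moz's crawlers with a 429:

    <IfModule mod_rewrite.c>
        RewriteEngine On
        # Match Moz's crawlers by user agent, case-insensitively
        RewriteCond %{HTTP_USER_AGENT} (rogerbot|dotbot) [NC]
        # Stop rewriting and answer with 429 Too Many Requests
        RewriteRule .* - [R=429,L]
    </IfModule>

Nginx (inside the http block), throttling those user agents to roughly one request per second:

    # Use the client IP as the limiting key only for Moz's crawlers;
    # an empty key means other visitors are not rate limited.
    map $http_user_agent $moz_bot {
        default              "";
        ~*(rogerbot|dotbot)  $binary_remote_addr;
    }

    limit_req_zone $moz_bot zone=mozbots:10m rate=1r/s;

    server {
        location / {
            limit_req zone=mozbots burst=5 nodelay;
            limit_req_status 429;
        }
    }

Note that the Apache rule refuses every matching request with a 429 rather than throttling it, so the robots.txt crawl delay above is usually the gentler first step.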
Related Questions
-
Rogerbot directives in robots.txt
I feel like I spend a lot of time marking false positives in my reports to ignore. Can I prevent Rogerbot from crawling pages I don't care about with robots.txt directives? For example, I have some page types with meta noindex and it reports these to me. Theoretically, I can block Rogerbot from these with a robots.txt directive and not have to deal with the false positives.
Reporting & Analytics | | awilliams_kingston0 -
Robots.txt file issues on Shopify server
We have repeated issues with one of our ecommerce sites not being crawled. We receive the following message: Our crawler was not able to access the robots.txt file on your site. This often occurs because of a server error from the robots.txt. Although this may have been caused by a temporary outage, we recommend making sure your robots.txt file is accessible and that your network and server are working correctly. Typically errors like this should be investigated and fixed by the site webmaster. Read our troubleshooting guide. Are you aware of an issue with robots.txt on the Shopify servers? It is happening at least twice a month so it is quite an issue.
Moz Pro | | A_Q0 -
What's the best way to search keywords for YouTube using Moz Keyword Explorer?
I want to optimize my YouTube channel using identified keywords, but I'm concerned that the keywords I'm identifying work well for SERPs but might not match how people search on YouTube. How do I distinguish my keywords to be targeted for YouTube?
Moz Pro | | Dustless0 -
Moz vs SEMrush - Keyword Difficulty and Keywords with Low Competition
Hi, My question is very focused on these two tools. Is it correct to understand that Moz tells you keyword difficulty but not which keywords are easy to compete for, while SEMrush tells you which keywords are easy to compete for but not which are difficult? I mean, one (Moz) misses one part and the other (SEMrush) misses the other part. Hope someone will enlighten me on this point. Best
Moz Pro | | Sequelmed0 -
Screaming Frog, Xenu, Moz giving wrong results
Hello guys and gals, This is a very odd one. I have a client's website and most of the crawlers I'm using are giving me weird/wrong results. For now let's focus on Screaming Frog: when I crawl the site it will list e.g. meta titles as missing (not all of them though), yet when I check the site the title is not missing, and Google seems to be indexing the site fine. The robots.txt is not affecting the site (I've also tried changing the user agent). The other odd thing is SF gives a 200 code but as a status tells me "connection refused", even though it's giving me data. I'm unable to share the client's site; has anyone else seen this very odd issue? And solutions for it? Many thanks in advance for any help,
Moz Pro | | GPainter0 -
What does MozTrust mean?
Hi guys, The Moz toolbar shows me that the 'mT' of my website's index page is 7.07. Is that good?
Moz Pro | | vahidafshari450 -
SEOmoz Spider/Bot Details
Hi All, Our website identifies a list of search engine spiders so that it does not show them session IDs when they come to crawl, preventing the search engines from thinking there is duplicate content all over the place. The SEOmoz crawler has brought over 20k crawl errors onto the dashboard due to session IDs. Could someone please give the details for the SEOmoz bot so that we can add it to the list on the website, so that when it does come to crawl it won't be shown session IDs and generate all these crawl errors. Thanks
Moz Pro | | blagger1