How to get rogerbot whitelisted by application firewalls.
-
We have recently installed an application firewall that is blocking rogerbot from crawling our site. Our IT department has asked for an IP address or range of IP addresses to add to its list of acceptable crawlers. If rogerbot has a dynamic IP address, how do we get him added to our whitelist? The product IT is using is F5's Application Security Manager.
-
Hi Joel,
Just wondering if you guys came up with a fix for this one? I have the same issue myself...
Thanks!
-
We have a similar issue: we block the Amazon cloud IP space but want to allow rogerbot. Our firewall finds it much easier to filter by IP range than by user-agent (rogerbot).
Can you please tell us more about the solution you suggested?
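In the meantime, here is a minimal Python sketch (not an official Moz or F5 answer) of how a filter could block an IP range while still exempting requests whose user-agent contains "rogerbot". The blocked ranges below are placeholders, and a user-agent check can of course be spoofed.

```python
import ipaddress

# Hypothetical blocked ranges; real AWS ranges are published by Amazon
# and change over time, so treat these as placeholders.
BLOCKED_RANGES = [
    ipaddress.ip_network("50.16.0.0/14"),
    ipaddress.ip_network("184.72.0.0/15"),
]

def should_block(client_ip: str, user_agent: str) -> bool:
    """Block requests from the listed IP ranges unless the user-agent
    identifies itself as rogerbot (note: user-agents can be spoofed)."""
    if "rogerbot" in user_agent.lower():
        return False  # let the crawler through
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in BLOCKED_RANGES)

print(should_block("50.17.63.219", "rogerbot/1.0"))  # False - allowed
print(should_block("50.17.63.219", "Mozilla/5.0"))   # True  - blocked
```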
-
Hi, our IP addresses change all the time. We do have an option that may resolve your issue, though. I'm going to create a ticket, as I will need some personal information regarding your account.
Thanks,
Joel
-
Hi kgoss -
I'm not an SEOmoz employee, but just FYI, here are the IP addresses rogerbot used when accessing my site since the start of 2012:
50.17.63.219
184.72.211.58
50.19.77.29
174.129.177.94
184.73.93.243
107.20.64.186
23.22.26.208
107.22.19.89
50.19.188.211
50.19.195.34
107.22.107.114
107.22.137.134
216.244.72.3
75.101.205.127
67.202.42.32
216.176.191.232
23.20.247.207
216.244.72.9
23.22.48.203
216.176.191.234
184.73.13.203
107.20.11.69
50.17.96.67
184.73.137.220
184.73.22.42
184.72.175.148
67.202.13.192
23.21.33.19
107.22.0.173
107.22.157.76
50.16.164.143
50.16.43.91
107.21.143.222
216.176.191.201
107.20.91.74
50.16.88.111
204.236.213.120
184.73.118.141
50.17.38.180
107.21.150.218
216.244.72.10
216.244.72.11
216.244.72.12
23.22.53.56
Hope that helps.
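If you'd rather build a list like this from your own logs than rely on someone else's snapshot, here is a minimal sketch that pulls the distinct client IPs of requests identifying themselves as rogerbot out of an Apache combined-format access log. The log path is a placeholder, and it assumes rogerbot includes "rogerbot" in its user-agent string.

```python
# Minimal sketch: collect the distinct client IPs of requests whose
# user-agent contains "rogerbot" from an Apache combined-format log.
# The path below is a placeholder; adjust it for your server.
import re

LOG_PATH = "/var/log/apache2/access.log"
ip_pattern = re.compile(r"^(\S+) ")  # client IP is the first field

rogerbot_ips = set()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "rogerbot" in line.lower():
            match = ip_pattern.match(line)
            if match:
                rogerbot_ips.add(match.group(1))

for ip in sorted(rogerbot_ips):
    print(ip)
```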
Related Questions
-
How do I get the crawler going again?
The initial crawl only hit one page. I set up another campaign for another site and it crawled 260 pages. How can I get the crawler started up again, or do I really have to wait a week?
Moz Pro | | martJ0 -
Rogerbot getting cheeky?
Hi SEOmoz, From time to time my server crashes during Rogerbot's crawling escapades, even though I have a robots.txt file with a crawl-delay of 10, now just increased to 20. I looked at the Apache log and noticed Roger hitting me from 4 different addresses (216.244.72.3, 72.11, 72.12 and 216.176.191.201), and most times, whilst the hits from each separate address were 10 seconds apart, ALL 4 addresses would hit 4 different pages simultaneously (example 2). At other times, it wasn't respecting robots.txt at all (see example 1 below). I wouldn't call this situation "respecting the crawl-delay" entry in robots.txt, as other questions answered here by you have stated. 4 simultaneous page requests within 1 second from Rogerbot is not what should be happening, IMHO.
Example 1:
216.244.72.12 - - [05/Sep/2012:15:54:27 +1000] "GET /store/product-info.php?mypage1.html" 200 77813
216.244.72.12 - - [05/Sep/2012:15:54:27 +1000] "GET /store/product-info.php?mypage2.html HTTP/1.1" 200 74058
216.244.72.12 - - [05/Sep/2012:15:54:28 +1000] "GET /store/product-info.php?mypage3.html HTTP/1.1" 200 69772
216.244.72.12 - - [05/Sep/2012:15:54:37 +1000] "GET /store/product-info.php?mypage4.html HTTP/1.1" 200 82441
Example 2:
216.244.72.12 - - [05/Sep/2012:15:46:15 +1000] "GET /store/mypage1.html HTTP/1.1" 200 70209
216.244.72.11 - - [05/Sep/2012:15:46:15 +1000] "GET /store/mypage2.html HTTP/1.1" 200 82384
216.244.72.12 - - [05/Sep/2012:15:46:15 +1000] "GET /store/mypage3.html HTTP/1.1" 200 83683
216.244.72.3 - - [05/Sep/2012:15:46:15 +1000] "GET /store/mypage4.html HTTP/1.1" 200 82431
216.244.72.3 - - [05/Sep/2012:15:46:16 +1000] "GET /store/mypage5.html HTTP/1.1" 200 82855
216.176.191.201 - - [05/Sep/2012:15:46:26 +1000] "GET /store/mypage6.html HTTP/1.1" 200 75659
Please advise.
Moz Pro | | BM71 -
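Not a Moz tool, but if you want to measure from your own logs whether the crawl-delay is being respected, a rough sketch like the one below groups requests by source IP and prints the gap in seconds between consecutive hits. It assumes Apache combined-format lines like the examples above.

```python
# Rough sketch: measure the gap between consecutive requests, per source
# IP, from Apache access-log lines like the examples above.
import re
from collections import defaultdict
from datetime import datetime

TIMESTAMP_FORMAT = "%d/%b/%Y:%H:%M:%S %z"
line_pattern = re.compile(r"^(\S+) \S+ \S+ \[([^\]]+)\]")

def gaps_per_ip(lines):
    hits = defaultdict(list)
    for line in lines:
        match = line_pattern.match(line)
        if match:
            ip, stamp = match.groups()
            hits[ip].append(datetime.strptime(stamp, TIMESTAMP_FORMAT))
    for ip, times in hits.items():
        times.sort()
        deltas = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
        print(ip, deltas)

sample = [
    '216.244.72.12 - - [05/Sep/2012:15:46:15 +1000] "GET /store/mypage1.html HTTP/1.1" 200 70209',
    '216.244.72.12 - - [05/Sep/2012:15:46:15 +1000] "GET /store/mypage3.html HTTP/1.1" 200 83683',
    '216.244.72.3 - - [05/Sep/2012:15:46:15 +1000] "GET /store/mypage4.html HTTP/1.1" 200 82431',
    '216.244.72.3 - - [05/Sep/2012:15:46:16 +1000] "GET /store/mypage5.html HTTP/1.1" 200 82855',
]
gaps_per_ip(sample)  # prints the per-IP gaps in seconds
```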
What do you get with mozpoints?
What is the point of collecting mozpoints? I read that you are able to purchase features, but what other perks are there with collecting mozpoints?
Moz Pro | | ReadyArtwork0 -
<bs>Will someone give me a "thumbs up" so I can become an Authority and get my SEOMOZ T-Shirt?</bs>
<bs>I have helped many people (probably 100s) here in the forums. My fingers are swollen and I can't answer questions right now. I want my SEOmoz t-Shirt for becoming an Authority, but I'm a few points shy. Any help would be appreciated.</bs> Watch people give me a thumbs down. LOL
Moz Pro | | Francisco_Meza3 -
Getting SEOMoz reports to ignore certain parameters
I want the SEOMoz reports to ignore duplicate content caused by link-specific parameters being added to URLs (same page reachable from different pages, having marker parameters regarding source page added to the URLs). I can get Google and Bing webmaster tools to ignore parameters I specify. I need to get SEOMoz tools to do it also!
Moz Pro | | SEO-Enlighten0 -
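One workaround for the parameter question above is to normalize the URLs in the exported crawl data yourself before looking for duplicates. A minimal sketch follows, with the marker parameter names ("src", "ref") invented for illustration.

```python
# Minimal sketch: strip link-tracking parameters from URLs so that
# variants of the same page compare as equal. The parameter names
# below ("src", "ref") are invented for illustration.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

IGNORED_PARAMS = {"src", "ref"}

def normalize(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(normalize("https://example.com/page?id=7&src=homepage"))
# -> https://example.com/page?id=7
```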
How do I get the Page Authority of individual URLs in my exported (CSV) crawl reports?
I need to prioritize fixes somehow. It seems the best way to do this would be to filter my exported crawl report by the Page Authority of each URL with an error/issue. However, Page Authority doesn't seem to be included in the crawl report's CSV file. Am I missing something?
Moz Pro | | Twilio0 -
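For the crawl-report question above, one workaround is to export Page Authority separately (for example from a link-metrics report) and join it onto the crawl CSV by URL. A rough pandas sketch; the file names and column headers are assumptions, not the actual Moz export schema.

```python
# Rough sketch: join Page Authority onto a crawl export by URL and sort
# so the highest-authority pages with issues come first. File names and
# column headers are assumptions, not the actual Moz export schema.
import pandas as pd

crawl = pd.read_csv("crawl_issues.csv")      # assumed to have a "URL" column
metrics = pd.read_csv("page_authority.csv")  # assumed "URL", "Page Authority" columns

merged = crawl.merge(metrics, on="URL", how="left")
merged = merged.sort_values("Page Authority", ascending=False)
merged.to_csv("crawl_issues_prioritized.csv", index=False)
```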
How to get SEOmoz to re-crawl a site?
I had a lot of duplicate content issues and have fixed all the other warnings. I want to check the site again.
Moz Pro | | adamzski0 -
Most of the time getting error.
Hi, I am getting this error most of the time in Linkscape since last month: "Sorry dude, no inlinks found matching this criteria." Please advise whether this is a bug; the sites I am trying to use Linkscape for had a lot of pages crawled earlier by SEOmoz. Thanks, Preet
Moz Pro | | PreetSibia0