Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies.
Crawl error robots.txt
-
Hello, when trying to run a Site Crawl to analyze our page, the following error appears:
**Moz was unable to crawl your site on Nov 15, 2017.** Our crawler was banned by a page on your site, either through your robots.txt, the X-Robots-Tag HTTP header, or the meta robots tag. Update these tags to allow your page and the rest of your site to be crawled. If this error is found on any page on your site, it prevents our crawler (and some search engines) from crawling the rest of your site. Typically errors like this should be investigated and fixed by the site webmaster.
Can you help us?
Thanks!
-
@Linda-Vassily yes
-
The page is https://frizzant.com/ and it doesn't have a noindex tag.
-
Thanks Linda and Tawny! I'll check it.
-
Hey there!
This is a tricky one — the answer to these questions is almost always specific to the site and the Campaign. For this Campaign, it looks like your robots.txt file returned a 403 forbidden response to our crawler: https://www.screencast.com/t/f42TiSKp
Do you use any kind of DDOS protection software? That can give our tools trouble and cause us to be unable to access the robots.txt file for your site.
I'd recommend checking with your web developer to make sure that your robots.txt file is accessible to our user-agent, rogerbot, and returning a 200 OK status for that user-agent. If you're still having trouble, it'll be easier to assist you if you contact us through help@moz.com, where we can take a closer look at your account and Campaign directly.
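If it helps, here's a rough, self-contained sketch of that check in Python (standard library only). It spins up a hypothetical local stand-in server that blocks by user agent, simulating what a security plugin can do, then compares the status code robots.txt returns for a browser versus for rogerbot. The server and its blocking rule are made up for illustration; to test your real site, point `status_for` at your own robots.txt URL instead.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.error import HTTPError
from urllib.request import Request, urlopen

# Stand-in server that mimics a security plugin blocking by user agent:
# requests whose User-Agent mentions rogerbot get a 403, others get 200.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        blocked = "rogerbot" in self.headers.get("User-Agent", "").lower()
        self.send_response(403 if blocked else 200)
        self.end_headers()
        self.wfile.write(b"Forbidden" if blocked else b"User-agent: *\nDisallow:\n")

    def log_message(self, *args):
        pass  # keep the demo output quiet

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/robots.txt"

def status_for(user_agent):
    """Return the HTTP status code robots.txt gives this user agent."""
    try:
        return urlopen(Request(url, headers={"User-Agent": user_agent})).getcode()
    except HTTPError as e:
        return e.code

browser_status = status_for("Mozilla/5.0")
crawler_status = status_for("rogerbot")
print(browser_status)  # 200 -- a browser sees the file fine
print(crawler_status)  # 403 -- the crawler is blocked
server.shutdown()
```

If your real robots.txt shows the same split (200 for a browser, 403 for rogerbot), the block is at the server or firewall level, not in the robots.txt rules themselves.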
-
I just popped that into ScreamingFrog and I don't see a noindex on that page, but I do see it on some other pages. (Though that shouldn't stop other pages from being crawled.)
Maybe it was just a glitch that happened to occur at the time of the crawl. You could try doing another crawl and see if you get the same error.
-
The page is http://www.yogaenmandiram.com/ and it doesn't have a noindex tag.
-
Hmm. How about on the page itself? Is there a noindex?
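For reference, a quick way to check a page for a meta robots noindex using nothing but Python's standard library. This is a sketch: the sample HTML snippets are made up, and it only inspects the meta tag, not the X-Robots-Tag HTTP header.

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Flags pages whose <meta name="robots"> content contains noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = {k.lower(): (v or "") for k, v in attrs}
            if d.get("name").lower() == "robots" and "noindex" in d.get("content", "").lower():
                self.noindex = True

# Hypothetical page sources, for illustration only.
blocked_page = '<head><meta name="robots" content="noindex,follow"></head>'
open_page = '<head><meta name="robots" content="index,follow"></head>'

blocked_checker = NoindexChecker()
blocked_checker.feed(blocked_page)
print(blocked_checker.noindex)  # True -- this page asks not to be indexed

open_checker = NoindexChecker()
open_checker.feed(open_page)
print(open_checker.noindex)  # False -- no noindex directive here
```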
-
Yes, our robots.txt is very simple:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
-
That just says that you are blocking the Moz crawler. Take a look at your robots.txt file and see if you have any exclusions in there that might cause that page not to be crawled. (Try going to yoursite.com/robots.txt or you can learn more about this topic here.)
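One way to sanity-check a robots.txt rule set against Moz's crawler is Python's built-in parser. This sketch uses the WordPress-style rules quoted elsewhere in this thread and a placeholder `example.com` domain; swap in your own file's contents and URLs. Note it only tests the rules themselves, so it won't catch a server returning 403 for the robots.txt file, which is a separate problem.

```python
from urllib.robotparser import RobotFileParser

# The rules under test (the WordPress defaults quoted in this thread).
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# can_fetch() answers: may this user agent crawl this URL under these rules?
home_ok = rp.can_fetch("rogerbot", "https://example.com/")
admin_ok = rp.can_fetch("rogerbot", "https://example.com/wp-admin/")
print(home_ok)   # True  -- nothing blocks the homepage
print(admin_ok)  # False -- /wp-admin/ is disallowed for every crawler
```

If `can_fetch` returns True for the pages you care about but Moz still reports a ban, look beyond robots.txt at the server response, the X-Robots-Tag header, and the meta robots tag.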
-
Sorry, the image doesn't appear.
Try now.
-
It looks like the error you are referring to did not come through in your question. Could you try editing it?