Crawl Diagnostics 403 on home page...
-
In the crawl diagnostics it says oursite.com/ has a 403. It doesn't say what's causing it, but it mentions that there is no robots.txt. There is a robots.txt, and I see no problems with it. How can I find out more information about this error?
-
Hi Dana,
Thanks for writing in. The robots.txt file would not cause a 403 error; that type of error comes from the way the server responds to our crawler. Basically, the site's server is telling our crawler that we are not allowed to access the site. Here is a resource that explains the 403 HTTP status code pretty thoroughly: http://pcsupport.about.com/od/findbyerrormessage/a/403error.htm
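To make that concrete, here is a minimal, self-contained sketch (purely illustrative; this is not Moz's or any real server's setup) of how a server can hand a 403 Forbidden to one crawler while serving 200 to everyone else, keyed off the User-Agent header:

```python
# Illustrative only: a tiny local server that blocks a crawler by User-Agent.
# The "rogerbot" substring rule below is a made-up example of a server-side block.
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

BLOCKED_AGENT = "rogerbot"  # assumption: this UA substring is blocked


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        agent = self.headers.get("User-Agent", "")
        if BLOCKED_AGENT in agent.lower():
            self.send_response(403)  # Forbidden: server refuses this crawler
        else:
            self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass


server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_port


def status_for(agent):
    req = urllib.request.Request(url, headers={"User-Agent": agent})
    try:
        return urllib.request.urlopen(req).getcode()
    except urllib.error.HTTPError as err:
        return err.code


browser_status = status_for("Mozilla/5.0")    # 200
crawler_status = status_for("rogerbot/1.0")   # 403
server.shutdown()
```

The point is that a 403 depends entirely on how the server decides to answer a particular request, which is why the same page can look fine in your browser while our crawler gets refused.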
I looked at both of the campaigns on your account and I am not seeing a 403 error for either site, though I do see a couple of 404 (page not found) errors on one of the campaigns, which is a different issue.
If you are still seeing the 403 error message on one of your crawls, you would just need to have the webmaster update the server to allow rogerbot to access the site.
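In case it helps the webmaster, here is a rough sketch of what that server-side change might look like in an Apache `.htaccess` (the IP range is a hypothetical placeholder, and the exact directives depend on your Apache version; this uses the Apache 2.2-style `mod_setenvif`/`mod_authz_host` syntax):

```apache
# Hypothetical example: keep an unwanted IP range blocked, but explicitly
# let Moz's crawler (rogerbot) through based on its user agent.
SetEnvIfNoCase User-Agent "rogerbot" allow_moz_crawler

<Limit GET POST>
    Order Deny,Allow
    Deny from 54.236.0.0/16            # placeholder IP range, not a real rule
    Allow from env=allow_moz_crawler
</Limit>
```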
I hope this helps. Please let me know if you have any other questions.
-Chiaryn
-
Okay, so I couldn't find this thread and started a new one. Sorry...
... The problem persists.
RECAP
I have two blocks in my .htaccess; both are for amazonaws.com.
I have gone over our server's block logs and see only Amazon addresses and bot names.
I did a Fetch as Google in our WM Tools, and fetch it did. Success!
Why isn't this crawler able to access the site? Many other bots are crawling it right now.
Why can I use the SEOmoz on-page feature to crawl a single page, but the automatic crawler won't access the site? I just took a break from typing this to try the on-page tool on our robots.txt; it worked fine. I used the keyword "Disallow" and it gave me a C. =0)
... now if we could just crawl the rest of the site...
any help on this would be greatly appreciated.
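For anyone else combing server logs the way the recap above describes, here is a quick sketch (the two log lines are made-up samples in Apache's combined log format) for pulling out which user agents were served a 403:

```python
# Sketch: extract the status code and user agent from Apache combined-format
# access log lines, then list the agents that got a 403. Sample lines are
# fabricated for illustration.
import re

LOG_LINES = [
    '54.236.1.10 - - [01/Jun/2012:10:00:00 +0000] "GET / HTTP/1.1" 403 199 "-" "rogerbot/1.0 (http://www.seomoz.org/dp/rogerbot)"',
    '66.249.66.1 - - [01/Jun/2012:10:01:00 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
]

# In the combined format, the status code follows the quoted request line,
# and the user agent is the last quoted field.
pattern = re.compile(r'" (\d{3}) \d+ "[^"]*" "([^"]*)"$')

denied = [m.group(2) for line in LOG_LINES
          if (m := pattern.search(line)) and m.group(1) == "403"]
print(denied)
```

If rogerbot shows up in that list, the server (not robots.txt) is refusing the crawl.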
-
I think I do. I just (a few minutes ago) went through a 403 problem being reported by another site trying to access an HTML file for verification. Apparently they were connecting from an IP that's blocked by our .htaccess. I removed the blocks, told them to try again, and it worked with no problem. I see that SEOmoz has only crawled 1 page. Off to see if I can trigger a re-crawl now...
-
Hmmm... not sure why this is happening. Maybe add this to the top of your robots.txt and see if it fixes things by next week. It certainly won't hurt anything:
User-agent: *
Allow: /
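As a sanity check, Python's standard `urllib.robotparser` shows what a robots.txt-honoring crawler such as rogerbot should conclude from a given rule set. A quick sketch (the `Disallow` line and example.com domain are made up purely for contrast; paste in your own rules to test them):

```python
# Sketch: feed a robots.txt rule set to the stdlib parser and ask whether
# a given user agent may fetch a given URL. Rules and domain are examples.
from urllib import robotparser

RULES = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

print(rp.can_fetch("rogerbot", "http://www.example.com/"))          # allowed
print(rp.can_fetch("rogerbot", "http://www.example.com/private/"))  # blocked
```

If the parser says rogerbot may fetch the home page but the crawl still 403s, that points back at the server rather than robots.txt.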
-
No problem. Looking at my Google WM Tools, the crawl stats don't show any errors.
Thanks
User-Agent: *
Disallow: /*?zenid=
Disallow: /editors/
Disallow: /email/
Disallow: /googlecheckout/
Disallow: /includes/
Disallow: /js/
Disallow: /manuals/
-
Oh, it's only in SEOmoz's crawl diagnostics that you're seeing this error. That explains why robots.txt could be affecting it. I misread this earlier and thought you were finding the 403 on your own in-browser.
Can you paste the robots.txt file in here so we can see it? I would imagine it has everything to do with this, now that I've correctly read your post. My apologies!
-
Apache.
-
A 403 is a Forbidden code, usually pertaining to security and permissions.
Are you running your server in an Apache or IIS environment? Robots.txt shouldn't affect a site's visibility to the public; it only talks to site crawlers.