Crawl Diagnostics 403 on home page...
-
In the crawl diagnostics it says oursite.com/ has a 403. It doesn't say what's causing it, but it mentions there's no robots.txt. There is a robots.txt, and I see no problems with it. How can I find out more information about this error?
-
Hi Dana,
Thanks for writing in. The robots.txt file would not cause a 403 error; that type of error is related to the way the server responds to our crawler. Basically, the server for the site is telling our crawler that we are not allowed to access the site. Here is a resource that explains the 403 HTTP status code pretty thoroughly: http://pcsupport.about.com/od/findbyerrormessage/a/403error.htm
I looked at both of the campaigns on your account and I am not seeing a 403 error for either site, though I do see a couple of 404 (page not found) errors on one of the campaigns, which is a different issue.
If you are still seeing the 403 error message on one of your crawls, you would just need to have the webmaster update the server to allow rogerbot to access the site.
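To make "the server telling our crawler it is not allowed" concrete, here is a minimal, self-contained sketch of one common way such a 403 is produced: a server that refuses requests by User-Agent. Only the name "rogerbot" is taken from the thread; the deny list and everything else here are hypothetical, not Moz's or this site's actual setup.

```python
import http.server
import threading
import urllib.error
import urllib.request

BLOCKED_AGENTS = ("rogerbot",)  # hypothetical deny list on the web server

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "").lower()
        if any(name in ua for name in BLOCKED_AGENTS):
            self.send_error(403)  # the "Forbidden" the crawler reports
        else:
            self.send_response(200)
            self.send_header("Content-Length", "2")
            self.end_headers()
            self.wfile.write(b"ok")

    def log_message(self, *args):  # keep output quiet
        pass

# Serve on an ephemeral local port in a background thread.
server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = "http://127.0.0.1:%d/" % server.server_address[1]

def status_for(agent):
    """Fetch the home page as the given user-agent and return the HTTP status."""
    req = urllib.request.Request(url, headers={"User-Agent": agent})
    try:
        return urllib.request.urlopen(req).getcode()
    except urllib.error.HTTPError as err:
        return err.code

crawler_status = status_for("rogerbot/1.2")  # 403: server turns the crawler away
browser_status = status_for("Mozilla/5.0")   # 200: ordinary visitors get through
server.shutdown()
print(crawler_status, browser_status)
```

Note that robots.txt never enters into this: the refusal happens in the server's request handling, before any crawler politeness rules apply.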
I hope this helps. Please let me know if you have any other questions.
-Chiaryn
-
Okay, so I couldn't find this thread and started a new one. Sorry...
... The problem persists.
RECAP
I have two blocks in my .htaccess; both are for amazonaws.com.
I have gone over our server block logs and see only Amazon addresses and bot names.
I did a Fetch as Google with our Webmaster Tools, and fetch it did. Success!
Why isn't this crawler able to access the site? Many other bots are crawling right now.
Why can I use the SEOmoz on-page feature to crawl a single page, but the automatic crawler won't access the site? I just took a break from typing this to try the on-page tool on our robots.txt, and it worked fine. I used the keyword "Disallow" and it gave me a C. =0)
... now if we could just crawl the rest of the site...
Any help on this would be greatly appreciated.
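For the log check described above, a hedged sketch: pulling the 403s (with their IPs and user-agents) out of Apache combined-format access-log lines. The two sample lines below are invented for illustration; they are not from this site's logs.

```python
import re

# One regex for the Apache "combined" log format.
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

sample_log = [
    '203.0.113.9 - - [10/May/2012:10:01:02 -0700] "GET / HTTP/1.1" 403 199 "-" "rogerbot/1.2"',
    '198.51.100.4 - - [10/May/2012:10:01:05 -0700] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]

# Keep only the requests the server turned away with a 403.
forbidden = [
    (m.group("ip"), m.group("agent"))
    for line in sample_log
    if (m := LOG_LINE.match(line)) and m.group("status") == "403"
]
print(forbidden)
```

Grepping the real access log the same way (rather than only the block log) shows exactly which IP and user-agent got each 403, which is the fastest way to see whether the crawler is being refused by IP, by user-agent, or both.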
-
I think I do. I just (a few minutes ago) went through a 403 problem reported by another site trying to access an HTML file for verification. Apparently they were connecting from an IP that's blocked by our .htaccess. I removed the blocks, told them to try again, and it worked, no problem. I see that SEOmoz has only crawled 1 page. Off to see if I can trigger a re-crawl now...
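The .htaccess effect described here boils down to a per-request deny-list check: any client whose IP falls in a denied range gets a 403 before the page is ever served. A minimal sketch, with a hypothetical amazonaws.com-style CIDR range standing in for the actual blocks:

```python
import ipaddress

# Hypothetical deny list, mimicking "Deny from ..." lines in .htaccess.
DENY = [ipaddress.ip_network("23.20.0.0/14")]

def allowed(client_ip: str) -> bool:
    """Return False when the client falls in a denied range (Apache answers 403)."""
    ip = ipaddress.ip_address(client_ip)
    return not any(ip in net for net in DENY)

print(allowed("23.21.5.7"))    # False: inside the blocked range -> 403
print(allowed("66.249.66.1"))  # True: outside the range -> served normally
```

This is why removing the block lines fixed it immediately: a crawler hosted on blocked infrastructure is refused regardless of its user-agent or what robots.txt says.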
-
Hmmm... not sure why this is happening. Maybe add these lines to the top of your robots.txt and see if it fixes things by next week. It certainly won't hurt anything:
User-agent: *
Allow: /
-
No problem. Looking at my Google Webmaster Tools, the crawl stats don't show any errors.
Thanks
User-Agent: *
Disallow: /*?zenid=
Disallow: /editors/
Disallow: /email/
Disallow: /googlecheckout/
Disallow: /includes/
Disallow: /js/
Disallow: /manuals/
-
Oh, it's only in SEOmoz's crawl diagnostics that you're seeing this error. That explains why robots.txt could be affecting it. I misread this earlier and thought you were finding the 403 on your own in-browser.
Can you paste the robots.txt file in here so we can see it? I would imagine it has everything to do with it, now that I've correctly read your post. My apologies.
-
Apache
-
A 403 is a Forbidden code, usually pertaining to security and permissions.
Are you running your server in an Apache or IIS environment? Robots.txt shouldn't affect a site's visibility to the public; it only talks to site crawlers.
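To rule robots.txt out entirely, the rules pasted earlier in the thread can be fed to Python's stdlib parser to confirm they leave the home page open to rogerbot. A sketch (note the stdlib parser has no wildcard support, so it treats the `/*?zenid=` pattern literally, which is fine for this check):

```python
import urllib.robotparser

# The robots.txt rules pasted earlier in the thread.
rules = """\
User-Agent: *
Disallow: /*?zenid=
Disallow: /editors/
Disallow: /email/
Disallow: /googlecheckout/
Disallow: /includes/
Disallow: /js/
Disallow: /manuals/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("rogerbot", "/"))           # True: home page is crawlable
print(rp.can_fetch("rogerbot", "/editors/x"))  # False: explicitly disallowed
```

Since the home page is allowed by these rules, a disallow would at most make a polite crawler skip pages; it cannot produce a 403, which points back at the server configuration.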