Access Denied - 2,508 Errors - 403 Response Code in Webmaster Tools
-
Hello fellow members,
Since 9th May I have been getting these error messages, and the crawl errors are increasing daily. Google is not able to crawl my URLs; it gets a 403 response code, and GWT reports Access Denied errors. All of my indexed pages have been de-indexed.
Why am I receiving these errors? My website is working fine, so why is Google not able to crawl my pages? Please tell me what the issue is; I need to resolve it ASAP.
On 9th May I also got a message in GWT for http://www.mysitename.co.uk/: "Increase in authorization permission errors".
"Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors."
All the problems started after this. Kindly tell me what the issue is and how I can solve it.
-
Hi there,
Without seeing your website it's hard to tell for sure, but a 403 error usually has to do with permissions: who or what your server will allow to access the content.
Have you recently put anything behind a password?
If you have Screaming Frog SEO Spider, you can try setting Googlebot as the user agent and crawling your site.
You can also use a header checker like URI Valet to see what server response is returned. It sounds like Googlebot is getting one response while normal browsers are seeing the site fine (200 codes).
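If you want to check this yourself, here is a minimal sketch using Python's requests library (the URL is a placeholder; swap in one of the pages GWT reports as Access Denied). It fetches the same page once with Googlebot's user agent and once with a normal browser user agent, so you can compare the status codes directly:

    import requests

    # Placeholder URL -- replace with a page GWT flags as Access Denied.
    URL = "http://www.example.co.uk/"

    USER_AGENTS = {
        "Googlebot": ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                      "+http://www.google.com/bot.html)"),
        "Browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    }

    for name, ua in USER_AGENTS.items():
        # allow_redirects=False shows the raw status code the server sends.
        resp = requests.get(URL, headers={"User-Agent": ua}, allow_redirects=False)
        print(f"{name}: HTTP {resp.status_code}")

If the Googlebot request comes back 403 while the browser request comes back 200, the server (or a firewall or security plugin in front of it) is almost certainly blocking by user agent or IP.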
If you are still not sure, and cannot share your site name, I would contact your web host to look into any issues with the server.
-Dan
Related Questions
-
404 Errors For Pages That Never Existed
I'm seeing a lot of 404 errors with slugs related to cryptocurrency, which is not my website's industry at all. We've never created anything remotely similar, but I see a lot of 404 errors with keywords like "bitcoin" and "litecoin". Any recommendations on what to do about this? Another keyword is "yelz"; it usually presents like .../yelz/-ripper-vs-steller/ or .../bitcoin-vs-litecoin/. I don't really even have time to fix all the legitimate 404 errors, let alone these mysterious requests. Any advice is appreciated.
White Hat / Black Hat SEO | bcaples
-
Hiding ad code from bots
Hi. I have a client who is about to deploy ads on their site. To avoid bots clicking on those ads and skewing data, the company would like to prevent any bots from seeing any ads, and of course that includes Googlebot. This seems like it could be cloaking, and I'd rather not have a different version of the site for bots. However, knowing that this will likely happen, I'm wondering how big a problem it could be if they do this. The change isn't meant to manipulate Googlebot's understanding of the page (the ads don't affect rankings, etc.) and it will have only a minimal impact on the page overall. So, if they go down this road and hide ads from bots, I'm trying to determine how big a risk this could be. I found some old articles discussing this, with some suggesting it was a problem and others saying it might be okay in some cases (links below), but I couldn't find any recent articles about it. Wondering if anybody has seen anything new or has a new perspective to share on this issue? Is it a problem if all bots (including Googlebot) are unable to see ads?
https://moz.com/blog/white-hat-cloaking-it-exists-its-permitted-its-useful
https://www.webmasterworld.com/google/4535445.htm
https://www.youtube.com/watch?v=wBO-1ETf_dY
White Hat / Black Hat SEO | Matthew_Edgar
-
Can a Self-Hosted Ping Tool Hurt Your IP?
Confusing title, I know, but let me explain. We are in the middle of programming a lot of SEO "action" tools for our site. These will be available for users to help them better optimize their sites in the SERPs. We were thinking about adding a "ping" tool written in PHP so users can ping their domain and hopefully get some extra attention and speed up indexing of updates. This would be hosted on a subdomain of our site. My question is: if we get enough users using the product, could that potentially get us blacklisted with Google, Bing, etc.? Technically the tool needs to send out the ping requests, and those would come from the same IP address that our main site is hosted on. If we end up with over 1,000 users all sending ping requests, I don't want to jeopardize our IP. Thoughts?
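For context, the kind of "ping" described here is usually a weblogUpdates.ping XML-RPC call. Below is a minimal sketch of a single outgoing request using Python's standard library (the endpoint and site details are placeholders; a real tool would take them from the user). Every such request leaves from the host server's IP, which is exactly the exposure the question is asking about:

    import xmlrpc.client

    # Placeholder endpoint and site details -- Ping-O-Matic is one
    # well-known ping service, but any weblogUpdates endpoint works.
    PING_ENDPOINT = "http://rpc.pingomatic.com/"
    SITE_NAME = "Example Site"
    SITE_URL = "http://www.example.com/"

    server = xmlrpc.client.ServerProxy(PING_ENDPOINT)

    # weblogUpdates.ping is the standard XML-RPC method for this protocol.
    result = server.weblogUpdates.ping(SITE_NAME, SITE_URL)
    print(result)  # typically a dict like {'flerror': False, 'message': '...'}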
White Hat / Black Hat SEO | David-Kley
-
Besides fixing technical errors, what is the best way to increase organic traffic to a movie review website?
I have a friend's website, ShowBizJunkies, that they work very hard at improving, and they provide great content. I moved the site to a more modern theme and improved its speed (it's on WP Engine, maxed out with a CDN, caching, image optimization, etc.). But now I'm struggling with how to suggest further improvements to the SEO structure, or how to build backlinks. I know that trying to rank for terms like "movie reviews" and many similar ones is ridiculously difficult and requires tons of high-quality backlinks. What is the lowest-hanging fruit here? Any suggestions? My current plan is:
1. Fix technical errors
2. Create more evergreen content
3. Work on the timing of article releases for better Google News coverage
4. More social sharing: Tumblr, Reddit, Facebook Groups, G+ communities, etc.
5. Build backlinks via outreach to TV-show-specific sites, movie fan sites, and actor fan sites (interviews)
White Hat / Black Hat SEO | JustinMurray
-
Submitting a page to Google Search Console or Bing Webmaster Tools with nofollow tags
Hello, I was hoping someone could help me understand whether there is any point in submitting a domain or subdomain to Google Search Console (Webmaster Tools) and Bing Webmaster Tools if the pages (on the subdomain, for example) all have nofollow/noindex tags or are blocked by the robots.txt file. There are some pages on a data feed on a subdomain I manage that have these characteristics, which I cannot change. Is it better to simply exclude them when submitting to GSC and BWT, thereby eliminating the errors and warnings they generate? Or is it better to tell Google and Bing about them anyway, on the chance that those nofollow pages may be indexed or contextualised in some way that makes it worth the effort? Many thanks!
Mark
White Hat / Black Hat SEO | uworlds
-
Why should I reach out to webmasters before disavowing links?
Almost all the blogs, and Google themselves, tell us to reach out to webmasters and request that the offending links be removed before using Google's Disavow tool. None of the blogs, nor Google, explain why you "must" do this. It's time-consuming, and many webmasters don't care and don't act. Why is this a "required" step?
White Hat / Black Hat SEO | RealSelf
-
I would like to know if there is a tool to find out what keywords my competitors are using
Hi everyone, I am looking for a keyword research tool or a program that can help me find out which keywords my competitors are using. Thanks!
White Hat / Black Hat SEO | lnietob
-
Penguin Update or URL Error - Rankings Tank
I just redid my site, moving from the GoDaddy Quick Shopping Cart to Drupal. The site is much cleaner now, and I transferred all the content. But the site has dropped from the top ten for almost every keyword we were targeting down to position 35 or worse. I "aliased" the URLs so that they are the same as on the GoDaddy site. However, when I look at our search results, I notice that our URLs have extra wording at the end, like this: ?categoryid=1 (or some other number). Could this be the reason our rankings tanked? The results didn't show this previously, on the GoDaddy site.
White Hat / Black Hat SEO | chronicle