Access Denied - 2508 Errors - 403 Response code in webmaster tools
-
Hello fellow members,
Since 9th May I have been getting these error messages, and the crawl errors are increasing daily. Google is not able to crawl my URLs; it is getting a 403 response code and reporting Access Denied errors in GWT. All of my indexed pages have been de-indexed.
Why am I receiving these errors? My website is working fine, so why is Google not able to crawl my pages? Please tell me what the issue is; I need to resolve it ASAP.
On 9th May I also got a message in GWT for http://www.mysitename.co.uk/ : "Increase in authorization permission errors"
Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors.
All the problems started after this. Kindly tell me what the issue is and how I can solve it.
-
Hi There
Without seeing your website it's hard to tell for sure. But a 403 error usually has to do with permissions (who/what your server will allow to access the content).
Have you recently put anything behind a password?
If you have Screaming Frog SEO Spider, you can try setting the user agent to Googlebot and crawling your site.
You can also use a header checker like URI Valet to see what server response is returned. It sounds like Googlebot is getting one response while normal browsers are seeing it fine (200 codes).
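For example, here is a minimal sketch (Python with the widely used requests library; the URL is a placeholder you would swap for your own site, and the user-agent strings are just illustrative) that compares the status code your server returns to a browser-like user agent versus a Googlebot user agent:

```python
# Minimal sketch: compare the HTTP status code returned to a browser-like
# user agent vs. a Googlebot user agent. Replace URL with your own site.
import requests

URL = "http://www.example.com/"  # placeholder - substitute your site

USER_AGENTS = {
    "browser":   "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{name:10s} -> HTTP {resp.status_code}")
```

If the Googlebot request comes back 403 while the browser request comes back 200, something on the server (a firewall, security plugin, or .htaccess rule) is probably filtering by user agent or IP and is the likely culprit.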
If you are still not sure, and cannot share your site name, I would contact your web host to look into any issues with the server.
-Dan
Related Questions
-
Apparent Bot Queries and Impressions in Webmaster Tools
I've been noticing some strange stats in Google Webmaster Tools for my forum, which has been getting spam queries with impressions and no clicks. See the queries in the attached images. There may be some motive for the spammers or scrapers behind this. I set the date range to just 22 Aug - 22 Nov, and the spike is very obviously driven by impressions. Questions: What should/can I do? Is Google doing something about this? How can I avoid this?
White Hat / Black Hat SEO | SameerBhatia
-
Why is there no link data available in my Webmaster Tools, even though the site has lots of links and the Webmaster Tools account is set up properly?
I have a few accounts in my Webmaster Tools that are not showing any link data, even though the sites have lots of links. I checked the setup and everything is good. Can someone tell me why there is no data coming through? Thanks
White Hat / Black Hat SEO | OnlineAssetPartners
-
Website not being indexed in Google - Screaming Frog shows a 500 error? What could the issue be?
Hey, http://www.interconnect.org.uk/ - the site seems to load fine, but for some reason it is not getting indexed. I tried running the site through Screaming Frog, and it returns a 500 error code, which suggests it can't access the site. I'm guessing this is the same problem Google is having. Do you have any ideas as to why this may be and how I can rectify it? Thanks, Andrew
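If it helps, here is a rough sketch (Python with the requests library, using the URL from the question) that prints the raw status code returned for the homepage and for robots.txt, so you can see whether the 500 is served to any non-browser client or only to specific crawlers:

```python
# Rough sketch: print the status code returned for the homepage and
# robots.txt of the site mentioned above.
import requests

HEADERS = {"User-Agent": "Mozilla/5.0 (compatible; quick-status-check)"}

for url in ("http://www.interconnect.org.uk/",
            "http://www.interconnect.org.uk/robots.txt"):
    try:
        resp = requests.get(url, headers=HEADERS, timeout=10)
        print(url, "->", resp.status_code)
    except requests.RequestException as exc:
        print(url, "-> request failed:", exc)
```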
White Hat / Black Hat SEO | Heehaw
-
Negative SEO and when to use the Disavow tool?
Hi guys, I was hoping someone could help me with a problem that has arisen on the site I look after. This is my first SEO job and I've had it for about 6 months now. I think I've been doing the right things so far: building quality links from reputable sites with good DA, working with bloggers to push our products, and only signing up to directories in our niche. So our backlink profile is very specific, with few spammy links.

Over the last week, however, we have received a huge increase in backlinks, which has almost doubled our total linking domains. I've checked the links in Webmaster Tools and they are mainly directories or webstat websites like these: siteinfo.org.uk, deperu.com, alestat.com, domaintools.com, detroitwebdirectory.com, ukdata.com, stuffgate.com. We've also just launched a new initiative where we will be producing totally new, good-quality content 4-5 times a week, and many of these new links are pointing to that page, which looks very suspicious to me. Does this look like negative SEO to anyone?

I've read a lot about the disavow tool, and it seems people's opinions are split on when to use it, so I was wondering if anyone had advice on whether to use it or not. It's easy for me to identify what these new links are, yet some of them have decent DA, so will they do any harm anyway? I've also checked the referring anchors on Ahrefs, and now over 50% of my anchor term cloud consists of terms totally unrelated to my site; this has happened over the last week, which also worries me. I haven't seen any negative impact on rankings yet, but if this carries on it will destroy my link profile.

So would it be wise to disavow all these links as they come through, or wait to see if they actually have an impact? It should be obvious to Google that there has been a huge spike in links, so the question is whether they would be ignored or whether I will be penalised. Any ideas? Thanks in advance, Richard
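For reference, if you do end up disavowing, the file Google's Disavow Tool accepts is just a plain text file with one entry per line. A minimal sketch, using two of the domains listed above purely as an illustration (not a recommendation to disavow them) and a made-up URL for the single-page form:

```text
# Lines starting with "#" are comments.
# "domain:" disavows every link from that domain:
domain:siteinfo.org.uk
domain:stuffgate.com
# A bare URL disavows only links from that specific page:
http://www.example.com/some-directory-listing.html
```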
White Hat / Black Hat SEO | Rich_995
-
How to Get Backlinks to a Coupon Code Website
Hello Guys, I run a coupon code website, which by its very nature does not contain the most compelling of content. As you can probably understand, not many people are going to want to link to a page which lists a number of coupons relating to a specific online retailer. I am really struggling to come up with new and innovative ways of attracting links and wondered if anybody was in a similar position to me or could offer some advice. Would love to get some feedback. Thanks!
White Hat / Black Hat SEO | Marc-FIMA
-
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why.

Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links be removed.

After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link.

I think there are two parts to the conversation:

1. Would this work? Google would still see the link on the offending domain, but by blocking that domain, are you preventing any strength or penalty associated with that domain from impacting your site?
2. If for whatever reason this would not work, would a tweak in Google's algorithm to allow this practice be beneficial to both Google and the SEO community? This would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time and allow us to focus on what Google wants: creating high-quality sites.

Thoughts?
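For reference, the kind of .htaccess rule being described would look roughly like this. This is a sketch only, with a made-up domain; mod_setenvif must be enabled, the exact access-control directives depend on your Apache version, and, as the poster says, it is not a recommendation:

```apache
# Sketch: flag requests whose Referer header matches a spammy domain,
# then deny those requests (Apache 2.2-style access control).
SetEnvIfNoCase Referer "spammy-directory\.example" bad_referrer
Order Allow,Deny
Allow from all
Deny from env=bad_referrer
```

Note that a rule like this only blocks visitors who click through from that domain and send it as a Referer header; Googlebot crawling the offending page and following the link to your site does not send that header, which is a large part of why this approach probably would not change how Google evaluates the link.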
White Hat / Black Hat SEO | highlyrelevant