Access Denied
-
Our website, which had ranked at number 1 in Google.co.uk for our two main search terms for over three years, was hacked last November. We rebuilt the site but slipped down to number 4. We were hacked again two weeks ago and are now at number 7.
I realise that this drop may not be just a result of the hacking, but it can't have helped.
I've just accessed our Google Webmaster Tools accounts, and these are the current results:
940 Access Denied errors
197 Not Found errors
The 940 Access Denied errors apply to WordPress blog pages.
Is it likely that the hacking caused the Access Denied errors, and is there a clear way to repair them?
Any advice would be very welcome.
Thanks,
Colin
-
Glad I could help!
-
Thanks so much, Jason. I'll take a look right now and be back with a Thumbs Up, I'm sure.
Colin
-
Hi Nile,
I've found this for you: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2409441
It summarizes why Access Denied errors show up.
My initial thought was that your robots.txt file was maliciously updated to block many of your pages.
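For illustration only (a hypothetical example, not necessarily what happened on this site), a maliciously edited robots.txt often ends up with a blanket rule like this, which blocks crawling of every page:

    User-agent: *
    Disallow: /

whereas a typical WordPress robots.txt blocks only a few admin paths, for example:

    User-agent: *
    Disallow: /wp-admin/

Comparing the live file against a known-good backup is a quick first check after a hack.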
-
Hi Jason,
I can visit those pages in the Google Chrome browser.
Colin
-
A little more information is needed here... can you visit those pages in your browser? If not, what error shows up?
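One quick way to see what a crawler actually receives, independent of a browser (which may be serving a cached or logged-in view), is to request just the response headers; for example, with example.com standing in for the real domain:

    curl -I http://www.example.com/blog/sample-post/

A 403 status here would match the Access Denied report, while a 200 would suggest the server is responding differently to Googlebot than to ordinary visitors.
-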
Related Questions
-
Huge spike in "access denied" in search console
Hey guys, we have seen a huge spike in "Access Denied" status in Google Search Console for our website, and I have no idea why that would be the case. Is there anyone who can shed some light on what is going on, or who can point me in the direction of an SEO specialist we can pay to fix the issue? Thanks
-
Web accessibility - High Contrast web pages, duplicate content and SEO
Hi all, I'm working with a client who has various URL variations to display their content in High Contrast and Low Contrast. It feels like quite an old way of doing things. The URLs look like this:

domain.com/bespoke-curtain-making/ - Default URL
domain.com/bespoke-curtain-making/?style=hc - High Contrast page
domain.com/bespoke-curtain-making/?style=lc - Low Contrast page

My questions are: Surely this content is duplicate content according to a search engine? Should the different versions have a meta noindex directive in the header? Is there a better way of serving these pages? Thanks.
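For reference, the usual pattern for parameter variants like these is a canonical tag in the head of each styled version pointing back at the default URL (a sketch using the placeholder domain from the question):

    <link rel="canonical" href="http://domain.com/bespoke-curtain-making/" />

so that the contrast variants consolidate their signals to one indexable page rather than relying on noindex.
-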
Why are "noindex" pages access denied errors in GWT and should I worry about it?
GWT calls pages that have "noindex, follow" tags "access denied errors." How is it an "error" to say, "hey, don't include these in your index, but go ahead and crawl them"? These pages are thin content/duplicate content/overly templated pages I inherited, and the noindex, follow tags are an effort to not crap up Google's view of this site. The reason I ask is that GWT's detection of a rash of these access restricted errors coincides with a drop in organic traffic. Of course, coincidence is not necessarily cause. Should I worry about it and do something or not? Thanks... Darcy
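For reference, the directive being discussed is the robots meta tag placed in each page's head; its standard form is:

    <meta name="robots" content="noindex, follow">

It asks search engines to keep the page out of the index while still following its links, which is distinct from a true 403 response, the status that "access denied" reports normally describe.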
-
Https Homepage Redirect & Issue with Googlebot Access
Hi All, I have a question about Google correctly accessing a site that has a 301 redirect to https on the homepage. Here's an overview of the situation, and I'd really appreciate any insight from the community on what the issue might be.

Background Info: My homepage is set up as a 301 redirect to an https version of the homepage (some users log in, so we need the SSL). Only 2 pages on the site are under SSL; the rest of the site is http. We switched to the SSL in July but have not seen any change in our rankings despite efforts increasing backlinks and output of content. Even though Google has indexed the SSL page of the site, it appears that it is not linking up the SSL page with the rest of the site in its search and tracking. Why do we think this is the case?

The Diagnosis:

1. When we do a Google Fetch on our http homepage, it appears that Google is only reading the 301 redirect instructions (as shown below) and is not finding its way over to the SSL page, which has all the correct page title and meta information.

    HTTP/1.1 301 Moved Permanently
    Date: Fri, 08 Nov 2013 17:26:24 GMT
    Server: Apache/2.2.16 (Debian)
    Location: https://mysite.com/
    Vary: Accept-Encoding
    Content-Encoding: gzip
    Content-Length: 242
    Keep-Alive: timeout=15, max=100
    Connection: Keep-Alive
    Content-Type: text/html; charset=iso-8859-1

    <title>301 Moved Permanently</title>
    Moved Permanently
    The document has moved here (https://mysite.com/).
    Apache/2.2.16 (Debian) Server at mysite.com

2. When we view a list of external backlinks to our homepage, it appears that the backlinks built after we switched to the SSL homepage have been separated from the backlinks built before the SSL. Even on Open Site, we are only seeing the backlinks that were achieved before we switched to the SSL, and we are not able to track any backlinks that have been added after the SSL switch. This leads us to believe that the new links are not adding any value to our search rankings.

3. When viewing Google Webmaster, we receive no information about our homepage, only all the non-https pages. I added an https account to Google Webmaster, and in that version we ONLY receive the information about our homepage (and the other SSL page on the site).

What Is The Problem? My concern is that we need to do something specific with our sitemap or with the 301 redirect itself in order for Google to read the whole site as one entity and receive the reporting/backlinks as one site. Again, Google is indexing all of our pages, but it seems to be doing so in a disjointed way that is breaking down the link juice and value being built up by our SSL homepage. Can anybody help? Thank you for any advice and input you might be able to offer. -Greg
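As a point of comparison, a homepage-only http-to-https 301 on Apache is commonly written with mod_rewrite along these lines (a minimal sketch reusing the mysite.com placeholder from the fetch output above, not necessarily how this server is configured):

    # in the site's root .htaccess, with mod_rewrite enabled
    RewriteEngine On
    RewriteCond %{HTTPS} off
    # an empty pattern matches only the homepage in per-directory context
    RewriteRule ^$ https://mysite.com/ [R=301,L]

If the redirect itself checks out, the usual next step is making sure internal links, canonical tags, and the sitemap all reference one consistent set of URLs.
-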
Can I make 301 redirects on a Windows server (without access to IIS)?
Hey everyone, I've been trying to figure out a way to set up some 301 redirects to handle the broken links left behind after a site restructuring, but I can only ever find information on 2 methods that I can't use (as far as I can tell). The first method is to do some stuff with an htaccess file, but that looks like it only works on Linux-based servers. The method described for Windows servers is generally to install this IIS rewrite/redirect module and run that, but I don't think our web hosting company allows users to log directly into the server, so I wouldn't be able to use the IIS thing. Is there any other way to get a 301 redirect set up? And is this uncommon for a web hosting company to do, or do you all just run your sites on Linux-based servers or your own Windows machines? Thanks!
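One avenue worth checking: IIS 7 and later read configuration from a web.config file placed in the site's root folder, which can usually be uploaded over plain FTP without logging into the server. If the host has Microsoft's URL Rewrite module installed, a 301 can be declared like this (a sketch only; old-page and new-page are placeholder paths):

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
      <system.webServer>
        <rewrite>
          <rules>
            <rule name="Redirect old page" stopProcessing="true">
              <match url="^old-page$" />
              <action type="Redirect" url="/new-page/" redirectType="Permanent" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>
    </configuration>

If the module isn't available on the hosting plan, the host's support team can usually confirm what redirect options exist.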
-
HEADS UP - Did Google Grant WMT and GA admin access to your past employees or contractors?
Check your users and permissions in WMT and GA. I noticed that two Gmail accounts from a while back were given admin access to our accounts! That means someone who used to work for you could go in and remove your site from Google's index. Check your accounts, folks; just a heads up 😉 Here is an article talking about this potentially dangerous issue: http://thenextweb.com/google/2012/11/28/serious-google-security-glitch-gives-webmaster-tools-possibly-analytics-access-to-revoked-accounts
-
Googlebot Can't Access My Sites After I Repair My Robots File
Hello Mozzers, A colleague and I have been collectively managing about 12 brands for the past several months, and we have recently received a number of messages in the sites' Webmaster Tools instructing us that "Googlebot was not able to access our site due to some errors with our robots.txt file". My colleague and I, in turn, created new robots.txt files with the intention of preventing the spider from crawling our 'cgi-bin' directory, as follows:

    User-agent: *
    Disallow: /cgi-bin/

After creating the robots.txt and manually re-submitting it in Webmaster Tools (and receiving the green checkbox), I received the same message about Googlebot not being able to access the site, the only difference being that this time it was for a different site that I manage. I repeated the process, and everything looked correct aesthetically; however, I continued receiving these messages for each of the other sites I manage on a daily basis for roughly a 10-day period. Do any of you know why I may be receiving this error? Is it not possible for me to block the Googlebot from crawling the 'cgi-bin'? Any and all advice/insight is very much welcome; I hope I'm being descriptive enough!
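Two things are worth verifying in a situation like this: that the robots.txt file is actually reachable, and what status code it returns. A quick header check, with the real domain substituted for example.com:

    curl -I http://www.example.com/robots.txt

matters because Google treats a robots.txt that returns a server error (5xx) as a signal to stop crawling the site temporarily, which produces exactly this "Googlebot was not able to access your site" message even when the file's contents are fine.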
-
Can I Improve Organic Ranking by Restrict Website Access to Specific IP Address or Geo Location?
I am targeting my website at the US, so I need high organic rankings in US web search. One of my competitors is restricting website access to a specific IP address or geo location. I have checked multiple categories to learn more. What's going on with this restriction, and why have they set it up? One SEO forum also restricts website access by location. I can understand that; it may help them stop thread spamming with unnecessary sign-ups or Q&A. But why has Lamps Plus set this up? Is there any specific reason? Can I improve my organic ranking? Restriction may help me maintain user statistics in terms of bounce rate, average page views per visit, etc...
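For context, an IP or geo restriction of the kind described is typically a server-level rule; on Apache 2.4 it might look like this (a sketch only; 203.0.113.0/24 is a documentation-reserved placeholder range, not a real network):

    # allow only visitors from one network; everyone else receives a 403
    <Directory "/var/www/html">
        Require ip 203.0.113.0/24
    </Directory>

Worth noting: a rule like this returns 403 Access Denied to everyone outside the allowed range, including Googlebot unless its addresses are whitelisted, so it is far more likely to hurt crawling and rankings than to help them.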