Access Denied
-
Our website, which had been ranking at number 1 in Google.co.uk for our two main search terms for over three years, was hacked last November. We rebuilt the site but slipped down to number 4. We were hacked again two weeks ago and are now at number 7.
I realise that this drop may not be just a result of the hacking, but it can't have helped.
I've just accessed our Google Webmaster Tools account, and these are the current results:
940 Access Denied Errors
197 Not Found
The 940 Access Denied errors all apply to WordPress blog pages.
Is it likely that the hacking caused the Access Denied errors, and is there a clear way to repair them?
Any advice would be very welcome.
Thanks,
Colin
-
Glad I could help!
-
Thanks so much Jason. I'll take a look right now and be back with a Thumbs Up I'm sure.
Colin
-
Hi Nile
I've found this for you: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2409441
It summarizes the reasons Access Denied errors show up.
My initial thought was that your robots.txt file was maliciously updated to block many of your pages.
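If that is the case, it is quick to confirm: fetch the live robots.txt and test whether the affected blog URLs are blocked. Below is a minimal sketch using Python's standard-library robots.txt parser; the rules and URLs are hypothetical stand-ins for what a tampered file might contain.

```python
from urllib import robotparser

# Hypothetical example of a robots.txt after tampering: a broad
# Disallow rule silently blocks the whole blog from all crawlers.
tampered_robots_txt = """\
User-agent: *
Disallow: /blog/
Disallow: /wp-content/
"""

rp = robotparser.RobotFileParser()
rp.parse(tampered_robots_txt.splitlines())

# Spot-check a few URLs (hypothetical paths on the affected site).
for url in ("http://example.com/blog/some-post/",
            "http://example.com/about/"):
    allowed = rp.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "BLOCKED")
```

Running the same check against the real robots.txt (and comparing it to a known-good copy from version control or a backup) would show at once whether the file was altered.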
-
Hi Jason,
I can visit those pages in my browser.
Colin
-
A little more information needed here... can you visit those pages in your browser? If not, what error shows up?
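For context while checking: Webmaster Tools buckets crawl errors by the HTTP status code the page returns, and "Access Denied" generally corresponds to a 401 or 403 response, while "Not Found" corresponds to a 404 or 410. A rough sketch of that mapping (simplified, not Google's exact internal logic):

```python
def classify_crawl_error(status_code):
    """Map an HTTP status code to the bucket Google Webmaster Tools
    reports it under (simplified sketch, not Google's exact logic)."""
    if status_code in (401, 403):
        return "Access Denied"
    if status_code in (404, 410):
        return "Not Found"
    if 500 <= status_code <= 599:
        return "Server Error"
    return "Other"

# The two error types from the report in the question:
print(classify_crawl_error(403))  # Access Denied
print(classify_crawl_error(404))  # Not Found
```

So if the pages load fine in a normal browser but Googlebot sees 403s, the hack (or the cleanup after it) may have left behind rules, for example in .htaccess, that block requests by user agent or IP.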
Related Questions
-
Captcha wall to access content and cloaking sanction
Hello, to protect our website against scraping, visitors are redirected to a reCAPTCHA page after visiting two pages. But for SEO purposes, Googlebot is excluded from that restriction, so it could be seen as cloaking. What is the best practice in SEO to avoid a penalty for cloaking in this case?
Intermediate & Advanced SEO | | clementjaunault
I have thought about adding a paywall JSON schema (NewsArticle), but the content is accessible for free, so it's not really a paywall, more a captcha protection wall. What do you recommend?
Thanks.
-
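On the captcha-wall question above: the commonly recommended way to exempt Googlebot without user-agent cloaking risk is to verify crawler IPs with a reverse-then-forward DNS lookup rather than trusting the User-Agent header. A hedged sketch of that check (the hostnames in the comments are illustrative):

```python
import socket

# Per Google's documented verification method, a genuine Googlebot's
# reverse-DNS hostname ends in googlebot.com or google.com.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_google(hostname):
    """Check a reverse-DNS hostname against Google's crawler domains.
    e.g. crawl-66-249-66-1.googlebot.com passes; evil.net does not."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip):
    """Reverse DNS on the IP, check the domain, then forward-confirm
    the hostname resolves back to the same IP (prevents spoofing)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    if not hostname_is_google(hostname):
        return False
    try:
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.gaierror:
        return False
```

Serving verified crawlers the same content real visitors eventually get (just without the rate-limit interstitial) is generally not treated as cloaking, but this is a sketch of the verification step, not legal cover.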
Site migration / CMS / domain / site structure change, with no access to Search Console
Hi everyone, We are migrating an old site under a bigger umbrella (our main domain). As mentioned in the title, we'll perform a CMS migration, a domain change, and a site structure change. The major problem is that we can't get into Google Search Console for the old site. The site still has the old GA code, so Search Console verification via that method is not possible, and there is no way the developers will be able to add GTM or edit DNS settings (not to bother you with the reason why). My dilemma:
1. Do we need access to the old site's Search Console to notify Google about the domain name change, or can this be done from the Search Console of our main site (which the old site will become part of)?
2. We are setting up 301 redirects from the old to the new domain (not a perfect 1:1 redirect). Once the migration is done, does anything else need to be done with the old domain (it will become obsolete)?
3. For the main site's sitemap, should I create a new sitemap with the newly added pages, or update the current one?
4. If you have anything else, please add. :) Thank you!
Intermediate & Advanced SEO | | bgvsiteadmin
-
Video Accessibility and SEO
How do you implement video metadata, closed captioning, and transcripts to ensure both search engines and screen readers can crawl/read them? For example, for a mostly text-based video with a simple audio track, hosted on Brightcove and embedded into our site, we want to make sure that 1) Google can crawl the text in the video and 2) a vision-impaired viewer would be able to use a screen reader to hear the text in the video.
Intermediate & Advanced SEO | | elmorse
-
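For the video question above, one approach that serves both audiences is to put the transcript in the page itself (screen readers and crawlers both read it) and describe the video with schema.org VideoObject markup in a JSON-LD script tag. A sketch with hypothetical placeholder values; the field choice is illustrative, not Brightcove-specific:

```python
import json

# Hedged sketch of schema.org VideoObject markup; every value below
# is a hypothetical placeholder for the Brightcove-hosted video.
video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Example product walkthrough",
    "description": "Text-heavy walkthrough with a simple audio track.",
    "thumbnailUrl": "https://example.com/thumb.jpg",
    "uploadDate": "2014-01-15",
    "contentUrl": "https://example.com/video.mp4",
    "transcript": "Full text of the on-screen copy and narration...",
}

# Wrap it in the script tag that would go in the page's HTML.
html_snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(video_markup, indent=2)
    + "\n</script>"
)
print(html_snippet)
```

The generated snippet goes in the page near the embed; the same transcript text, rendered visibly or in an expandable section, covers the screen-reader requirement.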
Optimizing A Homepage URL That Is Only Accessible To Logged In Users
I have a client with a very old site that has lots and lots of links to it. The site offers www.examplesite.com/loggedin as the homepage to logged-in users. So, once you're logged in, you can't get back to examplesite.com any more (unless you log out) and are instead given /loggedin as your new personalized homepage. The problem is that many users over time linked to the page they saw after they signed up and were logged in: www.examplesite.com/loggedin. So, there are all these inbound links going to a page that is inaccessible to non-logged-in users, thus linking to nowheresville. One idea is to fire off a 301 to non-logged-in users, forwarding them to the homepage, thus capturing much of that stranded link juice. Honestly, I'm not 100% sure you can fire off a server response code conditioned on whether the visitor is logged in or not. I imagine you can, but don't know that for a technical fact. Another idea is to offer some content on /loggedin, which is currently mostly blank except for an offer to sign in. Which do you think is better, and why? Thanks... Mike
Intermediate & Advanced SEO | | 94501
-
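On the question above: yes, a response code can be conditioned on login state server-side, since login is usually just a session-cookie check. A minimal standard-library WSGI sketch of the idea; the cookie name and paths are hypothetical:

```python
# Sketch: non-logged-in visitors hitting /loggedin get a 301 to the
# homepage, so the stranded inbound links point at a reachable page;
# logged-in visitors (identified by a hypothetical session cookie)
# still see their personalized homepage.
def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    logged_in = "session_id=" in environ.get("HTTP_COOKIE", "")
    if path == "/loggedin" and not logged_in:
        start_response("301 Moved Permanently", [("Location", "/")])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"page content"]

# Quick demonstration with fake WSGI environs (no server needed).
def call(path, cookie):
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
        captured["headers"] = dict(headers)
    app({"PATH_INFO": path, "HTTP_COOKIE": cookie}, start_response)
    return captured

print(call("/loggedin", "")["status"])                # 301 Moved Permanently
print(call("/loggedin", "session_id=abc")["status"])  # 200 OK
```

The same conditional is expressible in any server stack (PHP, .htaccess rewrite conditions on a cookie, etc.); the WSGI form is just the most compact way to show the logic.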
Https Homepage Redirect & Issue with Googlebot Access
Hi All, I have a question about Google correctly accessing a site that has a 301 redirect to https on the homepage. Here's an overview of the situation, and I'd really appreciate any insight from the community on what the issue might be.
Intermediate & Advanced SEO | | G.Anderson
Background info: My homepage is set up as a 301 redirect to an https version of the homepage (some users log in, so we need the SSL). Only 2 pages on the site are under SSL; the rest of the site is http. We switched to SSL in July but have not seen any change in our rankings despite efforts to increase backlinks and output of content. Even though Google has indexed the SSL page of the site, it appears that it is not linking up the SSL page with the rest of the site in its search and tracking.
Why do we think this is the case? The diagnosis:
1) When we do a Google Fetch on our http homepage, it appears that Google is only reading the 301 redirect instructions (as shown below) and is not finding its way over to the SSL page, which has all the correct page title and meta information.
HTTP/1.1 301 Moved Permanently
Date: Fri, 08 Nov 2013 17:26:24 GMT
Server: Apache/2.2.16 (Debian)
Location: https://mysite.com/
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 242
Keep-Alive: timeout=15, max=100
Connection: Keep-Alive
Content-Type: text/html; charset=iso-8859-1
301 Moved Permanently: The document has moved to https://mysite.com/ (Apache/2.2.16 (Debian) Server at mysite.com)
2) When we view a list of external backlinks to our homepage, it appears that backlinks built after we switched to the SSL homepage have been separated from backlinks built before the SSL. Even in Open Site Explorer, we only see the backlinks that were earned before we switched to SSL, and we cannot track any backlinks added after the switch. This leads us to believe that the new links are not adding any value to our search rankings.
3) In Google Webmaster Tools, we receive no information about our homepage, only about the non-https pages. I added an https account to Google Webmaster Tools, and in that version we ONLY receive information about our homepage (and the other SSL page on the site).
What is the problem? My concern is that we need to do something specific with our sitemap or with the 301 redirect itself in order for Google to read the whole site as one entity and report the backlinks as one site. Again, Google is indexing all of our pages, but it seems to be doing so in a disjointed way that is breaking down the link juice and value being built up by our SSL homepage. Can anybody help? Thank you for any advice or input you might be able to offer. -Greg
-
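One quick sanity check for the redirect question above is to confirm that the homepage answers with a single 301 hop whose Location header points at the canonical https URL, since that status line and header are all the Google Fetch output shows the crawler. A small parsing sketch over a raw response like the one quoted (abridged headers, hypothetical host):

```python
def parse_redirect(raw_response):
    """Pull the status code and Location header out of a raw HTTP
    response, e.g. the text a Google Fetch or `curl -I` returns."""
    lines = raw_response.splitlines()
    _, code, _ = lines[0].split(" ", 2)  # e.g. HTTP/1.1 301 Moved Permanently
    location = None
    for line in lines[1:]:
        if not line.strip():
            break  # blank line ends the header block
        name, _, value = line.partition(":")
        if name.strip().lower() == "location":
            location = value.strip()
    return int(code), location

# Abridged version of the response quoted in the question.
raw = (
    "HTTP/1.1 301 Moved Permanently\r\n"
    "Server: Apache/2.2.16 (Debian)\r\n"
    "Location: https://mysite.com/\r\n"
    "\r\n"
)
print(parse_redirect(raw))  # (301, 'https://mysite.com/')
```

If the hop count is one and the Location is the exact https homepage, the redirect itself is fine, and the disjointed reporting is more likely the http/https properties being tracked separately in Webmaster Tools.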
How to Explain Admin Access To Client Who Denies It to Anyone Inside or Outside?
Hi, I am on a new project for a great prospective client. After a company split, the IT dept says, "Our IT infrastructure team doesn't provide admin access to anyone (internal or external) to any of our servers/sites, for security, maintainability, and quality reasons. We would be more than happy to install and configure any software you would like and provide you read access to any output you might need." I am not aware of any such software, or of what help "read only access" is in getting tasks accomplished. Tomorrow a.m. I meet with four others on the project to whom I am to assign SEO tasks. Everyone wonders how to accomplish marketing, web redesign, and SEO tasks efficiently. They are told to drop off new content on a backup CD. My contact has asked that I be permitted access... what can one do when IT says "no"?
Intermediate & Advanced SEO | | jessential
-
Google Reconsideration - Denied for the Third Time
I have been trying to get past a "link scheme" penalty for just over a year. I took on the client in April 2012; they had received their penalty in February 2012, before I started. Since then we have manually removed links, contacted webmasters for link removal, blocked over 40 different domains via the disavow tool, and requested reconsideration multiple times. All I get in return is "Site violates Google's quality guidelines." So we regrouped and did some more research, and found that about 90% of the offending spam links pointed to only 3 pages of the website, so we decided to delete those pages, serve a 404 error in their place, and create new pages with new URLs. At first everything was looking good: the new pages were ranking and receiving page authority, and the old pages were gone from the indexes. So we submitted for reconsideration a third time and got the exact same response! I don't know what else to do. I have done everything I could think of, with the exception of deleting the whole site. Any advice would be greatly appreciated. Regards - Kyle
Intermediate & Advanced SEO | | kchandler
-
Can I Improve Organic Ranking by Restricting Website Access to a Specific IP Address or Geo Location?
I am targeting my website at the US, so I need high organic ranking in US web search. One of my competitors is restricting website access to specific IP addresses or geo locations. I have checked multiple categories to learn more. What's going on with this restriction, and why did they set it up? One SEO forum also restricts website access by location. I can understand that: it may help them stop thread spamming with unnecessary sign-ups or Q&A. But why has Lamps Plus set this? Is there a specific reason? Can I improve my organic ranking this way? Restriction may help me maintain user statistics such as bounce rate, average page views per visit, etc.
Intermediate & Advanced SEO | | CommercePundit