Penalized and how to recover
-
Hey guys,
One of our sites was penalised a while ago. We used to rank on page 1 for our keywords, and now we can't be found. Is there a way to recover from this?
Also, I came across Rand's video about site architecture. We previously generated backlinks with spammy anchor text; what would your suggestion be to recover from that? Any ideas?
Here is the site http://free-love-psychic.com/
Any suggestions?
-
It looks like your robots.txt is missing. If you upload one and Google can read it, they will crawl your site. As the message says, further crawls of your site are "postponed" until Google can read a robots.txt file (which is odd; I've never seen this on sites without a robots.txt before).
I would suggest uploading and checking your robots.txt file, then requesting a recrawl through Webmaster Tools (WMT).
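If you're missing the file entirely, even a minimal allow-all robots.txt will stop these errors, since Google only needs a readable 200 response, not any particular rules. A basic sketch (the Sitemap line assumes your sitemap lives at the default location; adjust the rules to your own site):

```
User-agent: *
Disallow:

Sitemap: http://free-love-psychic.com/sitemap.xml
```

An empty `Disallow:` line means every crawler may fetch everything; add paths after `Disallow:` only for sections you actually want kept out of the index.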
-
Hi Matt,
Going through Google Webmaster Tools, I found that the messages sent to me are as follows:
Over the last 24 hours, Googlebot encountered 30 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.
You can see more details about these errors in Webmaster Tools.
Recommended action

If the site error rate is 100%:
- Using a web browser, attempt to access http://lovepsychicmary.com/robots.txt. If you are able to access it from your browser, then your site may be configured to deny access to Googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to Googlebot.
- If your robots.txt is a static page, verify that your web service has proper permissions to access the file.
- If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run. Check the logs for your website to see if your scripts are failing, and if so attempt to diagnose the cause of the failure.

If the site error rate is less than 100%:
- Using Webmaster Tools, find a day with a high error rate and examine the logs for your web server for that day. Look for errors accessing robots.txt in the logs for that day and fix the causes of those errors.
- The most likely explanation is that your site is overloaded. Contact your hosting provider and discuss reconfiguring your web server or adding more resources to your website.
- If your site redirects to another hostname, another possible explanation is that a URL on your site is redirecting to a hostname whose serving of its robots.txt file is exhibiting one or more of these issues.
After you've fixed the problem, use Fetch as Google to fetch http://lovepsychicmary.com/robots.txt to verify that Googlebot can properly access your site.
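Once the file is reachable in a browser, you can also sanity-check locally that the rules it serves don't block Googlebot. A minimal sketch using Python's standard-library urllib.robotparser; the robots.txt contents below are an assumed allow-all example, not your actual file:

```python
from urllib.robotparser import RobotFileParser

# The rules you expect http://lovepsychicmary.com/robots.txt to serve.
# This allow-all example lets every crawler fetch everything.
robots_txt = """\
User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# True: Googlebot may crawl the homepage under these rules.
print(parser.can_fetch("Googlebot", "http://lovepsychicmary.com/"))  # True

# By contrast, "Disallow: /" would block the whole site:
blocked = RobotFileParser()
blocked.parse(["User-agent: *", "Disallow: /"])
print(blocked.can_fetch("Googlebot", "http://lovepsychicmary.com/"))  # False
```

This only validates the rules themselves; a 100% fetch error rate like the one in the message above is a server or firewall problem, which this check cannot see.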
Learn more in our Help Center.

So I'm not too sure if this could be the issue?
-
There are a number of different penalties, so without specifics this can be hard to answer, but by far the most common we see is a partial link penalty. Basically, you built some "crap" links, so Google chooses to ignore them and not give you the value. This usually results in what looks like a full one-page, two-page, or five-plus-page drop (keywords that were on page 1 drop to 2 or 3, keywords on page 2 drop to page 5, etc.).
You would need a link clean-up, and possibly a disavow and reconsideration request, if you have this (or the full link-based manual penalty). If you don't have a message in Webmaster Tools, you may instead be dealing with Penguin, an "algorithmic penalty" (not a true penalty); if so, similar steps apply. In either case you will also need to find a way to keep gaining authority: a disavow and reconsideration request without authority building won't get you very far.
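For context, the disavow file Google accepts is just a plain-text list, one URL or domain per line, with `#` for comments. A hypothetical sketch; these hosts are placeholders, not real links anyone should disavow:

```
# Paid directory links; removal requests sent, owner did not respond
domain:spammy-directory.example.com
domain:cheap-links.example.com

# A single spammy guest post we could not get taken down
http://blog.example.net/great-psychic-reviews.html
```

`domain:` lines disavow every link from that host, while bare URLs disavow only that one page. The file is uploaded through the Disavow Links tool in Webmaster Tools.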
-
Hi Alec, recovering from a Google penalty is unfortunately not an easy process, and the Q&A section here isn't enough space to explain everything. I would suggest going over this article on Moz: it is a very thorough piece explaining every step you need to take to remove a manual penalty from your website. I hope this helps!
Related Questions
-
Positions dropped and do not recover
Hi all,

Site: https://www.fascinators.com.au
Keyword: fascinators (this is the main one I am looking at)

About 2 years ago, I redesigned the site. It used to be in the first positions in Google Australia. After the redesign, positions dropped and never recovered. No matter what I do, I cannot get them back. I even involved a professional SEO agency for the last 6 months to help build links, but nothing works. (The old site used a really old, non-mobile-friendly engine.)

As far as I know the site is technically sound with no issues. In the last few months the position for my main keyword 'fascinators' has jumped really wildly, from position 9 to not being in Google's first 100, and the jump can happen overnight. As of today it has disappeared from the first 100 again. I am really tearing my hair out over what could be wrong.

Link building is underway; this is one area I am aware of. Thinking of keyword spamming, I removed lots of 'fascinators' from URLs, but that did not seem to help much. Any ideas will be greatly appreciated!

Thanks, Rudolf
Intermediate & Advanced SEO | rudolfl1
-
Recovering from index problem
Hi all. For a while, we've been working on http://thewilddeckcompany.co.uk/. Everything was going swimmingly, and we had a top 5 ranking for the term 'bird hides' for this page: http://thewilddeckcompany.co.uk/products/bird-hides. Then disaster struck! The client added a link with a faulty parameter in the Joomla back end that caused a bunch of duplicate content issues.

Before this happened, all the site's 19 pages were indexed. Now it's just a handful, including the faulty URL (thewilddeckcompany.co.uk/index.php?id=13). This shows the issue pretty clearly: https://www.google.co.uk/search?q=site%3Athewilddeckcompany.co.uk&oq=site%3Athewilddeckcompany.co.uk&aqs=chrome..69i57j69i58.2178j0&sourceid=chrome&ie=UTF-8

I've removed the link, redirected the bad URL, updated the sitemap and got some new links pointing at the site to resolve the problem. Yet almost two months later, the bad URL is still showing in the SERPs and the indexing problem is still there. Any ideas? I'm stumped!
Intermediate & Advanced SEO | Blink-SEO0
-
Why Would This Old Page Be Penalized?
Here's an old page on a trustworthy domain with no apparent negative SEO activity according to OSE and ahrefs: http://www.gptours.com/Monaco-Grand-Prix They went from page 1 to page 13 for "monaco grand prix" within about 4 weeks. Week 2 we pulled out all the duplicate content in the history section. When rank slipped further, we put it back. Yet it's still moving down, while other pages on the website are holding strong. Next steps will be to add some schema.org/Event microformats, but beyond that, do you have any ideas?
Intermediate & Advanced SEO | stevewiideman0
-
Does google penalize for banner ads?
The client has banners on site 1 linking to site 2 and vice versa, but these links are auto-generated with each blog post (a new post means a new ad link). Is this a problem? It's not like he just has one banner on each site pointing back to the other; his ads automatically appear on every page where a blog post is published. Every time another blog post is created for site 1, another banner is also created linking to site 2. The client thinks it may be a big problem. Is it?
Intermediate & Advanced SEO | Ezpro90
-
Will Google penalize a site that had many links pointing to it with utm codes?
I want to track conversions using utm parameters from guest blog posts on sites other than my own site. Will Google penalize my site for having a bunch of external articles pointing to one page with unique anchor text but utm code? e.g. mysite.com/seo-text?utm_campaign=guest-blogs
Intermediate & Advanced SEO | wepayinc0
-
Need some urgent Panda advice. Open discussion about recovering from the Panda algorithm.
I have a site that has been affected by Panda, and I think I have finally found the problem. When I created this site in 2006, I bought content without checking it. Recently, when I went through the site, I found out that this content had many duplicates around the web. Not 100% exact, but close to it.

The first thing I did was ask my best writer to rewrite these topics, as they are a must on my site. She is a very experienced writer, and she will make the categories and subpages outstanding. The second thing I did was put a NOINDEX, FOLLOW robots meta tag in place for the pages I determined to be bad. They haven't been de-indexed yet. Another thing I recently did was separate out other languages and move them over to other domains (with 301s redirecting the old locations to the new). This means that the site now has an /en/ directory in the URL which is no longer used.

With this in mind I was thinking of relocating the NEW content, and 301ing the old (to preserve the juice for a while). For example: http://www.mysite.com/en/this-is-a-pandalized-page/ 301 to http://www.mysite.com/this-is-the-rewritten-page/

The benefits of doing this are: decreasing the number of directories in the URL, getting rid of pages that are possibly causing trouble, and getting fresh pages added to the site.

Now, the advice I am looking for is basically this: Do you agree with the above? Or don't you agree? If you don't, please be so kind as to include a reason with your answer. If you do, and have any additional information, or would like to discuss, please go ahead 🙂

Thanks, Giorgio

PS: Is it proven that Panda is now a running update? Or is it still periodically executed?
Intermediate & Advanced SEO | VisualSense1
-
Examples of sites other than Hubpages that have used subdomains to recover from Panda?
Everyone knows subdomains worked for Hubpages to recover from Panda. Does anyone know of other examples of sites that have recovered from Panda using subdomains?
Intermediate & Advanced SEO | nicole.healthline0
-
Do sites with a small number of content pages get penalized by Google?
If my site has just five content pages, instead of 25 or 50, will it get penalized by Google for a given moderately competitive keyword?
Intermediate & Advanced SEO | RightDirection0