Google Manual Penalty - Dilemma?
-
Hi Guys,
A while back, my company received a 'partial match' manual penalty from Google for 'unnatural links' pointing to our site.
This glorious feat was accomplished by our previous SEO agency, which heavily spammed links from directories and all kinds of low-quality sites.
That being said, when the penalty hit we really didn't see any drop in traffic. In fact, not long after the penalty we launched a new website, and since then our traffic has grown quite significantly: we've doubled our total visits compared to before the penalty.
That previous SEO agency also submitted a couple of reconsideration requests, both done loosely in an attempt to fool Google by only removing a small number of links, then a bit more the next time when it failed (this was obviously never going to work). Since then, I have submitted my own, very thorough reconsideration request: I disavowed 85 domains (every single one at the domain level rather than as individual URLs, as I didn't want to take any chances), and got a fair few links removed where webmasters responded. I documented all of this, including my multiple contacts with the webmasters, so I could show it to Google.
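(For anyone doing the same: Google's disavow file is just a plain-text list, one entry per line, where a `domain:` prefix disavows a whole domain rather than a single URL, and lines starting with `#` are comments. A minimal sketch of generating one at domain level; the domain names here are made up for illustration:)

```python
# Build a domain-level disavow file in the plain-text format
# Google's disavow tool expects: one entry per line, optional
# '#' comments, and a 'domain:' prefix to disavow the whole
# domain rather than a single URL.
spammy_domains = [
    "cheap-links-directory.example",
    "article-spinner-farm.example",
    "free-seo-directory.example",
]

lines = ["# Disavow file - unnatural links built by previous SEO agency"]
lines += [f"domain:{d}" for d in sorted(set(spammy_domains))]

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```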
This reconsideration request was not successful; Google made some new backlinks magically appear that I had not seen previously. But my main point is: am I going to do more damage by removing more and more links in order to lift the penalty? As it stands, we haven't actually noticed any negative effects from it. Perhaps that's because not long after the penalty we launched a much-improved site, which would naturally get more traffic than the old one, but overall the penalty's impact has barely been noticeable.
What do you guys think: is it worth risking a drop in rankings to remove the penalty so we don't face issues in the future, or should I go easy on the link removal to preserve our current rankings? (I'm really interested to see people's views on this, so please leave a comment if you can help!)
-
That's the problem... it's often hard to tell whether a link is natural. For example, a local directory listing might be fine, or it might be unnatural. If it helps, I wrote a Moz article that describes different kinds of unnatural links: http://moz.com/ugc/what-is-an-unnatural-link-an-in-depth-look-at-the-google-quality-guidelines
-
Thanks for your response, you've clarified a lot for me here.
Essentially, so long as only the unnatural links are removed, I should not harm my site's rankings? That is, so long as Google agrees on which links are the unnatural ones!
I'd better get to work auditing all of these links - see you again in a few years! haha.
-
"Google made some new backlinks magically appear that I had not seen previously."
This made me chuckle. Google is a strange animal. John Mueller has said many times that looking at your links in Webmaster Tools is enough, but I will often get back example unnatural links that are not in Webmaster Tools. This is one of the reasons why, when I do a backlink audit, I combine links from a number of different sources, including Open Site Explorer (OSE), Ahrefs, and Majestic.
Now, I have seen sites lift penalties using only their Webmaster Tools links, but really it's best to pull them from multiple sources.
BUT... even when I combine every possible source I can find, I will quite often get example links back from Google that don't exist in ANY backlink checker. These are tough, but usually they are clues that can help you find more links. For example, often when this happens the example given is a scraped version of a press release. What I'll do is take a chunk of its text, put it in quotes, and search for it on Google; often I'll find 3-4 additional links that weren't in my audit list.
Another thing you can do is re-download your links from GWT regularly, as new ones will often pop up even if the links themselves are years old.
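Combining those exports into one audit list is mostly a dedupe job. A rough sketch in Python, assuming each source's links have been saved as a plain text file of URLs (the file names and the naive domain-normalizing helper are illustrative, not from any tool's actual export format):

```python
from urllib.parse import urlparse

def root_domain(url: str) -> str:
    """Normalize a backlink URL to its bare host for deduping.
    (Naive: strips a leading 'www.' but not other subdomains.)"""
    host = urlparse(url.strip()).netloc.lower()
    return host[4:] if host.startswith("www.") else host

# One URL per line, exported from each backlink source
# (file names are made up for illustration).
exports = ["gwt_links.txt", "ose_links.txt",
           "ahrefs_links.txt", "majestic_links.txt"]

domains = set()
for path in exports:
    try:
        with open(path) as f:
            domains.update(root_domain(line) for line in f if line.strip())
    except FileNotFoundError:
        pass  # skip any source you don't have an export for

print(f"{len(domains)} unique linking domains to audit")
```

Working at the domain level like this keeps the audit list manageable and matches how you'd disavow at domain level anyway.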
Are you going to do more harm to your site than good? That depends on how good you are at auditing links. If you're only getting rid of unnatural links, then you won't hurt your site, and you may even see an improvement in rankings: either immediately, a few weeks after the penalty is lifted, or when Penguin refreshes. But if you're guessing at your disavow decisions, then yes, disavowing good links is going to harm your site.
Best of luck!
-
Keep doing what you're doing. As long as you know how to properly identify whether a site/link is good or bad, you shouldn't hurt your site. Better to do this work now and prevent another penalty in the future than to put it off.
RE: total backlinks - I recommend combining and deduping Open Site Explorer, Webmaster Tools, Majestic, and Ahrefs for the most thorough picture.
-
It will often take multiple reconsideration requests before Google removes a manual penalty; they want to see that you've put enough effort into cleaning up your link profile.
What tools did you use to find your links? It's best to use a combination of tools to find all of the possible links to your site. The number of links you remove or disavow is relative to the size of your link profile; some sites have had to remove or disavow thousands of domains.
Make sure the links you remove are the exact-match anchor text links and those from directories, guest blogging, and the like.
It's better to remove too many links than too few, as even having poor links will result in Google marking you down, and if you're not thorough enough there's every chance you could get penalized again in the future. Also, make sure your reconsideration request is clear and simple and clearly demonstrates the work you have done to remove or disavow the offending links.