Google Reconsideration - Denied for the Third Time
-
I have been trying to get past a "link scheme" penalty for just over a year. I took on the client in April 2012; they had received their penalty in February 2012, before I started.
Since then we have been manually removing links, contacting webmasters for link removal, blocking over 40 different domains via the disavow tool, and requesting reconsideration multiple times. All I get in return is "Site violates Google's quality guidelines."
So we regrouped and did some more research, and found that about 90% of the offending spam links pointed to only 3 pages of the website. We decided to just delete those pages, serve a 404 error in their place, and create new pages with new URLs. At first everything was looking good: the new pages were ranking and gaining page authority, and the old pages were gone from the index. So we resubmitted for reconsideration for the third time and got the exact same response!
I don't know what else to do. I have done everything I could think of, short of deleting the whole site.
Any advice would be greatly appreciated.
Regards - Kyle
-
Kyle, interesting... I thought that wasn't possible. Do you have a lot of high-quality backlinks left, or did you start building new ones?
-
Hi Traian - I would have to disagree.
With the advice mentioned above from both Cyrus and Marie, I was able to get the penalty lifted, and whether it was normal or not, my rankings and traffic have bounced back to exactly where they were before the penalty, if not higher.
-
When you remove those backlinks, you don't get to rank as you did before the penalty. Those bad links were keeping you up, and now you need to work your way up again, only this time way, way more carefully than before...
-
Good job Kyle! I would take things one step further though. It is likely not enough to just disavow the domains. If you haven't already done so, make sure that you make efforts to manually get these links removed and then communicate this to Google.
-
Wow, thank you so much for all of your help, guys!
I just spent all day digging through the full link profile (combined from GWT and OSE), and I have more than doubled the domains in my disavow list. I have been surprised by the number of domains that have simply expired over the course of this project.
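For anyone combining exports the same way, here is a rough Python sketch of collapsing a merged backlink list into domain-level disavow entries. The CSV handling and the "URL" column name are assumptions about the export format, and the www-stripping is naive; a production version would use the public-suffix list to find the registrable domain.

```python
import csv
from urllib.parse import urlparse

def disavow_lines(urls):
    """Collapse a list of backlink URLs into unique 'domain:' disavow entries."""
    domains = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        # Naive normalization: treat www.example.com and example.com as one domain
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    return ["domain:" + d for d in sorted(domains)]

def load_urls(path, column="URL"):
    """Read one exported CSV (e.g. from GWT or OSE); column name is an assumption."""
    with open(path, newline="", encoding="utf-8") as f:
        return [row[column] for row in csv.DictReader(f) if row.get(column)]
```

Feeding both exports through `load_urls` and concatenating the lists before calling `disavow_lines` deduplicates everything in one pass.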
Also, thank you for the pointer on hosting the process files on Google Docs for the reconsideration; I was wondering where I would keep those!
I'll keep you all up to date, and I'd be happy to contribute to a blog post if you'd like, too.
Thanks again - Kyle
-
Hi Kyle,
First of all, I can't wait for Marie's book! I don't want to retread any ground she's already covered, so I'll just share a few thoughts.
1. Has Google verified it's a link penalty? "Site violates Google's quality guidelines" could also refer to on-site issues like hidden text or doorway pages. Given the information you provided, it's most likely a link based penalty, but you never know.
2. Not sure from your description, but I almost always disavow entire domains using the domain: directive instead of individual URLs. I've seen requests rejected because not enough individual URLs were disavowed when the entire domain should have been blocked.
3. I agree with Marie. If you've been penalized, it's generally safer to err on the side of disavowing too many domains rather than too few. This isn't to say you should disavow known good links, but if links are questionable, why take a chance?
4. Also agree with Marie on submitting documentation about your removal efforts. Wrote a post about it here: http://cyrusshepard.com/penalty-lifted/
(They tend to like everything in Google Docs files; it cuts down the risk of spam.)
5. Minor point, but Google likes everything formatted in a UTF-8 encoded .txt file. I've never seen one rejected because of this, but I hear it happens.
6. I'm turning into a fan of Link Detox for toxic link discovery. Instead of running it in standard mode, upload a file of your complete backlink profile from Webmaster Tools and have Link Detox check those links. Sort the final list by hand - this means check each link! For hints, read Paddy Moogan's post about low quality links: http://www.stateofsearch.com/step-by-step-guide-finding-low-quality-links/
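Putting points 2 and 5 together: a disavow file is just a UTF-8 plain-text file, one entry per line, with `#` comments allowed. The domains and dates below are placeholders, not anything from Kyle's actual list:

```text
# Contacted webmaster on 2013-03-01 and 2013-03-15; no response
domain:spammy-directory.example
domain:paid-links.example
# Single bad URL; the rest of this site looks legitimate
http://blog.example/low-quality-post.html
```

Using `domain:` lines rather than individual URLs covers every page on the offending site, including ones that weren't in your backlink export.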
Damn, we should turn this into a blog post!
Hope this helps!
-
You know what, Cyrus? I kid you not... this afternoon I suddenly got the thought that I should send you a copy once I've finished it. I know you have been involved in unnatural link cleanup. I'll be in touch!
-
Hi Marie,
Please let us all know when you finish the book!
-
Hey Marie - thanks for the details, I will let you know what else I find!
-
Yes, definitely. These need to be removed if possible and disavowed.
-
I generally create a Google Docs spreadsheet with my links, with columns where I enter email addresses found on the site, WHOIS addresses, and the URL of the contact form. Then I have columns next to those for reference numbers. Those reference numbers refer to a separate document containing the original source code of each email sent, as well as a document with screenshots of the contact forms submitted. It's a pain to do all of this, but I have been successful in every single reconsideration attempt using this method.
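For anyone wanting to replicate that setup, here is a minimal sketch of seeding such a tracking sheet as a CSV before importing it into Google Docs. The column names follow the description above; the function name and file paths are made up for illustration.

```python
import csv

# Columns per the workflow described above: contact info for each link,
# plus reference numbers into the email-source and screenshot documents.
COLUMNS = ["Link URL", "Email on site", "WHOIS email", "Contact form URL",
           "Email ref #", "Screenshot ref #"]

def write_tracking_sheet(path, links):
    """Write one row per backlink, with empty cells to fill in as outreach happens."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        for link in links:
            writer.writerow({**{c: "" for c in COLUMNS}, "Link URL": link})
```

The point of the reference-number columns is that the spreadsheet stays compact while the bulky evidence (email source, screenshots) lives in separate documents you can link from the reconsideration request.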
If you're interested, I am 95% finished writing a book on the process that I use to get rid of unnatural links penalties. You can contact me via my profile and I can send you my almost finished book at a discounted price. It does include a link to an example of the spreadsheet that I use.
-
Do you think this type of microblogging URL would be considered spam:
http://olcine.com/index.php/steffcolbere
Should I disavow these sites as well?
-
Marie, thanks for the tip on the total number of domains disavowed; I am currently downloading my link profiles from OSE and GWT to look them over again.
As for communicating, I haven't submitted a document saying whom I have contacted. How would you suggest documenting that? Do you have an example document to share?
-
Nick, what did your request look like when you got approved? Any specifics you can share?
-
Thanks for the pointers on the number of links in GWT; I will dig in deeper and look at the trends over the last few months. As for the reconsideration request, do you have any examples of what people submitted that got approved?
-
Hi Kyle. The process is frustrating, isn't it?
I have a few thoughts for you. You mentioned that you disavowed over 40 different domains. That doesn't sound like many. Many sites that I have worked on have had hundreds and hundreds of domains that needed disavowing. It's possible that you haven't identified enough links as unnatural. In other words, it may be that Google wants you to address some links that you think could be natural but actually do go against the quality guidelines.
I've also seen sites fail at reconsideration because the disavow file was not properly formatted.
How well did you communicate your attempts at link removal to Google? If you have contacted webmasters and failed to get links removed then you need to document that well to Google.
-
I also agree with highland, you have to submit a file that shows all your work.
-
We recovered from a manual link spam penalty about 2 weeks ago, after 5 months of cleaning things up. I cleaned up about 85% of the links and submitted a reconsideration request, and 3 days later I got a message saying the MANUAL penalty had been revoked.
Even though it's been 2 weeks, we still have not seen any improvement in the rankings. We are now working on getting quality links.
I felt the same way back in January but kept at it. So clean up more if you can and give it another shot.
Good luck; I feel your pain.
Nick
-
Make it a point to check Google WMT to see if the number of external links is declining. If the number is rising or staying constant, I would check the disavow file to make sure you are indeed capturing all of the spam domains.
I have found that a great reconsideration request can do the trick. The request should note what your wrongdoings were, what your remediation was, the time spent, and the percentage of success. You should also apologize and promise to be good going forward.
-
Hmm. Well, the only other thing I could recommend would be reading this post on the matter. To summarize a bit, Google wants to see what you've done to fix the problem. Document what you've done and plead your case.
-
Highland, yes, we have used it quite heavily, submitting over 40 different FULL domains, not just URLs.
-
Have you tried the disavow links tool? I know many people who have fought manual penalties and they have expressed that it's invaluable in getting rid of them.
-
I just don't think that is the right move; we still hold rankings for other pages. It seems to be keyword/page specific somehow.
-
I know this may not be what you want to hear but it might make sense to start over. New domain and website. To be completely rid of the old site is a hard but necessary move.