Google Reconsideration - Denied for the Third Time
-
I have been trying to get past a "link scheme" penalty for just over a year. I took on the client in April 2012; they had received their penalty in February 2012, before I started.
Since then we have manually removed links, contacted webmasters for link removal, blocked over 40 different domains via the disavow tool, and requested reconsideration multiple times. All I get in return is "Site violates Google's quality guidelines."
So we regrouped and did more research, and found that about 90% of the offending spam links pointed to only three pages of the website. We decided to delete those pages, display a 404 error in their place, and create new pages with new URLs. At first everything looked good: the new pages were ranking and gaining page authority, and the old pages were gone from the index. So we submitted for reconsideration a third time and got the exact same response!
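(A minimal sketch of one way to serve those removals, assuming an Apache server with mod_alias; the paths are hypothetical. "Redirect gone" returns a 410 rather than a 404, an even more explicit signal that the pages were removed on purpose.)

```
# Hypothetical .htaccess rules (Apache mod_alias assumed).
# "Redirect gone" answers with 410 Gone for each deleted page:
Redirect gone /old-spammed-page-1.html
Redirect gone /old-spammed-page-2.html
Redirect gone /old-spammed-page-3.html
```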
I don't know what else to do. I have done everything I could think of, with the exception of deleting the whole site.
Any advice would be greatly appreciated.
Regards - Kyle
-
Kyle, interesting... I thought that wasn't possible. Do you have a lot of high-quality backlinks left, or did you start building new ones?
-
Hi Traian - I would have to disagree.
With the advice mentioned above from both Cyrus and Marie, I was able to get the penalty lifted, and - normal or not - my rankings and traffic have bounced back to exactly where they were before the penalty, if not higher.
-
When you remove those backlinks, you don't get to rank as you did before the penalty. Those bad links were propping you up, and now you need to work your way back up again - only this time way, way more carefully than before...
-
Good job, Kyle! I would take things one step further, though. Just disavowing the domains is likely not enough. If you haven't already done so, make sure you make efforts to manually get these links removed, and then communicate this to Google.
-
Wow, thank you so much for all your help, guys!
I just spent all day digging through the full link profile (combined from GWT and OSE), and I have more than doubled the domains in my disavow list. I have been surprised by the number of domains that have simply expired over the course of this project.
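(For anyone who wants to script this kind of merge, here is a rough Python sketch. The file names and column headers are assumptions - real GWT and OSE exports will differ - and the output is only a candidate list to review by hand, not a finished disavow file.)

```python
# Rough sketch: merge two backlink exports and list unique linking domains.
# File names and column headers ("From URL", "URL") are assumptions --
# check them against the actual GWT/OSE export files.
import csv
from urllib.parse import urlparse

def domains_from_csv(path, url_column):
    """Yield the bare host of every link URL in a CSV export."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            host = urlparse(row[url_column]).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host:
                yield host

domains = set(domains_from_csv("gwt_links.csv", "From URL"))
domains |= set(domains_from_csv("ose_links.csv", "URL"))

# Candidate list only -- review each domain by hand before
# adding it to the disavow file.
with open("candidate_domains.txt", "w", encoding="utf-8") as out:
    for d in sorted(domains):
        out.write(d + "\n")
```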
Also, thank you for the pointer on hosting the process files in Google Docs for the reconsideration - I was wondering where I would keep those!
I'll keep you all up to date, and I'd be glad to contribute to a blog post if you guys would like, too.
Thanks again - Kyle
-
Hi Kyle,
First of all, I can't wait for Marie's book! I don't want to cover ground that she's already gone over, so I'll just share a few thoughts.
1. Has Google verified it's a link penalty? "Site violates Google's quality guidelines" could also refer to on-site issues like hidden text or doorway pages. Given the information you provided, it's most likely a link-based penalty, but you never know.
2. Not sure from your description, but I almost always disavow entire domains using the domain: command instead of individual URLs. I've seen requests rejected because not enough URLs were disavowed when the entire domain should have been blocked. (See the example file just after this list.)
3. I agree with Marie. If you've been penalized, it's generally safer to err on the side of disavowing too many domains rather than too few. This isn't to say you should disavow known good links, but if links are questionable, why take the chance?
4. Also agree with Marie on submitting documentation about your removal efforts. Wrote a post about it here: http://cyrusshepard.com/penalty-lifted/
(They tend to like everything in Google Docs files - it cuts down the risk of spam.)
5. Minor point, but Google wants the disavow file formatted as a UTF-8 encoded .txt file. I've never seen one rejected because of this, but I hear it happens.
6. I'm turning into a fan of Link Detox for toxic link discovery. Instead of running it in standard mode, upload a file of your complete backlink profile from Webmaster Tools and have Link Detox check those links. Sort the final list by hand - this means check each link! For hints, read Paddy Moogan's post about low quality links: http://www.stateofsearch.com/step-by-step-guide-finding-low-quality-links/
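To make points 2 and 5 concrete, here's what a disavow file might look like - the domains below are placeholders, and the file should be saved as UTF-8 plain text:

```
# disavow.txt -- UTF-8 encoded plain text, one entry per line.
# Lines starting with "#" are comments and are ignored.
# Block an entire domain (usually the safer choice):
domain:spammy-directory.example
domain:expired-blog-network.example
# Or disavow just a single URL:
http://mostly-good-site.example/spammy-page.html
```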
Damn, we should turn this into a blog post!
Hope this helps!
-
You know what, Cyrus? I kid you not... this afternoon I suddenly had the thought that I should send you a copy once I've finished it. I know you have been involved in unnatural-link cleanups. I'll be in touch!
-
Hi Marie,
Please let us all know when you finish the book!
-
Hey Marie - thanks for the details, I will let you know what else I find!
-
Yes, definitely. These need to be removed if possible and disavowed.
-
I generally create a Google Docs spreadsheet with my links, with columns where I enter email addresses found on the site, WHOIS addresses, and the URL of the contact form. Then I have columns next to those for reference numbers. Those reference numbers point to a separate document containing the original source code of each email sent, as well as a document with screenshots of the contact forms submitted. It's a pain to do all of this, but I have been successful in every single reconsideration attempt using this method.
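(As a hypothetical illustration of that layout - every value below is invented:)

```
url, contact_email, whois_email, contact_form_url, email_ref, screenshot_ref
http://spammydir.example/links.html, admin@spammydir.example, owner@spammydir.example, http://spammydir.example/contact, E-014, S-014
```

Here E-014 would point to the saved email source and S-014 to the contact-form screenshot in the companion documents.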
If you're interested, I am 95% finished writing a book on the process that I use to get rid of unnatural links penalties. You can contact me via my profile and I can send you my almost finished book at a discounted price. It does include a link to an example of the spreadsheet that I use.
-
Do you think this type of microblogging URL would be considered spam:
http://olcine.com/index.php/steffcolbere
Should I disavow these sites as well?
-
Marie, thanks for the tip on the total number of domains disavowed; I am currently downloading my link profiles from OSE and GWT to look them over again.
As for communicating with Google, I haven't submitted a document saying whom I have contacted. How would you suggest documenting that? Do you have an example document to share?
-
Nick, what did your request look like when you got approved? Any specifics you can share?
-
Thanks for the pointers on the number of links in GWT; I will dig in deeper and look at the trends over the last few months. As for the reconsideration request, do you have any examples of what people submitted that got approved?
-
Hi Kyle. The process is frustrating, isn't it?
I have a few thoughts for you. You mentioned that you disavowed over 40 different domains. That doesn't sound like many. Many sites that I have worked on have had hundreds and hundreds of domains that needed disavowing. It's possible that you haven't identified enough links as unnatural. In other words, it may be that Google wants you to address some links that you think could be natural but actually do go against the quality guidelines.
I've also seen sites fail at reconsideration because the disavow file was not properly formatted.
How well did you communicate your attempts at link removal to Google? If you have contacted webmasters and failed to get links removed, you need to document those efforts thoroughly for Google.
-
I also agree with Highland: you have to submit a file that shows all your work.
-
We recovered from a manual link spam penalty about two weeks ago, after five months of cleanup. I had cleaned up about 85% of the links and submitted a reconsideration request, and three days later I got a message saying the MANUAL penalty was revoked.
Even though it's been two weeks, we still have not seen any improvement in the rankings. We are now working on getting quality links.
I felt the same way back in January but kept at it. So clean up more if you can and give it another shot.
Good luck - I feel your pain.
Nick
-
Make it a point to check Google WMT to see whether the number of external links is declining. If the number is rising or holding steady, I would check the disavow file to make sure you are indeed capturing all the spam domains.
I have found that a great reinclusion request can do the trick. The request should note what your wrongdoings were, what your remediation was, the time you spent, and your percentage of success. You should also apologize and promise to be good.
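(A hypothetical outline of such a request - all specifics below are invented for illustration:)

```
1. What went wrong: "A previous SEO firm built low-quality directory
   and article links on our behalf."
2. Remediation: "Over 5 months we contacted roughly 300 webmasters;
   about 40% of the links were removed, and the rest are disavowed."
3. Evidence: a link to a Google Docs spreadsheet documenting every
   contact attempt.
4. Going forward: an apology and a commitment to stay within the
   quality guidelines.
```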
-
Hmm. Well, the only other thing I could recommend would be reading this post on the matter. To summarize a bit, Google wants to see what you've done to fix the problem. Document what you've done and plead your case.
-
Highland, yes, we have used it quite heavily, submitting over 40 different FULL domains, not just URLs.
-
Have you tried the disavow links tool? I know many people who have fought manual penalties, and they say it's invaluable in getting rid of them.
-
I just don't think that is the right move. We still hold rankings for other pages; the penalty just seems to be keyword/page-specific somehow.
-
I know this may not be what you want to hear, but it might make sense to start over with a new domain and website. Being completely rid of the old site is a hard but necessary move.