Reconsideration Request a Success!
-
Hi all,
Well, I've finally been able to get the penalty removed, judging by this email:
"Dear site owner or webmaster of xxx,
We received a request from a site owner to reconsider xxx for compliance with Google's Webmaster Guidelines.
Previously the webspam team had taken manual action on your site because we believed it violated our quality guidelines. After reviewing your reconsideration request, we have revoked this manual action. It may take some time before our indexing and ranking systems are updated to reflect the new status of your site.
Of course, there may be other issues with your site that could affect its ranking without a manual action by the webspam team. Google's computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users. If your site continues to have trouble in our search results, please see this article for help with diagnosing the issue.
Thank you for helping us to maintain the quality of our search results.
Sincerely,
Google Search Quality Team"
This was after a reconsideration request was sent prior to the disavow tool being released. In addition, I also applied a disavow to all the links I was unsuccessful in removing, without contacting Google, and let the original reconsideration request run its course.
I am making this post just to let everyone know that the hard work pays off and that Google is just trying to make sure you are doing your best at removing the links. As 'Ryan Kent' always emphasizes, you must really be diligent and honest when trying to remove links. You also need to keep documentation: I recorded contact pages and email addresses, along with the dates of the 1st, 2nd, 3rd, and even 4th removal attempts.
Now with the disavow tool, I believe that if you make a good-faith effort at removing the links, and it is well documented, you can use the disavow tool after multiple attempts. Cross-referencing the disavowed links against the spreadsheet sent to Google is, and should be, very important in a reconsideration request.
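For anyone putting together the same kind of cleanup, here's a minimal sketch of what a disavow file can look like (the domains and URLs below are made-up examples, not from my actual submission):

```text
# Contacted site owner 2012-09-01, 2012-09-15, 2012-10-01; no reply
domain:spammy-directory-example.com

# Single page we could not get removed despite multiple attempts
http://blog-network-example.net/anchor-text-page.html
```

Lines starting with `domain:` disavow everything from that domain, bare URLs disavow a single page, and `#` lines are comments, which is a handy place to note your removal-attempt dates so the file lines up with your spreadsheet.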
Good luck!
Also, since I received the message from WMT, does anyone know how long 'some time' is before the site is reindexed? So far our organic traffic is still about the same as before. I would like to hear what others' experiences are after a successful reconsideration.
Feel free to ask any questions!
-
I sent a request on Oct 14th and got a response on Oct 18th. I've been removing links since around August.
Responses, whether denials or successes, seemed to take 1-2 weeks.
-
Interesting, but how long did you wait to be reconsidered?
-
Nice work! Hope you get back to the top quickly.
-
Yep, one of the things that surprised me was that previous reconsideration requests allowed me to communicate directly through email.
It was surprising to actually talk to someone from Google via email about a free service. The emails were pretty personal, and one even included an example link.
-
That's great to hear - and since it was a manual action that was taken, it's nice to know these are actually handled by real people, as Matt Cutts states here.
Andy
-
Great news William! Good job. I imagine the wait time is going to be at least whatever your normal crawl rate is.
Thanks for sharing your story.
Related Questions
-
Blocking Google from telemetry requests
At Magnet.me we track the items people are viewing in order to optimize our recommendations. As such, we fire POST requests back to our backends every few seconds once enough user-initiated actions have happened (think scrolling, for example). To keep bots from distorting statistics, we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests during its JavaScript crawling. In a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyhow, and it is quite a number, we considered reducing this for bots. However, we had several questions about this:
Technical SEO | rogier_slag
1. Do these requests count towards crawl budgets?
2. If they do, and we'd want to prevent this from happening: what would be the preferred option? Preventing the request in the frontend code, or blocking the request with a robots.txt line? We ask because an in-app block for the request could lead to different behaviour for users and bots, and maybe Google could penalize that as cloaking. The in-app block is also slightly less convenient from a development perspective, as the logic is spread throughout the application. I'm aware one should not cloak, or make pages appear differently to search engine crawlers. However, these requests do not change the page's behaviour at all; they purely send some anonymous data so we can improve future recommendations.
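If the robots.txt route were chosen, a sketch of what it could look like (the `/api/track` path is a made-up placeholder, not Magnet.me's actual endpoint):

```text
# robots.txt — hypothetical sketch; substitute the real telemetry endpoint path
User-agent: *
Disallow: /api/track
```

A robots.txt disallow applies to any URL a compliant crawler fetches, including XHR/POST requests made while rendering JavaScript, so this would stop Googlebot from hitting the endpoint without changing what real users see.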
Time to deindexing: WMT Request vs. Server not found
Google indexed some subdomains (13!) that were never supposed to exist, but apparently they returned a 200 code when Google somehow crawled them. I can get these subdomains to return a "server not found" error by turning off wildcard subdomains at my DNS. I've been told that these subdomains will be deindexed just from this server-not-found error. I was going to use Webmaster Tools and verify each domain, but I'm on an economy GoDaddy server and apparently subdomains just get forwarded to a directory, so subdomain.domain.com gets redirected to domain.com/subdomain. I'm not even sure, this being the case, whether I can get WMT to recognize and remove these subdomains like that. Should I fret about this, or will the "server not found" message get Google to remove them soon enough?
Technical SEO | erin_soc
When to re-submit for reconsideration?
Hi! We received a manual penalty notice. We had an SEO company a couple of years ago build some links for us on blogs. Currently we have only about 95 of these links which are pretty easily identifiable by the anchor text used and the blogs or directories they originate from. So far, we have seen about 35 of those removed and have made 2 contacts to each one via removeem.com. So, how many contacts do you think need to be made before submitting a reconsideration request? Is 2 enough? Also, should we use the disavow tool on these remaining 65 links? Every one of the remaining links is from either a filipino blog page or a random article directory. Finally, do you think we are still getting juice from these links? i.e. if we do remove or disavow these anchor text links are we actually going to see a negative impact? Thanks for your help and answers!! Craig
Technical SEO | TheCraig
I always get this error "We have detected that the domain or subfolder does not respond to web requests." I don't know why. PLEASE help
Subdomain: www.nwexterminating.com
Subfolders: www.nwexterminating.com/pest_control, www.nwexterminating.com/termite_services, www.nwexterminating.com/bed_bug_services
Technical SEO | NWExterminating
Site Recovered from hack, should I submit a reinclusion request?
Hello, the site I'm referring to is http://www.pokeronamac.com. It was hacked via something called the "WordPress Pharma Hack" http://theblawblog.wordpress.com/2012/06/21/restoring-a-pharma-hacked-wordpress-site-wp-3-4/ We restored it as far as I can tell, but if anyone can confirm this by doing a site search and not getting redirected, it would be appreciated. You will see that some search results still show up as spam, but when I click on them, they 404. I want to know if I should submit a reinclusion request; I wasn't notified by WMT of malware, so I want to know the SOP here. Thanks Zach
Technical SEO | Zachary_Russell
Pages not Indexed after a successful Google Fetch
I am trying to understand why google isn't indexing key content on my site. www.BeyondTransition.com is indexed and new pages show up in a couple of hours. My key content is 6 pages of information for each of 3000 events (driven by mySQL on a wordpress platform). These pages are reached via a search page, but no direct navigation from the home page. When I link to an event page from an indexed page it doesn't show up in search results. When I use fetch on webmaster tools the fetch is successful but is then not indexed - or if it does appear in results it's directed to the internal search page e.g. http://www.beyondtransition.com/site/races/course/race110003/ has been fetched and submitted with links but when I search for BeyondTransition Ironman Cozumel I get these results.... So what have I done wrong and how do I go about fixing it? All thoughts and advice appreciated Thanks Denis
Technical SEO | beyondtransition
Reconsideration Request
I've been cleaning up the back link profiles for a certain page on our site, my question is once I'm happy with the new link profile and I want to submit the URL for reconsideration can I submit just one URL or will Google take a look through the entire site?
Technical SEO | DanHill
Is the full URL necessary for successful Canonical Links?
Hi, my first question and hopefully an easy enough one to answer. Currently in the head element of our pages we have canonical references such as: (Yes, untidy URL...we are working on it!) I am just trying to find out whether this snippet of the full URL is adequate for canonicalization, or if the full domain is needed as well. My reason for asking is that the SEOmoz On-Page Optimization grading tool is 'failing' all our pages on the "Appropriate Use of Rel Canonical" value. I have been unable to find a definitive answer on this, although admittedly most examples do use the full URL. (I am not the site developer, so I cannot simply change this myself, but rather have to advise him in a weekly meeting). So in short, presumably using the full URL is best practice, but is it essential to its effectiveness when being read by the search engines? Or could there be another reason why the "Appropriate Use of Rel Canonical" value is not being green-ticked? Thank you very much, I appreciate any advice you can give.
Technical SEO | rmkjersey
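For reference, the generally recommended form uses an absolute URL, scheme and host included; a canonical tag would typically look like this (example.com is a placeholder domain, not the actual site):

```html
<!-- Absolute URL — the safer choice, since relative paths can be misresolved -->
<link rel="canonical" href="http://www.example.com/products/blue-widget" />
```

Relative canonical paths can technically work, but an absolute URL removes any ambiguity about protocol and subdomain, which is likely why most examples (and grading tools) expect it.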