Reconsideration Request a Success!
-
Hi all,
Well, I've finally been able to get the penalty removed, judging by this email:
"Dear site owner or webmaster of xxx,
We received a request from a site owner to reconsider xxx for compliance with Google's Webmaster Guidelines.
Previously the webspam team had taken manual action on your site because we believed it violated our quality guidelines. After reviewing your reconsideration request, we have revoked this manual action. It may take some time before our indexing and ranking systems are updated to reflect the new status of your site.
Of course, there may be other issues with your site that could affect its ranking without a manual action by the webspam team. Google's computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users. If your site continues to have trouble in our search results, please see this article for help with diagnosing the issue.
Thank you for helping us to maintain the quality of our search results.
Sincerely,
Google Search Quality Team"
This was after a reconsideration request was sent prior to the disavow tool being released. In addition, I applied a disavow of all the links I was unsuccessful in removing, without contacting Google, letting the original reconsideration request run its course.
I am making this post just to let everyone know that hard work pays off, and that Google is just trying to make sure you are doing your best in removing the links. As 'Ryan Kent' always emphasizes, you must really be diligent and honest when trying to remove links. You also need to keep documentation: I recorded contact pages, email addresses, and the dates of my 1st, 2nd, 3rd, and even 4th removal attempts.
Now with the disavow tool, I believe that if you make a good-faith effort to remove the links, and it is well documented, you can use the disavow tool after multiple attempts. Correlating the disavowed links with the spreadsheet you send to Google is, and should be, very important in a reconsideration request.
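For anyone heading down the same path: the disavow file is just a plain-text list, one URL or `domain:` directive per line, with `#` comment lines ignored. A minimal sketch (the domains are placeholders, and the comments mirror the kind of attempt documentation described above):

```text
# Removal requests sent Sep 5, Sep 19, Oct 1 - no response (see spreadsheet rows 12-14)
domain:spammy-directory-example.com
domain:link-network-example.net
# Single page rather than a whole domain
http://blog-example.org/paid-links-post.html
```

Keeping these comments in sync with the spreadsheet you send to Google makes the correlation between the two easy for a reviewer to verify.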
Good luck!
Also, since I received the message in WMT, does anyone know how long 'some time' is before the site is reindexed? So far our organic traffic is still about the same as before, so I would like to hear about others' experiences after a successful reconsideration.
Feel free to ask any questions!
-
I sent a request on Oct 14th and got a response on Oct 18th. I'd been removing links since around August.
Responses, whether denials or successes, seemed to take 1-2 weeks.
-
Interesting, but how long did you have to wait for the reconsideration?
-
Nice work! Hope you get back to the top quickly.
-
Yep. One of the things that surprised me about previous reconsideration requests was being able to communicate directly through email.
It was surprising to actually talk to someone from Google via email about a free service. The emails were pretty personal, and one even included an example link.
-
That's great to hear - and since it was a manual action, it's nice to know these are actually handled by real people, as Matt Cutts states here.
Andy
-
Great news William! Good job. I imagine the wait time is going to be at least whatever your normal crawl rate is.
Thanks for sharing your story.
Related Questions
-
Blocking Google from telemetry requests
At Magnet.me we track the items people are viewing in order to optimize our recommendations. As such, we fire POST requests back to our backends every few seconds once enough user-initiated actions have happened (think of scrolling, for example). To keep bots from distorting statistics, we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests in its JavaScript crawling. Over a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyhow, and it is quite a number, we considered reducing this for bots. We have several questions about this:
Technical SEO | | rogier_slag
1. Do these requests count toward crawl budgets?
2. If they do, and we want to prevent them: what would be the preferred option, preventing the request in the frontend code, or blocking the request with a robots.txt line? The concern with an in-app block is that it could lead to different behaviour for users and bots, and Google might penalize that as cloaking. It is also slightly less convenient from a development perspective, as the logic is spread throughout the application. I'm aware one should not cloak or make pages appear differently to search engine crawlers. However, these requests do not change the page's behaviour at all; they purely send some anonymous data so we can improve future recommendations.
-
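As a sketch of the robots.txt option, Python's standard `urllib.robotparser` can sanity-check the rule before deployment (the `/api/track` endpoint path is a hypothetical stand-in for the real telemetry URL):

```python
from urllib import robotparser

# Hypothetical robots.txt rule blocking a telemetry endpoint for all crawlers
robots_txt = """User-agent: *
Disallow: /api/track
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot is barred from the telemetry endpoint...
print(rp.can_fetch("Googlebot", "https://www.magnet.me/api/track"))  # False
# ...while normal pages stay crawlable.
print(rp.can_fetch("Googlebot", "https://www.magnet.me/jobs"))  # True
```

Note this only stops compliant crawlers from requesting the endpoint; it doesn't answer whether the in-app alternative would be treated as cloaking.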
Google rejected my reconsideration request for an unnatural-links manual action, and listed one blog article twice as an example?
Hi Moz Community, On April 22 my site received a manual action in Google Webmaster Tools telling me it was caused by unnatural links. After a deep cleaning of all the sitewide links, which I think were the major problem with my external links, I submitted a reconsideration request on May 4. Google rejected it on May 29, and listed one blog article twice as an example, which is quite weird to me. Is it normal for Google to list one URL twice as an example in the feedback? I don't quite see the reason for that. Does anybody have any idea about that? This is really quite frustrating to me. And to be honest, I don't see many problems with the article Google listed either. Yes, it's all about our product and it has 3 do-follow links to our site, but it contains no words such as sponsor, advertisement, or rewards... And the blog itself is quite healthy as well. The post also got rather high engagement, with organic comments and shares. How did Google flag that? I don't think it's possible that Google went through all our site's links one by one... Hope you guys can help me with that. Thanks in advance! Ben
Technical SEO | | Ben_fotor0 -
Anybody having success with Cross-Domain canonical?
Has anyone been using rel="canonical" to attribute content that has been republished on Domain B... back to Domain A, which is the original source? The videos below say that this should be working... I am asking to hear from anyone who has done it. Has it worked as you expected? Did Domain A get the benefit that you expected? Thanks! ========== Source Videos ============= Matt Cutts (April, 2012) http://www.youtube.com/watch?v=zI6L2N4A0hA Matt Cutts (April, 2010) http://www.youtube.com/watch?v=x8XdFb6LGtM Rand Fishkin (August, 2012) http://www.youtube.com/watch?v=O8drPXudZZc
Technical SEO | | EGOL1 -
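For reference, the tag in question is a one-liner placed in the `<head>` of the republished copy on Domain B, pointing back at the original on Domain A (the domains here are placeholders):

```html
<!-- On Domain B's republished copy of the article -->
<link rel="canonical" href="http://www.domain-a.com/original-article/" />
```

Whether Domain A actually receives the attribution benefit in practice is exactly what the question asks practitioners to confirm.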
Has anyone seen direct improvement after April 23 by requesting reinclusion?
Using Open Site Explorer, I have figured out that my former SEO agency was buying spammy links (mostly Asian sites) for my main keywords and did the same in a private network of blogs. I don't speak any eastern languages, and the SEO Super Dude has left the planet. So... I don't really have much to report to the Google Webmaster folks. How much time, effort, and cash do I invest in removal requests vs. redoing the whole darn site and hoping for the best? All the best. Tom
Technical SEO | | tvw1300 -
I always get this error "We have detected that the domain or subfolder does not respond to web requests." I don't know why. PLEASE help
Subdomain: www.nwexterminating.com
Subfolders: www.nwexterminating.com/pest_control, www.nwexterminating.com/termite_services, www.nwexterminating.com/bed_bug_services
Technical SEO | | NWExterminating0 -
Google Reconsideration Request (Penguin) - Will Google give links to remove?
When Penguin v1 hit, our site took a hit for a single phrase (i.e. "widgets") due to the techniques our SEO company was using (network). We've since had those links cleaned up, and our rankings have not recovered. Our SEO company said they submitted a reconsideration request on our behalf, and that Google denied it and didn't provide which links we needed removed. Does Google list links that need removing if they are still not happy with your link profile?
Technical SEO | | crucialx0 -
Pages not Indexed after a successful Google Fetch
I am trying to understand why Google isn't indexing key content on my site. www.BeyondTransition.com is indexed and new pages show up in a couple of hours. My key content is six pages of information for each of 3,000 events (driven by MySQL on a WordPress platform). These pages are reached via a search page, with no direct navigation from the home page. When I link to an event page from an indexed page, it doesn't show up in search results. When I use Fetch in Webmaster Tools, the fetch is successful but the page is then not indexed; or, if it does appear in results, it's directed to the internal search page. E.g., http://www.beyondtransition.com/site/races/course/race110003/ has been fetched and submitted with links, but when I search for BeyondTransition Ironman Cozumel I get these results.... So what have I done wrong, and how do I go about fixing it? All thoughts and advice appreciated. Thanks Denis
Technical SEO | | beyondtransition0 -
Is the full URL necessary for successful Canonical Links?
Hi, my first question and hopefully an easy enough one to answer. Currently in the head element of our pages we have canonical references such as: (Yes, untidy URL... we are working on it!) I am just trying to find out whether this snippet of the full URL is adequate for canonicalization, or if the full domain is needed as well. My reason for asking is that the SEOmoz On-Page Optimization grading tool is 'failing' all our pages on the "Appropriate Use of Rel Canonical" value. I have been unable to find a definitive answer on this, although admittedly most examples do use the full URL. (I am not the site developer, so I cannot simply change this myself, but rather have to advise him in a weekly meeting.) So in short, presumably using the full URL is best practice, but is it essential to its effectiveness when being read by the search engines? Or could there be another reason why the "Appropriate Use of Rel Canonical" value is not being green-ticked? Thank you very much, I appreciate any advice you can give.
Technical SEO | | rmkjersey0
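For what it's worth, the usual illustration of the difference is a path-only canonical versus the absolute form that most documentation recommends (example.com and the paths are placeholders):

```html
<!-- Path-only reference, which some tools flag: -->
<link rel="canonical" href="/products/widget.html" />
<!-- Absolute URL, the commonly recommended form: -->
<link rel="canonical" href="http://www.example.com/products/widget.html" />
```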