What is the average response time for a reconsideration request?
-
I know that Google says "several" weeks, but I'm wondering if anybody has experience with a reconsideration request: did you get any type of reply, and what was your general experience?
Thanks
-
It took 2 weeks for us a few months ago, but we were simply told that the site did not have a penalty.
We made some internal changes and built some better links, and we were back up in the rankings.
-
Hey Barry
Having a bit of insight into your problem from our email discussion, I think you will find that making the changes will be enough, and your site will just pop out of the filter once the problems are resolved.
I may be wrong; it's a one-way flow of information with Google on this, so definitely make the request, but expect anything up to seven weeks. Even post-Panda, when I guess they were getting hammered, one site I helped got a response in around four weeks, so three is a good bet.
Cheers
Marcus
-
I recently filed a reconsideration request and got a response within a week. It was a standard form letter and appeared only in WMT (not in an email to me). In that particular case my reconsideration request was not approved. After I filed another request a few days later, it took about 3 weeks to receive the next response.
-
Actually I will correct my original response.
On Mar 22, I filed a reconsideration request. The site involved had received a manual penalty from Google which had previously been confirmed by Google in writing.
On April 2nd, I received a response from Google via WMT titled "We've processed your reconsideration request".
The response stated "We've now reviewed your site. When we review a site, we check to see if it is in violation of our Webmaster Guidelines. If we don't find any problems, we'll reconsider our indexing of your site."
Immediately upon receiving that message, I could see that the site had been added back to Google's index.
-
Was that an actual written response?
-
I have only filed one reconsideration request this year. The response time was three weeks.