Google Reconsideration Request - Most Efficient Process
-
Hi,
I'm working on a Google reconsideration request for a site with a longstanding penalty.
Here's what I did:
Round 1
- Downloaded a CSV of all the domains and all the pages linking to the site. Went through the lot manually and sorted them into three types: Disavow Domain, Disavow Page, Keep
- All low-quality domains were disavowed, as were individual pages on platforms like Blogspot where only certain blogs carried low-quality links. I submitted the disavow file (example format below), then sent a detailed reconsideration request including a link to it.
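For anyone following along, the disavow file itself is just a plain text list, one entry per line: a domain: prefix drops a whole domain, a full URL drops a single page, and lines starting with # are comments. The entries below are made-up placeholders, not the actual sites:

# low-quality directory: disavow everything from the domain
domain:spammy-directory-example.com
# a single bad page on an otherwise acceptable platform
http://some-blog.blogspot.com/2012/05/spun-article.html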
The reconsideration request was not successful. Google gave two examples of links I should remove; bizarrely, both examples were already in my disavow file. I took this to mean that Google Webmaster Tools and a disavow file are not, in themselves, enough. The links I kept were largely from PRWeb syndication, which seems legitimate.
Round 2
Here's what I'm doing now. If you have any ideas for how the process below can be improved to give the request the maximum chance of success, please let me know.
- Get all linking pages from Webmaster Tools as before, and also from MajesticSEO's Historic Index. This gave me around three times more domains to deal with. The additional domains from Majestic that weren't in Webmaster Tools went straight into the disavow file.
- Conduct a manual link removal email campaign. I've got around 2,500 domains to go through, so how can I best do this? My process at the moment is (a rough sketch of the scripted parts follows this list):
- Use software to get email addresses from whois records
- Send each contact an email
- Keep a spreadsheet of responses
- Include a link to that spreadsheet in Google Docs, along with a link to the new disavow file, in the reconsideration request
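To make the scripted parts concrete, here is a rough sketch of how these steps could be wired together in Python. The file names, the one-URL-per-row column layout, and the python-whois package are all assumptions on my part; swap in whatever exports and whois tool you actually use.

import csv
from urllib.parse import urlparse

import whois  # pip install python-whois (assumed; any whois library would do)

def domains_from_csv(path, url_column=0):
    # Assumes one linking URL per row in the given column; header rows are skipped.
    found = set()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if row and row[url_column].startswith("http"):
                host = urlparse(row[url_column]).netloc.lower()
                found.add(host[4:] if host.startswith("www.") else host)
    return found

def contact_emails(domain):
    # Pull any publicly listed emails from the whois record.
    try:
        record = whois.whois(domain)
    except Exception:
        return []  # dead or unreachable domains just get skipped
    emails = record.emails  # may be a string, a list, or None
    if not emails:
        return []
    return [emails] if isinstance(emails, str) else list(emails)

wmt = domains_from_csv("wmt_links.csv")
majestic = domains_from_csv("majestic_historic.csv")
all_domains = wmt | majestic  # Majestic-only domains are included as well

with open("outreach_tracker.csv", "w", newline="", encoding="utf-8") as out:
    writer = csv.writer(out)
    writer.writerow(["domain", "email", "status"])
    for domain in sorted(all_domains):
        for email in contact_emails(domain):
            writer.writerow([domain, email, "not contacted"])

The resulting outreach_tracker.csv is the response spreadsheet, ready to upload to Google Docs and update as replies come in.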
Should I research each site manually to get email addresses? That seems rather a waste even of an offshore worker's time; from what I've seen, some people use offshore workers and others have used software tools successfully. The other thing is actually sending the emails: how can I do this? No SMTP email campaign service will let me use it, because the emails are not opt-in and they classify them as spam. Does anyone know a way to send 2,500 emails legitimately, from a webmail account for example? I'm only having to send bulk email in order to get rid of spam links.
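To show what I mean by the sending mechanics, something like the sketch below would throttle personalised emails out of an ordinary SMTP/webmail account. The server, login, file name and 30-second delay are placeholders, and it doesn't answer the opt-in/volume question, which is exactly what I'm asking about.

import csv
import smtplib
import time
from email.mime.text import MIMEText

TEMPLATE = ("Hello,\n\nYour site {domain} links to ours. We are cleaning up old "
            "links and would be very grateful if you could remove that link.\n\nThanks")

# In practice you would probably reconnect in batches; kept simple here.
with smtplib.SMTP_SSL("smtp.webmail-provider.example", 465) as server:
    server.login("outreach@mydomain.example", "app-password-here")
    with open("outreach_tracker.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row["status"] != "not contacted":
                continue  # only mail contacts we haven't reached yet
            msg = MIMEText(TEMPLATE.format(domain=row["domain"]))
            msg["Subject"] = "Link removal request"
            msg["From"] = "outreach@mydomain.example"
            msg["To"] = row["email"]
            server.send_message(msg)
            time.sleep(30)  # spread the sends out rather than blasting them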
Most of the offending links have keyword anchor text from spun articles, and I've deleted all of the sites except EzineArticles. Would you delete this one too? It's an awful site, but the client is hung up on it. The EzineArticles links may have some value; on the other hand, they are more of the same keyword-rich anchor-text articles. Keep them, or disavow the individual pages?
Finally, is there anything else I've missed? Anything to add? Thanks for all your help.
-
I personally do everything manually. Link removal tools can work great for some sites, but your best chance of identifying the bad links while keeping the good ones is to look at them manually. 2,500 domains is a lot, but not impossible. I'm currently working on an account of about that size and it will take me about 10-14 days to get through them all. Once you get going you will start to recognize patterns and it will go faster.
I used to gather the email addresses on my own, but I have just hired someone to do this for me. I find that the automated tools miss a lot of them. I considered hiring from oDesk or Mechanical Turk, but in my situation, because my business is expanding and most of what I do is penalty removal, it's worth my while to hire and train someone to do this for me.
By the way, if you've got 2,500 domains, you won't end up with 2,500 emails to send. Many of the sites will be offline, many of the links will be nofollowed, and some may even be natural.
EzineArticles links definitely need to be removed if they are followed links. Often those links are nofollowed, but if you have a high enough account level there then they are followed and need to go.
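If it helps, a quick spot check like the sketch below (just an illustration; the URLs are placeholders) will tell you whether a linking page is still live and whether its link to you is followed or nofollowed:

import requests
from bs4 import BeautifulSoup  # pip install requests beautifulsoup4

def link_status(linking_page, my_domain):
    try:
        resp = requests.get(linking_page, timeout=10)
    except requests.RequestException:
        return "offline"
    if resp.status_code >= 400:
        return "offline"
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        if my_domain in a["href"]:
            rel = a.get("rel") or []  # BeautifulSoup returns rel as a list
            return "nofollowed" if "nofollow" in rel else "followed"
    return "link already gone"

print(link_status("http://ezinearticles.com/?some-spun-article", "yoursite.example"))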
A few other points:
- Yes, you're right: it's not enough to just disavow. Google is going to want to see evidence that you've tried hard to remove links.
- Lately I have only been using links from WMT, not other sources like Majestic and Ahrefs. That may cut down on the number of domains you have to deal with. So far it is working for me.
Hope that helps!