5th Reconsideration Request, Have I missed anything...
-
Hi Guys,
I wonder if any of you can help me out. I'll shortly be submitting another reconsideration request to Google. I've been working on removing bad/spammy links to our site http://goo.gl/j7OpL over the past 6 months, and so far every reconsideration request I've submitted has been knocked back with the following message:
----------------------------------------
Dear site owner or webmaster of http://goo.gl/j7OpL,
We received a request from a site owner to reconsider http://goo.gl/j7OpL for compliance with Google's Webmaster Guidelines.
We've reviewed your site and we still see links to your site that violate our quality guidelines.
Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.
We encourage you to make changes to comply with our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results.
If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.
If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
----------------------------------------
I've removed over 70% of all our links. We had some large sitewide links on big sites with exact-match anchor text for our main money keyword, and I've also removed a large link network that our previous SEO company set up.
Today I completed an overhaul of all our internal links; nearly every blog post we had added to the site had a link back to the home page with an exact-match money keyword as the anchor text.
One thing I did notice: when we got hit by the penalty, it didn't affect every keyword we target, just our main / most competitive one. Some of our other keywords took a dip in rankings, but not as much as our main keyword.
When I submit our next reconsideration request, I'll also attach a spreadsheet of links that I can't remove, either because I can't find any contact details (the WHOIS record is blocked) or because I'm just not getting a response when I email them.
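Alongside the spreadsheet, it can also help to prepare those leftover links in Google's disavow-file format (one `domain:example.com` entry per line, with `#` comment lines), so they're ready to upload if you decide to disavow them. A minimal sketch in Python; the URLs, "reasons", and the `build_disavow` helper are hypothetical, and this assumes a simple list of (url, reason) pairs exported from your spreadsheet:

```python
from urllib.parse import urlparse

def build_disavow(rows):
    """Turn (url, reason) rows into disavow-file lines.

    Google's disavow format accepts one 'domain:example.com' entry
    or bare URL per line; lines starting with '#' are comments.
    Here we disavow at the domain level and deduplicate hosts.
    """
    domains = set()
    for url, _reason in rows:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]  # treat www and bare host as the same site
        if host:
            domains.add(host)
    lines = ["# Links we could not get removed despite outreach"]
    lines += [f"domain:{d}" for d in sorted(domains)]
    return lines

rows = [
    ("http://spammy-directory.example/listing/123", "no contact details"),
    ("http://www.spammy-directory.example/other", "no reply"),
    ("http://old-network.example/post", "whois blocked"),
]
# Prints the comment line plus one deduplicated domain: line per host.
for line in build_disavow(rows):
    print(line)
```

Keeping the "reason" column in the spreadsheet you attach to the reconsideration request, even though it drops out of the disavow file itself, gives the reviewer the evidence of your outreach.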
If anyone can point out anything else I might have missed, that would be great.
Thanks,
Scott
-
Ryan's given you a super generous answer! I wanted to add a couple of things:
You mentioned that you will attach a list of links that you couldn't get removed. It may help to go even further. What I usually do is attach a document that contains a copy of each email that I have sent for sites that I was unsuccessful with. And, if I got a negative response back I would include that email as well.
I also include screenshots of every contact form that I have submitted. It may be overkill, but from Google's perspective, if you just say, "I tried to contact them," that's not enough.
You're probably already doing this, but be super humble in your request and make sure you tell Google you are committed to following the quality guidelines from this point on. I think part of the reason Google makes webmasters go through this is that they want to be sure webmasters understand the gravity of trying to game the system with SEO tactics.
And like Ryan said... be really tough on yourself when it comes to links. I have seen a number of webmasters who say, "NO! That's not an unnatural link! It came from an article that I wrote," or something like that. But in reality, almost every link that you had a hand in creating is one that Google considers unnatural.
Good luck! If you are successful, it would be great for you to post about your success here in the Q&A to encourage others.
Marie
-
Great answer yet again Ryan.
Thanks for your detailed response.
Thanks,
Scott
-
Hi Scott,
Removing a manual penalty for manipulative links is a complex task, and the result for most people is to have the Reconsideration Request repeatedly declined. If you tried another five times, the results would not be likely to change. At a high level, there is likely an error in one of three areas:
1. You need to use a comprehensive list of all known backlinks to your site. Using the list from Google is not even close to enough; I use Google WMT + OSE + Raven (Majestic) + AHREFs + SEMrush + Bing. If you do not start with a comprehensive list of links, you will continue to miss manipulative links, and Google will not even pay attention to your Reconsideration Request.
2. You need to ensure your idea of a manipulative link is calibrated with Google's. The process begins with being intimately familiar with Google's Webmaster Guidelines. A few questions to ask about each link:
- If search engines did not exist, would this link be here?
- Who created the link or content? If the link was created by the site owner, it is likely to be considered manipulative.
- How credible is the site? The web page? The content? Is it focused on a specific topic or a grab bag?
- What value does this link or page offer to users?
The above list is not comprehensive, and there are other factors to weigh, as well as corner cases. What I can share is that the PA and DA of the pages involved should not be given any consideration at all. Additionally, there is no automated tool that can make an organic-vs-manipulative link determination. I have reviewed several and, to put it nicely, they seem to offer completely false hope to desperate site owners.
3. You need to make a solid, good-faith effort to contact linking sites and request that the links be removed. Do not simply change the anchor text, as that does not make the link any less manipulative. Don't give up simply because the WHOIS e-mail is not valid: try the WHOIS e-mail, the site e-mail, and the contact form (if any) on the site. If a site owner denies your link removal request the first time, respond very politely and ask in a different way.
I have been involved with Reconsideration Requests for numerous clients in your situation. Items 1 and 2 are the most common issues, and they are show-stoppers.
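On point 1, combining the exports from those tools is mostly a normalize-and-deduplicate job. A rough sketch of one way to do it; the tool names, URLs, and the `merge_backlink_exports` helper are placeholders, and this assumes each tool gives you a flat list of linking URLs:

```python
from urllib.parse import urlparse

def merge_backlink_exports(exports):
    """Merge per-tool lists of backlink URLs into one deduplicated map.

    `exports` maps a tool name (e.g. 'WMT', 'AHREFs') to a list of URLs.
    URLs are normalized (lowercased host, trailing slash stripped) before
    deduplicating, and we record which tools reported each link so you can
    see coverage gaps between sources.
    """
    seen = {}
    for tool, urls in exports.items():
        for url in urls:
            p = urlparse(url.strip())
            key = f"{p.netloc.lower()}{p.path.rstrip('/')}"
            seen.setdefault(key, set()).add(tool)
    return seen

exports = {
    "WMT": ["http://Example.com/page/", "http://blog.example.net/post"],
    "AHREFs": ["http://example.com/page", "http://other.example.org/link"],
}
merged = merge_backlink_exports(exports)
for link, tools in sorted(merged.items()):
    print(link, sorted(tools))
```

Links reported by only one source are exactly the ones you would have missed by relying on a single export, which is why starting from the union matters.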
Good Luck.
-