5th Reconsideration Request, Have i missed anything...
-
Hi Guys,
I wonder if any of you can help me out. I'll shortly be submitting another reconsideration request to Google. I've been working on removing bad/spammy links to our site http://goo.gl/j7OpL over the past six months, and so far every reconsideration request I have submitted has been knocked back with the following message:
----------------------------------------
Dear site owner or webmaster of http://goo.gl/j7OpL,
We received a request from a site owner to reconsider http://goo.gl/j7OpL for compliance with Google's Webmaster Guidelines.
We've reviewed your site and we still see links to your site that violate our quality guidelines.
Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.
We encourage you to make changes to comply with our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results.
If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.
If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
----------------------------------------
I've removed over 70% of all our links. We had some large sitewide links on big sites with exact-match anchor text for our main money keyword, and I've also removed a large link network that our previous SEO company set up.
Today I completed an overhaul of all our internal links; nearly every blog post we added to the site had a link back to the home page with an exact-match money keyword.
One thing I did notice: when we got hit by the penalty, it didn't affect every keyword we target, just our main and most competitive one. Some of our other keywords took a dip in rankings, but not as much as our main keyword.
When I submit our next reconsideration request, I'll also attach a spreadsheet of links that I can't remove, either because I can't find any contact details (the WHOIS record is private) or because I'm just not getting a response when I email the site owners.
If anyone can point out anything I have missed, or might have missed, that would be great.
Thanks,
Scott
-
Ryan's given you a super generous answer! I wanted to add a couple of things:
You mentioned that you will attach a list of links that you couldn't get removed. It may help to go even further: what I usually do is attach a document containing a copy of each email I sent for the sites where I was unsuccessful. And if I got a negative response back, I include that email as well.
I also include screenshots of every contact form that I have submitted. It may be overkill, but from Google's perspective, just saying "I tried to contact them" is not enough.
You're probably already doing this, but be super humble in your request, and make sure you tell Google you are committed to following the quality guidelines from this point on. I think part of the reason Google makes webmasters go through this is that they want to be sure you understand the gravity of trying to game the system with SEO tactics.
And like Ryan said, be really tough on yourself when it comes to links. I have seen a number of webmasters who say, "No! That's not an unnatural link! It came from an article that I wrote," or something like that. But in reality, almost every link that you have had a hand in creating is one that Google considers unnatural.
Good luck! If you are successful, it would be great for you to post about your success here in the Q&A to encourage others.
Marie
-
Great answer yet again, Ryan.
Thanks for your detailed response.
Thanks,
Scott
-
Hi Scott,
Removing manual penalties for manipulative links is a complex task, and most people end up having their reconsideration request declined repeatedly. If you tried another five times, the results would not be likely to change. At a high level, there is likely an error in one of three areas:
1. You need to use a comprehensive list of all known backlinks to your site. Using the list from Google alone is not even close to enough; I use Google WMT + OSE + Majestic (via Raven) + Ahrefs + SEMrush + Bing. If you do not start with a comprehensive list of links, you will continue to miss manipulative links, and Google will not pay any attention to your reconsideration request.
2. You need to ensure your idea of a manipulative link is calibrated with Google's. The process begins with being intimately familiar with Google's Webmaster Guidelines. A few questions to ask for each link:
- If search engines did not exist, would this link be here?
- Who created the link and its content? If the link was created by the site owner, it would likely be considered manipulative.
- How credible is the site? The web page? The content? Is it focused on a specific topic or a grab bag?
- What value does this link and its page offer to users?
The above list is not comprehensive, and there are other factors to weigh, as well as corner cases. What I can share is that the PA and DA of the pages involved should not be given any consideration at all. Additionally, there is no automated tool that can make an organic-versus-manipulative determination for a link. I have reviewed several and, to put it nicely, they seem to offer completely false hope to desperate site owners.
3. You need to make a solid, good-faith effort to contact linking sites and request that the links be removed. Do not simply change anchor text, as that does not make a link any less manipulative. Don't give up just because the WHOIS e-mail is not valid: try the WHOIS e-mail, the site e-mail, and the contact form (if any) on the site. If a site owner denies your link-removal request the first time, respond very politely and ask in a different way.
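For item 1, the mechanical part (merging exports from several tools into one de-duplicated list) is easy to script. Here's a rough sketch in Python; note that the column name `source_url` is hypothetical, since each tool (WMT, OSE, Majestic, Ahrefs, SEMrush, Bing) uses its own CSV headers, so you'd adjust the mapping per file:

```python
# Merge backlink CSV exports from several tools into one de-duplicated list.
# The "source_url" column name is an assumption -- adjust it to match the
# actual header each tool uses in its export.
import csv
from urllib.parse import urlparse

def merge_backlink_exports(csv_paths, url_column="source_url"):
    """Read each CSV, collect linking URLs, and return a sorted,
    de-duplicated list so no link gets reviewed twice."""
    seen = set()
    for path in csv_paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                url = (row.get(url_column) or "").strip().rstrip("/")
                if url:
                    seen.add(url)
    return sorted(seen)

def linking_domains(urls):
    """Collapse the URL list to unique linking domains -- useful for
    outreach, since you contact each site once, not each page."""
    return sorted({urlparse(u).netloc for u in urls})
```

The trailing-slash strip is a crude normalization; real exports may also need scheme and `www.` normalization before the lists line up.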
I have been involved with reconsideration requests for numerous clients in your situation. Items 1 and 2 are the most common issues, and they are show-stoppers.
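The good-faith outreach in item 3 is much easier to document if you log every attempt as you go. A minimal sketch of such a log (the channel names and outcome labels here are my own invention, not anything Google prescribes):

```python
# Minimal outreach log for link-removal requests. Recording every attempt
# (WHOIS e-mail, site e-mail, contact form) lets you attach a full history
# to the reconsideration request instead of just saying "I tried".
# Channel and outcome labels are hypothetical.
import csv

ATTEMPT_CHANNELS = ("whois_email", "site_email", "contact_form")

def log_attempt(log, domain, channel, outcome):
    """Record one contact attempt; outcome e.g. 'sent', 'bounced', 'refused', 'removed'."""
    assert channel in ATTEMPT_CHANNELS
    log.setdefault(domain, []).append((channel, outcome))

def unresolved_domains(log):
    """Domains where no attempt ended in 'removed' -- these go into the
    spreadsheet attached to the reconsideration request."""
    return sorted(d for d, attempts in log.items()
                  if not any(o == "removed" for _, o in attempts))

def write_report(log, path):
    """Dump the full attempt history to a CSV."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        w = csv.writer(f)
        w.writerow(["domain", "channel", "outcome"])
        for domain in sorted(log):
            for channel, outcome in log[domain]:
                w.writerow([domain, channel, outcome])
```

Pairing this CSV with screenshots of each contact form, as Marie suggests above, gives the reviewer concrete evidence of effort rather than a bare claim.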
Good Luck.