Is Google's reinclusion request process flawed?
-
We have been having a bit of a nightmare with a Google penalty, which has highlighted a slightly alarming aspect of Google's reinclusion process. For background, please see http://www.browsermedia.co.uk/2012/04/25/negative-seo-or-google-just-getting-it-painfully-wrong/ or http://econsultancy.com/uk/blog/10093-why-google-needs-to-be-less-kafkaesque - any thoughts on why we have been penalised would be very, very welcome!
As far as I can see (using Google Analytics), supporting material prepared as part of a reinclusion request is basically ignored. I have just written an open letter to the search quality team at http://www.browsermedia.co.uk/2012/06/19/dear-matt-cutts/ which gives more detail but the short story is that the supporting evidence that we prepared as part of a request was NOT viewed by anyone at Google.
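(For anyone who wants to run the same kind of check without Analytics, a rough sketch of the idea against a raw server access log might look like the following - the log path and evidence URL below are placeholders, not our actual setup.)

```python
# Minimal sketch: scan a server access log for any request that touched
# the pages hosting the reinclusion evidence. Paths are placeholders.
LOG_FILE = "/var/log/apache2/access.log"
EVIDENCE_PATH = "/reinclusion-evidence/"

hits = []
with open(LOG_FILE, encoding="utf-8", errors="ignore") as log:
    for line in log:
        if EVIDENCE_PATH in line:
            hits.append(line.strip())

if hits:
    print(f"{len(hits)} request(s) reached the evidence pages:")
    print("\n".join(hits))
else:
    print("No requests to the evidence pages were logged.")
```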
Has anyone monitored this before and experienced the same thing? Does anyone have any suggestions regarding how to navigate the treacherous waters of resolving a penalty?
This no doubt sounds like a sob story on our part, but I do think this is a potentially big issue and one that I would love to explore further.
If anyone from the search quality team could contribute, we would love to hear your thoughts!
Cheers,
Joe
-
Thank you for your thoughts.
I agree that they must be swamped, and in my humble opinion most of the sites behind the 'complaints' you see on the Google forums fully deserve to be penalised, but I think that the total lack of communication does more harm than good.
If they want to improve the web, why not give more detail about what is causing the problem? If Google were more transparent and helped webmasters eradicate spammy techniques, everyone would be pushed into improving their sites for all the right reasons.
If they don't have the resources to handle reinclusion requests, then they shouldn't offer them as an option.
I still feel that it is very poor not to even look at the files that were prepared - that shows a lack of respect.
I agree that it is likely to be something simple. The 'spike' theory is still the strongest contender for me, given the timing of events, but it is alarming if it proves to be true, as we would effectively have been penalised for doing exactly what Google encourages (creating good content that naturally attracts links).
Another possible cause is the number of directory links we have acquired over the years. Whilst I have never considered these to be high-quality links, I have never seen Google say that you shouldn't submit your site to directories (indeed, they used to actively suggest that submitting to Yahoo! was a good idea), and directories are arguably a way for Google to outsource some human assessment of sites (assuming the directories actually review submissions).
If it is the directories, then the door for negative SEO is so wide open that it is alarming. As many have said, completely ignoring such links would be better than penalising you.
We are still no closer to understanding what we have done wrong, despite every effort to adhere to the guidelines and a lot of work auditing and documenting our link profile. With very little faith in the reinclusion process, where can we possibly turn now?
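For anyone curious what 'auditing the link profile' can look like in practice, here is a simplified sketch - assuming a CSV export of linking URLs from Webmaster Tools or a backlink tool, with hypothetical file and column names - that groups links by domain and flags the ones that look like directories:

```python
# Minimal sketch of a link-profile audit. The CSV file name and column
# name are hypothetical; adjust to whatever your backlink export uses.
import csv
from collections import Counter
from urllib.parse import urlparse

DIRECTORY_HINTS = ("directory", "listings", "links", "dir.")

domains = Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domain = urlparse(row["linking_url"]).netloc.lower()
        domains[domain] += 1

for domain, count in domains.most_common():
    flag = " <- possible directory" if any(h in domain for h in DIRECTORY_HINTS) else ""
    print(f"{count:5d}  {domain}{flag}")
```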
We will see. The open letter has received multiple views from Google, so somebody somewhere has seen it, and I just hope that there is some form of response.
The irony is that we spend most of our lives defending Google and encouraging clients to improve what they are doing online. On this occasion, I really find it hard to defend them. I appreciate that we are a drop in a mighty ocean, but the principle is an important one and I intend to pursue it.
Thanks again for your contribution,
Joe
-
I'm not sticking up for them, but you have to appreciate the number of people who would probably try to send them all sorts of viruses any way they can. They also probably don't have, or don't want to take, the time to look at everyone so closely; basically they will just check whether the offending stuff is still there, and if it is, they won't give you any love.
Honestly, it's probably something so simple that you have overlooked it. Keyword ratios above 6% can be taken as spam (depending on the amount of content), non-relevant links coming in or going out, links from already-penalized sites - anything unnatural is what Google has been focusing on now, not just backlinks. So go through your site; obviously the homepage is the first place to start. If you are really sure there is nothing going on in the way of spamminess or over-optimization, comb through your other pages. The problem will usually be on one of your best-ranked pages - or ex-best-ranked pages, if you have been hit with a Penguin slap.
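To make the 'keyword ratio' point concrete, here is a rough sketch of how you might check density on a page. The 6% figure is the threshold mentioned above, not an official Google number, and the file name is a placeholder for a plain-text copy of the page:

```python
# Rough keyword-density check: phrase occurrences vs. total word count.
# The 6% threshold mirrors the figure mentioned above; it is not an
# official Google limit. "page.txt" is a placeholder for the page copy.
import re

def keyword_ratio(text: str, keyword: str) -> float:
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(re.escape(keyword.lower()), text.lower()))
    return hits * len(keyword.split()) / len(words)

page_text = open("page.txt", encoding="utf-8").read()
ratio = keyword_ratio(page_text, "blue widgets")
print(f"Keyword density: {ratio:.1%}")
if ratio > 0.06:
    print("Above the ~6% level some people treat as a spam signal.")
```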
Related Questions
-
Is the image property really required for Google's breadcrumbs structured data type?
In its structured data (i.e., Schema.org) documentation, Google says that the "image" property is required for the breadcrumbs data type. That seems new to me, and it seems unnecessary for breadcrumbs. Does anyone think this really matters to Google? More info about breadcrumbs data type: https://developers.google.com/search/docs/data-types/breadcrumbs I asked Google directly here: https://twitter.com/RyanRicketts/status/7554782668788531220
Intermediate & Advanced SEO | Ryan-Ricketts
Big discrepancies between pages in Google's index and pages in sitemap
Hi, I'm noticing a huge difference between the number of pages in Google's index (using a 'site:' search) and the number of pages indexed by Google in Webmaster Tools (i.e., 20,600 in the 'site:' search vs 5,100 submitted via the dynamic sitemap). Does anyone know possible causes for this and how I can fix it? It's an ecommerce site, but I can't see any issues with duplicate content - they employ a very good canonical tag strategy. Could it be that Google has decided to ignore the canonical tag? Any help appreciated, Karen
Intermediate & Advanced SEO | Digirank0
-
How can I get a list of every URL of a site in Google's index?
I work on a site that has almost 20,000 URLs in its sitemap. Google WMT claims 28,000 indexed, and a search on Google shows 33,000. I'd like to find out what the difference is. Is there a way to get an Excel sheet with every URL Google has indexed for a site? Thanks... Mike
Intermediate & Advanced SEO | 945010
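There is no export that hands you Google's full index for a site, so the usual workaround (which applies equally to the sitemap/index discrepancy question above) is to diff your sitemap against whatever URL lists you can gather - a crawl export, Analytics landing pages, or the Webmaster Tools sample. A rough sketch, with placeholder sitemap and file names:

```python
# Sketch: compare sitemap URLs against a plain-text list of URLs gathered
# elsewhere (a crawl export, Analytics landing pages, etc.). The sitemap
# URL and file name are placeholders.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
LOC_TAG = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

with urllib.request.urlopen(SITEMAP_URL) as resp:
    sitemap_urls = {loc.text.strip() for loc in ET.parse(resp).iter(LOC_TAG)}

with open("known_urls.txt", encoding="utf-8") as f:
    known_urls = {line.strip() for line in f if line.strip()}

print(f"In sitemap but not in the other list: {len(sitemap_urls - known_urls)}")
print(f"In the other list but not in the sitemap: {len(known_urls - sitemap_urls)}")
for url in sorted(known_urls - sitemap_urls)[:50]:
    print(url)
```
-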
Should we use URL parameters or plain URLs?
Hi, the development team and I are having a heated discussion about one of the more important things in life, i.e. URL structures on our site. Let's say we are creating an AirBNB clone, and we want to be found when people search for "apartments new york". As we have both houses and apartments in all cities in the U.S., it would make sense for our URL to at least include these, e.g. clone.com/Apartments/New-York, but the user is also able to filter on price and size.
The filters aren't really relevant for Google, and we all agree that clone.com/Apartments/New-York should be the canonical for all apartment / New York searches. But what should the URL look like for people filtering on a maximum price of 300$ and 100 sqft? clone.com/Apartments/New-York?price=30&size=100, or (we are using Node.js, so no problem) clone.com/Apartments/New-York/Price/30/Size/100?
The developers hate URL parameters with a vengeance, think the second version is preferable and the most user-readable, and say that as long as we use canonicals pointing everything to clone.com/Apartments/New-York it won't matter to good old Google. I think the URL parameters are the way to go, for two reasons: Google might figure out by itself that the price parameter doesn't matter (https://support.google.com/webmasters/answer/1235687?hl=en), and it is also possible in Webmaster Tools to actually tell Google that it shouldn't worry about a parameter.
We have agreed to disagree on this point, and will let the wisdom of Moz decide what we ought to do. What do you all think?
Intermediate & Advanced SEO | Peekabo0
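For what it's worth, the canonical logic the developers are relying on is easy to express whichever URL style wins. A hedged sketch (in Python purely for illustration, even though the site in question runs on Node.js) of deriving the canonical by dropping the filter parameters:

```python
# Sketch: derive the canonical URL by stripping filter parameters
# (price, size) that don't change the content category. Illustrative
# only; the real site in question runs on Node.js.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

FILTER_PARAMS = {"price", "size"}

def canonical_url(url: str) -> str:
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k.lower() not in FILTER_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("https://clone.com/Apartments/New-York?price=30&size=100"))
# -> https://clone.com/Apartments/New-York
```
-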
Silo Architecture - need an expert's advice
I understand the concept of silo architecture. What I don't understand is how to build the site navigation. I see experts talking about silos, but their sites have pervasive top-level navigation. In theory, your top-level nav breaks your silos: if I have 20 pages of supporting content all linked to my silo page, and the top nav is on the supporting content pages, then those pages all link to the pages in the top nav - silo broken, and link juice diluted.
It would seem to me that the only way to build a true silo is to strip out all of the navigation on a supporting page and only have it link to: 1. the silo landing page, and 2. other supporting pages in the silo.
Is this what Bruce Clay does? I've seen Rand's lectures on silos as well - is this what he is doing? I recently saw a video by the Network Empire team, and they also had a pervasive nav. Can someone please explain this?
Intermediate & Advanced SEO | CsmBill0
-
We're indexed in Google News, any tips or suggestions for getting traffic from news?
We have a news sitemap, and follow all best practices as outlined by Google for news. We are covering breaking stories at the same time as other publications, but have only made it to the front page of Google News once in the last few weeks. Does anyone have any tips, recommended reading, etc. for how to get to the front page of Google News? Thanks!
Intermediate & Advanced SEO | nicole.healthline0
-
Google suddenly indexing and displaying URLs that haven't existed for years?
We recently noticed Google is showing approximately 23,000 indexed .jsp URLs for our site. These are ancient pages that haven't existed in years and have long been 301 redirected to valid URLs - I'm talking six years. Checking the SERPs the other day (and our current SEOmoz Pro campaign), I see that a few of these URLs are now replacing our correct ones in the SERPs for important, competitive phrases. What the heck is going on here? Is Google suddenly ignoring rewrite rules and redirects? Here's an example of the rewrite rules that we've used for 6+ years: RewriteRule ^(.*)/xref_interlux_antifoulingoutboards&keels.jsp$ $1/userportal/search_subCategory.do?categoryName=Bottom%20Paint&categoryId=35&refine=1&page=GRID [R=301] Now, this 'bottom paint' URL has been incredibly stable in the SERPs for over half a decade. All of a sudden, a Google search for 'bottom paint' (no quotes) brings up the .jsp page at position 2-3. This is just one example of something very bizarre happening. Has anyone else had something similar happen lately? Thank you
Intermediate & Advanced SEO | jamestown
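One practical check before concluding that Google is ignoring the rewrite rules: confirm that the old .jsp URLs still answer with a single 301 to the correct target. A small sketch using only the standard library (the URL below is a placeholder, not the actual site's):

```python
# Sketch: report the status code and Location header returned by legacy
# URLs, without following redirects. The URL listed is a placeholder.
import http.client
from urllib.parse import urlparse

LEGACY_URLS = [
    "https://www.example.com/old/xref_interlux_antifoulingoutboards&keels.jsp",
]

for url in LEGACY_URLS:
    parts = urlparse(url)
    conn = http.client.HTTPSConnection(parts.netloc, timeout=10)
    path = parts.path + ("?" + parts.query if parts.query else "")
    conn.request("GET", path)
    resp = conn.getresponse()
    print(url, "->", resp.status, resp.getheader("Location"))
    conn.close()
```
-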
Will blocking Google and other search engines from indexing images hurt SEO?
Hi, we have a bit of a problem where, on a website we are managing, there are thousands of dynamically resized images. These are stressing the server, as any page could contain up to 100 dynamically resized images. Google alone is indexing 50,000 pages a day, so multiply that by the number of images and it is a huge drag on the server. I was wondering if it might be an idea to block robots (in robots.txt) from indexing all the images in the image folder, to reduce the server load until we have a proper fix in place. We don't get any real value from having our website images in Google Images, so I am wondering if this could be a safe way of reducing server load. Are there any other potential SEO issues this could cause? Thanks
Intermediate & Advanced SEO | James770
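As a footnote, here is a hedged sketch of what that interim robots.txt block might look like, plus a quick sanity check with the standard library before deploying it (the image folder path is a placeholder for wherever the resized images actually live):

```python
# Sketch: a robots.txt rule blocking a hypothetical resized-images folder,
# checked locally with urllib.robotparser before going live.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /images/resized/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for path in ("/images/resized/product-123_300x300.jpg", "/products/widget"):
    allowed = parser.can_fetch("Googlebot-Image", "https://www.example.com" + path)
    print(f"{path}: {'allowed' if allowed else 'blocked'}")
```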