Manual Penalty Reconsideration Request Help
-
Hi All,
I'm currently in the process of creating a reconsideration request for an 'Impact Links' manual penalty.
So far I have downloaded all LIVE backlinks from multiple sources and audited them into groups:
-
Domains that I'm keeping (good quality, natural links).
-
Domains that I'm changing to nofollow (relevant, good quality links that are useful for the user but may be affiliated with my company, so I'm changing the links to nofollow rather than removing them).
-
Domains that I'm getting rid of (poor quality sites with optimised anchor text, directories, article sites etc.).
One of my next steps is to review every historical backlink to my website that is NO LONGER LIVE. To be thorough, I had planned to go through every domain that has previously linked (even if it's no longer linking to my site) and disavow it outright if it's poor quality. But first I want to check whether this is actually necessary for a successful reconsideration request.
My concerns are that it's extremely time consuming (I'm going through the domains to avoid disavowing a good quality domain that might link back to me in future, and the historical list is the largest list of them all!), and there's also some risk that good domains get caught in the disavowing crossfire. So I only really want to do this if it's genuinely necessary for the reconsideration request to succeed. Obviously I understand that reconsideration requests are meant to be time consuming, as I'm repenting for previous SEO sins (and believe me, I've already spent weeks getting to the stage I'm at now)... But as an in-house Digital Marketer with many other digital avenues to look after for my company, I can't justify spending such a long time on something that isn't 100% necessary.
So overall - with a manual penalty, would you bother sifting through domains that either don't exist anymore or no longer link to your site, and disavow them for a thorough reconsideration request? Is this necessary to revoke the penalty, or is Google only interested in links that are currently (or recently) live?
All responses, thoughts and ideas are appreciated.
Kind Regards
Sam
-
-
Thanks again for your response Gary.
With regards to how many referring domains and backlinks there are, it depends on how much I trust various bits of software (e.g. Majestic SEO) when they tell me whether a link is live or not.
In total there are about 3,200 referring domains historically, with over 350,000 backlinks (lots of spam). Looking at what's live today, it's about 600 domains and 30,000 backlinks or so.
So far I've audited all live links into keeping, changing to nofollow, or removing. I've successfully reached out for all the nofollow changes, and I've justified in depth the list of domains I'm keeping. I'm now in the process of reaching out about the poor quality links (first wave) and have covered about 200 referring domains.
The main question here is exactly what to do with the rest of the links that Majestic and GWT are telling me are no longer live (after checking some examples, some that Majestic says aren't live actually are). Initially I was just going through them and throwing poor quality ones (even if they no longer link) straight into the disavow file to be safe. But since then, I've worked with my developer to create a script to check which of the 2,500 non-live domains are still live (cutting down my time considerably).
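For anyone wanting to build a similar check, here's a minimal sketch using only the Python standard library (the function names and the 'live'/'no-link'/'unreachable' labels are my own; Sam's actual script isn't shown in the thread):

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkFinder(HTMLParser):
    """Collects the href of every <a> tag in a page."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def page_links_to(html, my_domain):
    """Return True if the HTML contains a link whose host is my_domain
    (or a subdomain of it)."""
    parser = LinkFinder()
    parser.feed(html)
    for href in parser.hrefs:
        host = urlparse(href).netloc.lower()
        if host == my_domain or host.endswith("." + my_domain):
            return True
    return False

def check_referring_page(url, my_domain, timeout=10):
    """Fetch a referring URL and classify it: 'live' if it still links
    to my_domain, 'no-link' if reachable but no link found,
    'unreachable' if the fetch fails."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except Exception:
        return "unreachable"
    return "live" if page_links_to(html, my_domain) else "no-link"
```

Note that checking a single referring URL can still give false negatives (for example, when a directory has paginated the link onto a deeper page), so 'no-link' is a hint rather than proof the link is gone.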
So overall, I'm confident in my approach to links that are live (as this is the standard approach), and I'm being as thorough as possible. But when I wrote this question initially I was unsure whether I had to deal with the 'non-live' domains (mainly because I didn't know whether to fully trust Majestic when it says they're not live), so I wanted to check whether it was something I needed to do, because it would be extremely time consuming.
Hopefully you understand where I'm coming from with this?
Sam
-
Thanks for your response Richard.
This is, however, an extremely generic response to quite a specific question. I didn't ask what a reconsideration request does!
-
So sorry for the delay getting back to you; it's been a crazy week and I didn't notice the response.
"Note that this is a manual penalty though, so fortunately no waiting for Penguin refreshes."
OK, just to let you know: once they lift the manual penalty, you still need to wait for a Penguin refresh. My penalty was lifted in May 2013, but the vast majority of the crap links had not been crawled, and it took a very long time for Google to do so. For the disavow file to take effect, Google needs to crawl each of those pages with your disavow file in mind and treat the links as nofollow. Once a healthy amount has been crawled, you will be in good standing when the Penguin algorithm runs. If Penguin runs before you've reached an acceptable level of healthiness, you will not be released from Penguin and will have to wait for the next refresh. So it took us until Oct 17th 2014 to finally get released, and this was WITH John Mueller's help!
My advice is: don't be too picky about what you keep. Go through everything. Mine was 20,000 referring domains with 250k links! We had a 10-year history of business online, and at one point we were also attacked with negative SEO, so it was a big job.
"Providing I've given all possible evidence I can about the links being live or not to Google, do you think that disavowing all poor quality links that APPEAR to be no longer live is good enough in Google's eyes? Obviously for all links that are still live (as far as I can see) I have reached out at least 3 times and disavowed if I couldn't get in touch."
Yes, create a report to show the work you have done: what's removed, who you contacted, who did not respond. I did an Excel spreadsheet, one domain per line, with a few fields like last contacted, date, removed, etc.
There are lots of programmes out there that help with this now. It's not so easy when you're the first and there are no tools for it!
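For anyone without access to those tools, the tracking report Gary describes is straightforward to generate programmatically. A minimal sketch, with field names and example rows that are hypothetical (loosely based on the fields Gary mentions):

```python
import csv
import io

# Hypothetical field names: one row per referring domain, tracking
# outreach status for the reconsideration-request report.
FIELDS = ["domain", "status", "times_contacted", "last_contacted",
          "removed", "in_disavow"]

def outreach_log_csv(rows):
    """Render the outreach log as CSV text, one domain per line,
    ready to paste into a spreadsheet or attach to the request."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Example rows (made-up domains).
rows = [
    {"domain": "spammy-directory.example", "status": "remove",
     "times_contacted": 3, "last_contacted": "2015-02-10",
     "removed": "no", "in_disavow": "yes"},
    {"domain": "partner-blog.example", "status": "nofollow",
     "times_contacted": 1, "last_contacted": "2015-01-28",
     "removed": "n/a", "in_disavow": "no"},
]
```

The point is less the format and more the discipline: every domain gets a row, and every row shows evidence of effort, which is exactly what the reviewer is looking for.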
Also, it's best to disavow at the domain level instead of individual links. How many links do you have pointing to your site?
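The `domain:` prefix is the documented syntax for domain-level entries in Google's disavow file. A minimal sketch of collapsing a list of bad backlink URLs into deduplicated domain-level entries (the helper name and the www-stripping rule are my own):

```python
from urllib.parse import urlparse

def disavow_lines(bad_urls):
    """Collapse a list of bad backlink URLs into domain-level disavow
    entries ('domain:example.com'), deduplicated and sorted."""
    domains = set()
    for url in bad_urls:
        host = urlparse(url).netloc.lower()
        # Treat www.example.com and example.com as the same domain.
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    return ["domain:" + d for d in sorted(domains)]

bad_urls = [
    "http://www.spam-dir.example/page1",
    "http://spam-dir.example/page2",
    "https://articles.junk.example/post?id=3",
]
# Hundreds of spam URLs typically collapse into a much shorter
# domain list, which is also more robust against the link moving
# to a different page on the same site.
print("\n".join(disavow_lines(bad_urls)))
```

Disavowing at the domain level also covers Gary's pagination case: even if the link drifts to page 5 of a directory, the whole domain stays disavowed.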
-
A good reconsideration request does three things:
- Explains the exact quality issue on your site.
- Describes the steps you’ve taken to fix the issue.
- Documents the outcome of your efforts.
-
Actually, I agree with you. What you're describing are sites where it looks like the link has been deleted, but where the link actually still exists. My answer was about sites where the link actually has been deleted and doesn't exist.
-
Thanks for your response Gary.
That does make sense and, to be honest, is something that worries me! I'm putting faith in software here (i.e. I haven't gone through every single domain manually to check that the link is still live) to tell me whether a link is live or not. If Google's software tells them otherwise when they review my reconsideration request, then all my other efforts are most likely wasted. I take it from this that you'd advise addressing the non-active domains too?
Note that this is a manual penalty though, so fortunately no waiting for Penguin refreshes.
Providing I've given all possible evidence I can about the links being live or not to Google, do you think that disavowing all poor quality links that APPEAR to be no longer live is good enough in Google's eyes? Obviously for all links that are still live (as far as I can see) I have reached out at least 3 times and disavowed if I couldn't get in touch.
cheers
Sam
-
Sorry, I have to disagree.
There are many sites, specifically directory sites, that list websites, and as more sites get listed they push your link to page 3, 4, 5. It looks like the link does not exist, but it does, on another page.
Some crappy sites also have poor connections/bandwidth etc., so they go up and down and get overloaded all the time. Just because a site is down now does not mean it's down later when Google crawls it.
When I did my (now famous!) link clean-up, these were both issues that came up when I got help from John Mueller at Google.
It sucks because it's just a hell of a lot of work, but given how long it takes for a Penguin update to come around, I would make sure you get it right FIRST TIME, or you could wait more than a year to see returns.
Feel free to ask me anything.
Best of luck
Gary
-
Yes, I would be very surprised if Google wanted you to do anything with links that no longer exist.
-
Thanks for your response, Adam.
Would you say the same for domains that are still live but no longer contain links to your site?
Thanks
-
No, I would not spend time on links/domains that no longer exist. (I've never heard of that being necessary.)