Manual Penalty Reconsideration Request Help
-
Hi All,
I'm currently in the process of creating a reconsideration request for an 'Impact Links' manual penalty.
So far I have downloaded all LIVE backlinks from multiple sources and audited them into groups:
-
Domains that I'm keeping (good quality, natural links).
-
Domains that I'm changing to nofollow (relevant, good quality links that are good for the user but may be affiliated with my company, so I'm changing the links to nofollow rather than removing them).
-
Domains that I'm getting rid of (poor quality sites with optimised anchor text, directories, article sites, etc.).
One of my next steps is to review every historical backlink to my website that is NO LONGER LIVE. To be thorough, I have planned to go through every domain that has previously linked (even if it's no longer linking to my site) and straight up disavow the domain (if it's poor quality). But I want to first check whether this is completely necessary for a successful reconsideration request?
My concerns are that it's extremely time consuming (as I'm going through the domains to avoid disavowing a good quality domain that might link back to me in future, and also because the historical list is the largest list of them all!), and there is also some risk involved, as some good domains might get caught in the disavowing crossfire. Therefore I only really want to carry this out if it's completely necessary for the success of the reconsideration request. Obviously I understand that reconsideration requests are meant to be time consuming, as I'm repenting for previous SEO sin (and believe me, I've already spent weeks getting to the stage I'm at right now)... But as an in-house Digital Marketer with many other digital avenues to look after for my company too, I can't justify spending such a long time on something if it's not 100% necessary.
So overall - with a manual penalty reconsideration request, would you bother sifting through domains that either don't exist anymore or no longer link to your site, and disavow them to be thorough? Is this a necessary requirement to revoke the penalty, or is Google only interested in links that are currently or recently live?
All responses, thoughts, and ideas are appreciated.
Kind Regards
Sam
-
-
Thanks again for your response Gary.
With regards to how many referring domains and backlinks, it depends on how much I trust various bits of software (e.g. Majestic SEO) when they tell me whether a link is live or not.
In total there are about 3,200 referring domains historically, with over 350,000 backlinks (lots of spam). Looking at what's live today, that's about 600 domains and 30,000 backlinks or so.
So far I've audited all links (from what's live) into keeping, changing to nofollow, or removing. I've successfully reached out to all the nofollow candidates, and I've justified in depth the list of domains I'm keeping. I'm now in the process of reaching out to the poor quality links (first wave) and have covered about 200 referring domains.
The main question here is exactly what to do with the rest of the links that Majestic and GWT are telling me are no longer live (after checking some examples, there are some live links that Majestic says aren't live). Initially I was just going through them and throwing poor quality ones (even if they no longer link) straight into the disavow file to be safe. But since then, I've worked with my developer to create a script to check which of the 2,500 non-live domains are still live (cutting down my time considerably).
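For anyone facing the same problem, here's a minimal sketch of the checking half of such a script: given a page's HTML, find any links pointing at your domain and note whether they're nofollowed. All names here are invented for illustration, and fetching each referring page (with urllib or similar) is left out for brevity.

```python
from html.parser import HTMLParser


class LinkFinder(HTMLParser):
    """Collects every <a> href that points at the target domain."""

    def __init__(self, target):
        super().__init__()
        self.target = target   # domain we are looking for, e.g. "example.com"
        self.found = []        # (href, is_nofollow) pairs discovered so far

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        if self.target in href:
            rel = attrs.get("rel", "")
            self.found.append((href, "nofollow" in rel))


def links_to_site(html, target_domain):
    """Return (href, is_nofollow) pairs for links pointing at target_domain."""
    finder = LinkFinder(target_domain)
    finder.feed(html)
    return finder.found
```

If `links_to_site` returns nothing across every known page on a referring domain, you could treat that domain as genuinely non-live; if it returns anything, the tool that reported it dead was wrong and the domain still needs auditing.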
So overall, I am confident with my approach on links that are live (as this is the standard approach) and I am being as thorough as possible. But when I wrote this question initially, I was unsure whether I had to deal with the 'non-live' domains (mainly because I didn't know whether to fully trust Majestic when it says they're not live), and so I wanted to check whether it was something I needed to do, because it would be extremely time consuming.
Hopefully you understand where I'm coming from with this?
Sam
-
Thanks for your response Richard.
This is, however, an extremely generic response to quite a specific question. I didn't ask what a reconsideration request does!
-
So sorry for the delay getting back to you, it's been a crazy week and I didn't notice the response.
"Note that this is a manual penalty though, so fortunately no waiting for Penguin refreshes."
OK, just to let you know, once they lift the manual penalty, you still need to wait for a Penguin refresh. My penalty was lifted in May 2013, but the vast majority of crap links had not been crawled, and it took a very long time for Google to do so. For the disavow file to take effect, Google needs to crawl each of those pages with your disavow file in mind and treat the links as nofollow. Once a healthy amount is crawled, you will then be in good standing when the Penguin algo is run. If Penguin runs before you have an acceptable level of healthiness, you will not be released from Penguin and will have to wait for the next one. So it took us until Oct 17th 2014 to finally get released. This was WITH John Mueller's help!
My advice is don't be too picky with what you keep. Go through everything; mine was 20,000 referring domains with 250k links! We had a 10-year history of business online and at one point were also attacked with negative SEO. So it was a big job.
"Providing I've given all possible evidence I can about the links being live or not to Google, do you think that disavowing all poor quality links that APPEAR to be no longer live is good enough in Google's eyes? Obviously for all links that are still live (as far as i can see) I have outreached to at least 3 times and disavowed if I can't get in touch."
Yes, create a report to show the work you have done: what's removed, who you have contacted, who did not respond. I did an Excel spreadsheet, one domain per line, with a few fields like last contacted, date, removed, etc.
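To make that concrete, here's a hedged sketch of what generating such a tracking sheet might look like. The column names and the example domain are my own invention, loosely following Gary's description, not his actual spreadsheet.

```python
import csv
import io

# Hypothetical column layout: one referring domain per row, with
# outreach-status fields for the reconsideration report.
FIELDS = ["domain", "quality", "first_contacted", "last_contacted",
          "responses", "link_removed", "in_disavow"]


def write_outreach_log(rows, out):
    """Write outreach-tracking rows to a CSV stream, header first."""
    writer = csv.DictWriter(out, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)


buf = io.StringIO()
write_outreach_log([{
    "domain": "spammy-directory.example",   # invented example domain
    "quality": "poor",
    "first_contacted": "2015-04-01",
    "last_contacted": "2015-04-20",
    "responses": "0",
    "link_removed": "no",
    "in_disavow": "yes",
}], buf)
```

A CSV like this opens straight into Excel or Google Sheets, which is all Google's reviewers really need to see the paper trail.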
There are lots of programmes out there that help with this now. Not so easy when you're the first and there are no tools for it!
Also, it's best to disavow at the domain level instead of individual links. How many links do you have pointing to your site?
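The domain-level suggestion maps directly onto Google's disavow file syntax: a `domain:` line covers every URL on that domain, including pages the tools haven't discovered yet, while a bare URL line disavows only that one page. A small fragment for illustration (all domains here are made up):

```text
# Domain-level entries catch every page on a spam domain,
# including pages not yet surfaced by backlink tools.
domain:spammy-directory.example
domain:article-farm.example

# A full URL can still be listed where only one page is bad.
http://mixed-quality-site.example/bad-links-page.html
```

The file is plain UTF-8 text, one entry per line, with `#` lines treated as comments.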
-
A good reconsideration request does three things:
- Explains the exact quality issue on your site.
- Describes the steps you’ve taken to fix the issue.
- Documents the outcome of your efforts.
-
Actually, I agree with you. What you're describing are sites that look like the link has been deleted, but where the link actually still exists. My answer was regarding sites where the link actually has been deleted and doesn't exist.
-
Thanks for your response Gary.
That does make sense and, to be honest, is something that worries me! I am putting faith in software here (i.e. I haven't gone through every single domain manually and checked that the link is still live) which is telling me whether the link is still live or not. If Google's software tells them otherwise when they review my reconsideration request, then all my other efforts are most likely wasted. I take it from this that you would advise addressing the non-active domains too?
Note that this is a manual penalty though, so fortunately no waiting for Penguin refreshes.
Providing I've given Google all possible evidence I can about the links being live or not, do you think that disavowing all poor quality links that APPEAR to be no longer live is good enough in Google's eyes? Obviously for all links that are still live (as far as I can see) I have reached out at least 3 times and disavowed if I can't get in touch.
cheers
Sam
-
Sorry I have to disagree,
There are many sites, specifically directory sites, that list websites, and as more sites get listed they push your link to page 3, 4, 5. It looks like the link does not exist, but it does, on another page.
Some sites that are crappy also have poor connections/bandwidth etc., so they go up and down and overload all the time. Just because it's down now does not mean it will be down later when Google crawls it.
When I did my (now famous!) link clean-up, these were both issues that came up when I got help from John Mueller at Google.
It sucks because it's just a hell of a lot of work, but based on how long it takes for a Penguin update to come about, I would make sure you get it right FIRST TIME or you could wait more than a year to see returns.
Feel free to ask me anything.
Best of luck
Gary
-
Yes, I would be very surprised if Google wanted you to do anything with links that no longer exist.
-
Thanks for your response, Adam.
Would you say the same for domains that are still live but no longer contain links to your site?
Thanks
-
No, I would not spend time on links/domains that no longer exist. (I've never heard of that being necessary.)