Do you have to wait after disavowing before submitting a reconsideration request
-
Hi all
It seems we have a link penalty at the moment. I went through 40k links in various phases and have disavowed over a thousand domains dating back to old SEO work. I was barely able to get any links actually removed, as the majority are on directories and similar sites that no one maintains any more, and/or which are spammy and scraped anyway.
According to Link Research Tools' Link Detox tool, we now have a very low risk profile (I loaded the disavowed links into the tool for it to take into consideration when assessing our profile). I then submitted a reconsideration request on the same day I uploaded the new disavow file (on the 26th of April). However, today (7th May) we got a message in Webmaster Central saying our link profile is still unnatural. Aaargh.
My question: is the disavow file taken into consideration when the reconsideration request is reviewed (i.e. is that information immediately available to the reviewer)? Or do we have to wait for the disavow file to flow through in the crawl stats? If so, how long do we have to wait?
I've checked a link that I disavowed last time and it's still showing up in the links I pull down from Webmaster Central; indeed, links I disavowed at the start of April are still appearing in the downloadable list.
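For what it's worth, here's roughly how I've been cross-checking the download against the disavow file. A quick Python sketch; the file names are just placeholders for my own exports, and I'm matching on the host stripped of "www.":

```python
# Rough sketch: flag URLs from the Webmaster Central links export that are
# already covered by "domain:" entries in a disavow file.
from urllib.parse import urlparse


def load_disavowed_domains(path):
    """Collect 'domain:' entries from a disavow file, skipping comments and URL lines."""
    domains = set()
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.lower().startswith("domain:"):
                domains.add(line.split(":", 1)[1].strip().lower())
    return domains


def still_listed(links_file, disavowed):
    """Return linking URLs (one per line in the export) whose host is already disavowed."""
    hits = []
    with open(links_file) as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host in disavowed:
                hits.append(url)
    return hits
```

Anything `still_listed` returns is a link Google is still reporting even though its domain is in the disavow file, which is exactly what I'm seeing.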
Any help gratefully received. I'm pulling my hair out here, trying to undo the dodgy work of a few random people many months ago!
Cheers,
Will
-
You seem to have a good handle on the issue, but you might consider bringing in an experienced SEO for at least a second opinion. We can only give very general help here on the Q&A, as we don't have access to your data.
They do say to wait at least a few weeks for results.
Cheers
S
-
Hi Stephen
I've been using the links downloaded from Webmaster Tools (as directed by Matt Cutts in one of his videos, IIRC) plus the data set from Link Research Tools. Is that insufficient? I've only got so many hours in the day, as my day job is running this company... I figured taking the links that Google gave me would surely be enough... but these days, who knows. G seems to want to make people jump through a lot of hoops...
-
Hey Marcus
Thanks for your input. Yeah, we have a lot of links but then we've been around for 7 years and weirdo scrapers and random replicants of DMOZ alone contribute a zillion links without us even having done anything. Not saying we didn't do link building back in the day (we did, just like everyone else, in what was at the time a white hat fashion but apparently no longer is) but we have had no permanent marketing team at all for the last two years as we've focused on some B2B parts of our business. So frustrating that bad links just kept growing and we're supposed to be responsible for them!
Anyway, as you say, I'll need to go in a bit harder I guess. E.g. I didn't previously remove a site just because it was PR0: some random person with a no-marks blog who used our birthday balloon picture didn't deserve to be disavowed, as far as I was concerned. But I can't take any chances now, so I'll just have to bin anything under PR1 and take another look at links from themed websites (e.g. should I disavow other blogs that have added us to their blogroll unsolicited, even if they're in our vertical? It's hard to tell. What about genuine flower directories? Who knows?).
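In case it helps anyone doing the same cull: once the audit spreadsheet is exported, the mechanical part is easy to script. A rough Python sketch; the CSV column names and the PR threshold are just my own assumptions, not anything official:

```python
# Rough sketch: turn a link-audit CSV into disavow "domain:" lines for
# everything under a PageRank threshold. Assumes the CSV has columns
# named 'domain' and 'pagerank' (adjust to match your own export).
import csv


def build_disavow_lines(audit_csv, pr_threshold=1):
    """Emit disavow-file lines for every domain below the PR threshold."""
    lines = ["# Disavowed after manual link audit"]
    with open(audit_csv) as f:
        for row in csv.DictReader(f):
            if int(row["pagerank"]) < pr_threshold:
                lines.append("domain:" + row["domain"].strip().lower())
    return lines
```

The output can be pasted straight into a disavow.txt, though I'd still eyeball every domain before uploading: an automated threshold is a starting point, not a judgement call.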
What's really frustrating is that the whole message from Matt Cutts is "you really shouldn't use this tool" (ref disavow) as you could damage your site. But 1. barely anyone takes links down when requested, as far as I can tell, and 2. given the amount of junk that's been pointed at our site that we're not responsible for (though we are responsible for some), I think the contention that very few people would need to use it is a bit optimistic. There's therefore a danger of people like me totally shooting themselves in the foot, given there are no clear rules on the grey areas I mention above.
PS understood that it's not some magic solution and we'll rank #1 for everything afterwards. I just want to get it cleared up and be able to get back to my day job. God knows how a smaller business than us would cope with something like this. Seems to me it pushes the advantage even further in the direction of bigger companies with the resources to manage a screw up like this.
Anyway, blah blah. Time to get the machete out.
-
In my experience, if you have this message again, you still have links they don't like. Disavowing 35% of linking domains is not a great deal, and as Stephen said, whilst Link Detox gives you a good starting place, you really do have to audit these links in a brutal fashion.
You have 15,000 external links from 2,000 sites. That's a hell of a lot of links for a semi-popular blog, let alone a site that doesn't really publish any content that would attract links.
If you are holding onto links as you think they are 'ok' or because they 'don't look too bad' then you may need to get a whole lot more aggressive with what you remove.
Also, just because you get the manual penalty removed, don't expect things to be amazing afterwards.
An alternative to finding the bad links and getting them removed is to identify the good ones, consider getting them repointed to a new URL, and start again with a rebrand / new domain. It can be easier to get a response from the good sites than from the bad ones.
Failing that get a whole lot more aggressive with what you remove.
Hope that helps!
Marcus
-
How sure are you that you have a full dataset of links? What did you use as your database of links to start cleaning from? (I would expect Ahrefs, GWT, SEOmoz, Majestic, etc.)
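A quick way to sanity-check completeness is to merge the exports and see how many linking hosts each tool adds that the others missed. A rough Python sketch, assuming each export is one linking URL per line (the file names are just examples):

```python
# Rough sketch: merge link exports from several tools, dedupe by linking
# host, and report how many new hosts each successive export contributes.
from urllib.parse import urlparse


def linking_hosts(path):
    """Collect unique linking hosts from a one-URL-per-line export file."""
    hosts = set()
    with open(path) as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            hosts.add(host)
    return hosts


def coverage_report(*export_files):
    """Return (all hosts, {file: hosts it added that earlier files missed})."""
    seen = set()
    report = {}
    for path in export_files:
        hosts = linking_hosts(path)
        report[path] = len(hosts - seen)
        seen |= hosts
    return seen, report
```

If a second or third tool keeps adding a meaningful number of new hosts, your starting dataset probably wasn't complete.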
S
-
Well, I also went through all the links manually, which was the world's most boring task, then followed up with a health check. Gah.
We've disavowed about 35% of all linking domains now...
-
I doubt it's a time thing; it's more likely that they still see dirty links that you have not disavowed.
That's the problem with these jump-on-the-bandwagon tools like Link Detox et al: they give you a nice score, but that doesn't mean anything.
404ing burnt pages and starting again may be a much quicker process than messing around with link disavowal
How many domains were linking, and how many domains did you disavow?