Google launches its Disavow Links tool
-
Hi Irving,
Today I compiled a comprehensive backlink report for one of my hotel clients (http://www.fairfieldinnhotelcedarrapids.com/). I identified all the bad links, created a .txt file, and uploaded it via Google's Disavow tool. Now my question is: how do I know that all the bad links have been discounted? Is there any way to check the status?
Thanks
-
If you Disavowed "good" links to your site, your rankings may be lowered as a result.
If you Disavowed "bad" links, then your Penguin issue could be reduced or resolved.
The best course of action is to have a trained SEO professional examine each linking domain to determine whether the links violate Google's guidelines, and only Disavow the links which do violate those guidelines. Also, the Disavow tool should not be used until after every possible action has been taken to remove the links. Google is quite clear on this topic: if you use the Disavow tool without "significantly" reducing the manipulative links to your site, it likely will not help.
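For reference, the file the tool accepts is a plain .txt list: one URL per line, an optional "domain:" prefix to disavow an entire linking domain, and "#" lines for comments documenting your removal attempts. A minimal sketch (the domains below are invented):

```text
# Contacted site owner on 10/1/2012 and 10/8/2012; no response,
# links still live across the whole domain.
domain:spammy-directory.example

# Single paid link we could not get removed.
http://www.link-network.example/paid-links/page1.html
```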
-
I admit it... I panicked and disavowed a ton of domains that I didn't recognize as good links, and I went from mid-page-2 to lower-page-3. So... my question is: what happens if I submit a new disavow file that has only a fraction of the links? Or, better yet, what if I replace the disavow file altogether with a file that has only a comment saying "Oops, sorry, we didn't know what we were doing with this tool and respectfully request to undo our mess"?
A very good SEO told me not to chase after my disavow list, and I get what he's saying, but it's hard not to remember the days pre-disavow when I was at least on page 2... I was on page 1 for 14 years, from 1998 until Penguin hit! (I sell bean bag chairs and am speaking of SERPs for "bean bag chairs".)
www.ahhprods.com in case anyone is curious
Thanks!
-
So far my webmaster response is about 10%, so you have no idea how much this tool can relieve some pain after so many attempts.
To the best of my knowledge, the Disavow Tool will have absolutely no impact on your success rate. It seems designed to help ensure webmasters who have a manual penalty lifted are not affected by Penguin.
-
It's going to be interesting to see how everything works out.
I sent a reconsideration request a day or two prior to the release. I will see how that request goes, then update the spreadsheet with the new links I've removed, as well as use disavow.
So far my webmaster response is about 10%, so you have no idea how much this tool can relieve some pain after so many attempts.
-
The damage was done before I came into the picture, and there are no stats from before. The site ranks below other sites owned by the same client with much weaker linking profiles. As I mentioned, the site has the best of links, which is why I am prepared to get rid of any links that look even the slightest bit dodgy. The site should rank number 1 when you look at the competition.
I just don't have the time or budget to ask for removal; an attempt was made long ago with no success.
-
Very busy with a new project out of Arizona.
I have been following Mitt closely. He might just get over the line.
-
Thanks... Should have read the YouTube description.
-
This says to me that your site was spared an across the board penalty, but your rankings for specific keywords that have been overused in your anchor text have been suppressed. I would look at your incoming link anchor texts and see which one(s) you are no longer ranking for.
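One quick way to run that check is to tally the anchor-text distribution of a backlink export. A rough sketch (the anchor/URL pairs below are invented; real data would come from whatever backlink tool's CSV export you use):

```python
from collections import Counter

# Invented (anchor_text, linking_url) pairs standing in for a
# backlink export:
backlinks = [
    ("bean bag chairs", "http://blog1.example/post"),
    ("bean bag chairs", "http://dir.example/listing"),
    ("bean bag chairs", "http://forum.example/thread"),
    ("Ahh! Products", "http://news.example/story"),
]

# Count how often each anchor text appears; a heavy skew toward one
# exact-match commercial phrase is the pattern to look for.
anchor_counts = Counter(anchor for anchor, _ in backlinks)
total = sum(anchor_counts.values())
for anchor, n in anchor_counts.most_common():
    print(f"{anchor!r}: {n} links ({n / total:.0%})")
# prints 'bean bag chairs': 3 links (75%) then 'Ahh! Products': 1 links (25%)
```

Cross-reference the most over-represented anchors against the keywords that dropped; the overlap is where suppression is most likely.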
-
Excellent sir.
We will know pretty soon how everything shakes out once people start reporting back, but my suggestion would be: if you have a site that is not penalized, you should NOT use this tool to try to clean up spammy backlinks and improve your good-to-bad backlink ratio. The reason is that this is a tool to be used as a last resort when coming back from a penalty, for those links you tried to remove but simply cannot.
Sending this report will put eyeballs on your site and bring it unnecessary attention. Why ask Google to review your backlink profile and look at the nastiest links pointing to your site if your site is currently healthy?
An exception to this rule, I think, would be if you notice you are clearly under a negative SEO attack. Then it would make sense to be proactive.
-
Very solid analysis Ryan, good stuff.
-
Long time coming, and quite a messy interface. Why they could not do something like Bing did with their tool is a mystery.
I have a client with an unnatural link warning, saying "for this specific incident we are taking very targeted action on the unnatural links instead of your site as a whole".
To me this sounds like these links have been discounted anyhow and that the site is not punished, and maybe there is no need to do anything, but then it goes on to say "If you are able to remove any of the links, please submit a reconsideration request, including the actions that you took."
So that makes me think I do need to do something. Not very clear.
This client has a lot of very good links from CNN, NYT and a host of others, but participated in a link wheel. The blogs in this link wheel are real blogs rather than your obvious mass-made-for-links blogs, which makes it hard to identify which are which. I am thinking of disavowing anything that could be dodgy; he has such good links that I think it is better to have a few false positives than to leave any bad links in the profile.
Back to my first point, I was hoping for a click-and-job-done approach like in BWMT.
-
Nice write-up Ryan, thanks.
Looks like an aggressive tool, I can see a lot of Webmasters running into trouble with this one.
If you contact a blog to get a link removed and then realise afterwards that you have actually made a mistake, or you log in to a directory, remove the link yourself, and realise afterwards that you made a mistake, you can work on getting it back.
I get the feeling that if you don't go through the process Ryan put down and you make a mistake with the Disavow tool, you won't be able to get those links back.
-
You are so right Ryan! This tool is not a shortcut at all. I fear that a lot of webmasters who have an unnatural links warning are going to jump straight to the disavow tool and ignore the actual reconsideration request process. As Matt says in the video, you still need to make a thorough attempt at trying to get the links removed on your own in order to have a manual penalty revoked.
-
Thanks for opening this discussion Irving. I have had calls from clients today regarding this "change", and it seems many site owners are simply caught up in the idea without realizing its true impact. Resolving a manual Google penalty for manipulative links involved 4 steps prior to the release of this tool:
1. Compile a comprehensive backlink report. Many sites which suffer from a manipulative link penalty are absolutely doomed to have their Reconsideration Requests declined before they are even submitted. Why? Because they have not captured all the links to their site. You cannot rely on any single tool, or even a two-tool combination. For each client I work with, we compile a list of every known backlink to their site. How? By combining Google + Bing + OSE + Majestic + AHREFs data. Each data source offers links the others do not seem to find.
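That merge-and-dedupe step can be sketched in a few lines; the data below is invented, and in practice each list would be loaded from the CSV export of the corresponding tool:

```python
from urllib.parse import urlparse

def merge_backlink_sources(*sources):
    """Merge backlink URL lists from several tools into de-duplicated,
    sorted lists of linking URLs and linking domains."""
    urls, domains = set(), set()
    for source in sources:
        for raw in source:
            url = raw.strip()
            if not url:
                continue
            urls.add(url)
            # Normalise to the linking host so each domain appears
            # once in the outreach list, regardless of link count.
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host:
                domains.add(host)
    return sorted(urls), sorted(domains)

# Hypothetical data standing in for each tool's export:
gwt = ["http://www.free-directory.example/seo/links.html"]
ose = ["http://free-directory.example/seo/links.html",
       "http://articlespinner.example/widget-article-42"]

all_urls, all_domains = merge_backlink_sources(gwt, ose)
print(all_domains)  # ['articlespinner.example', 'free-directory.example']
```

The domain list drives the outreach spreadsheet; the full URL list is what you audit page by page.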
2. Properly identify all the manipulative links to the target site. Once again, many site owners repeatedly fail their Reconsideration Request and have no real chance at success because they try to take the easy way out. Attempting to replace real effort with fake work is what caused the penalty in the first place.
a. A thorough understanding of the difference between an organic link and a manipulative link is required. In short, you must calibrate your understanding of links to match Google's. How do you view free directory links? The reality is 99%+ of them are manipulative. How about press releases? Do you think most press releases are organic links? When site owners pay another company to publish articles they wrote with links back to their site, does that sound natural to you?
b. How about broken links? Can you use an automatic link checker and then if the link is not on the URL simply cross it off the list? In a significant percent of cases the link has simply moved to another page on the linking site. Some sites have very dynamic link structures where one day a URL is at ?page=20 and the next it is at ?page=21. Other sites make URL changes over time. You must search each site using their search widget and a Google site: search before assuming the site's link is gone.
c. Is the link marked NoFollow? You need to keep searching the page to ensure there are not other followed links on the same page.
The above are just some examples of gaps in the process of many who attempt to resolve this type of penalty. The disavow tool's introduction does not impact this step.
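The per-page part of points b and c can be partly automated. A minimal sketch using only the Python standard library (the sample HTML and domain are invented); it lists every anchor pointing at the target domain and whether each carries rel="nofollow", since one page can easily hold both a nofollowed link and a followed one:

```python
from html.parser import HTMLParser

class LinkAuditParser(HTMLParser):
    """Collect every <a href> pointing at a target domain, recording
    whether each link is marked rel="nofollow"."""
    def __init__(self, target_domain):
        super().__init__()
        self.target_domain = target_domain
        self.links = []  # (href, is_nofollow) pairs

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href") or ""
        if self.target_domain in href:
            rel = (attrs.get("rel") or "").lower()
            self.links.append((href, "nofollow" in rel))

def audit_page(html, target_domain):
    """Return all links to target_domain found in one HTML page."""
    parser = LinkAuditParser(target_domain)
    parser.feed(html)
    return parser.links

# Invented example: a nofollowed link AND a followed link to the same
# site on the same page -- stopping at the first would be a mistake.
page = '''<html><body>
<p><a rel="nofollow" href="http://target.example/page">sponsor</a></p>
<div><a href="http://target.example/">blogroll</a></div>
</body></html>'''

for href, nofollow in audit_page(page, "target.example"):
    print(href, "NOFOLLOW" if nofollow else "FOLLOWED")
```

This only checks the fetched HTML; as noted above, you would still run the linking site's own search and a Google site: search before concluding a link is truly gone.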
3. Webmaster Outreach. Once you have a comprehensive list of all known links to your site and have properly identified all the manipulative links, there is a need to contact every site on the list. Another common issue is that those attempting to resolve a manipulative link issue give up far too easily. Site owners can be contacted via their WHOIS email address, the email address on their site AND the contact form on their site. You can call them, send a letter and chase them down on social networks. This type of sincere effort can lead to a 50%+ reduction in links to your site.
Once sincere and comprehensive efforts have been made to remove the links, Google can clearly tell because there will be a "significant" reduction in manipulative links. At that point, THEN the Disavow tool can be used.
4. File a thoroughly documented Reconsideration Request. Three days after the disavow file is uploaded, the Reconsideration Request can be submitted.
So the introduction of this tool did not actually remove any step in the process. Matt clearly outlined Google's expectation that the tool only be used after a webmaster outreach campaign has been completed. If you expect to be able to simply submit a list of links without webmaster outreach, you are likely going to be disappointed.
Watch the first 2 minutes of the video a few times. Matt clearly says "...when you have contacted each webmaster multiple times....and there are only a small fraction of links left....that is when you can use the tool."
-
Nice summary at SEWatch:
http://searchenginewatch.com/article/2217602/Google-Disavow-Links-Tool-Now-Available
I'm curious about the "Most sites shouldn't use this tool... use caution" caveat from Cutts. I've basically got only one client out of many that I'd even need to consider using this for. But I can't help but imagine hyper-paranoid SEOs trying to massage their link profile down to the last drop of relevance. My gut feeling is that this is a "last resort" tool, not an "everyday SEO" tool.
-
Based on what Matt said, it sounds like Google only wants you to use the tool for links that you've tried to remove manually but couldn't. My guess is they may ignore your disavowals if you rely too much on the tool.
-
The links are in the YouTube video description.
"Access the feature here:
https://www.google.com/webmasters/tools/disavow-links-main"
-
Cutts did not indicate how to access the tool. Did I miss that?
-
Thanks for the heads up. Just watched the video.
-
Really looking forward to this tool... but Joey's question is really important. Does anyone have suggestions?
-
Awesome tool. How can I tell which links to my site I should disavow? We get a bunch of random links per day that look spammy, but how can I tell for sure that removing them will help rather than hurt?