Disavow Links & Paid Link Removal (discussion)
-
Hey everyone,
We've been talking about this issue a bit over the last week in our office, and I wanted to bring it to the Moz community to see if anyone has additional perspective. Let me break down the scenario:
- We're in the process of cleaning up the link profile for a new client, which contains many low-quality SEO directory links placed by a previous vendor.
- Recently, we made contact with a webmaster who controls a huge directory network. This person found 100+ links to our client's site on their network and wants $5/link to have them removed.
- The client was not hit with a manual penalty, so this clean-up could be considered proactive, but an algorithmic 'penalty' is suspected based on historical keyword rankings.
**The Issue:** We can pay this ninja $800+ to have him/her remove the links from his directory network, and hope it does the trick. When we talk about scaling this tactic to multiple clients, we run into some ridiculously high numbers.
**The Silver Lining:** The disavow links file. I'm curious how effective building one around the 100+ directory links could be, especially since the client hasn't been slapped with a manual penalty.
**The Debate:** Is putting a disavow file together a better alternative to paying for crappy links to be removed? Are we actually solving the bad link problem by disavowing, or just patching it? Would choosing not to pay ridiculous fees and submitting a disavow file for these links be considered a "good faith effort" in Google's eyes (especially considering there has been no manual penalty assessed)?
-
Definitely just disavow. John Mueller from Google said in a hangout that you should not pay for link removal unless, for some reason, you feel you have inconvenienced the site owner and ought to pay to have the link taken down. In the same hangout, another Google employee, Mariya, said, "No! Don't pay for link removal! That's what the disavow tool is for." I've transcribed the video and given my thoughts on it here: http://www.hiswebmarketing.com/should-you-pay-for-link-removal/
-
Totally agree with everyone here. I wouldn't, under any circumstances, pay for a link to be removed. I was reading a blog post from Google about this the other day: http://googlewebmastercentral.blogspot.co.uk/2012/07/new-notifications-about-inbound-links.html
Matt Cutts says in the post "In a few situations, we have heard about directories or blog networks that won't take links down. If a website tries to charge you to put links up and to take links down, feel free to let us know about that, either in your reconsideration request or by mentioning it on our webmaster forum or in a separate spam report. We have taken action on several such sites, because they often turn out to be doing link spamming themselves."
Google are good at spotting these types of links and not counting them, especially if there's an otherwise strong backlink profile. I'd just disavow at the domain level.
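If it helps, here's a minimal sketch of what that could look like. The directory domains and the little helper script are purely hypothetical, but the output follows Google's disavow file format: a plain .txt file where lines starting with `#` are comments and `domain:` entries cover every link from that host.

```python
from urllib.parse import urlparse

# Hypothetical example links -- stand-ins for the 100+ spammy directory URLs
# you'd export from your link research tool of choice.
bad_links = [
    "http://cheap-seo-directory.example/listings/client-site.html",
    "http://another-web-directory.example/category/42/client-site",
    "http://cheap-seo-directory.example/listings/client-site-2.html",
]

# Disavow at the domain level: one "domain:" entry per unique host covers
# every current and future link from that directory.
domains = sorted({urlparse(link).hostname for link in bad_links})

with open("disavow.txt", "w", encoding="utf-8") as f:
    # Lines beginning with "#" are comments and are ignored by Google.
    f.write("# Low-quality directory links placed by a previous vendor.\n")
    f.write("# Webmaster demanded payment for removal; disavowing instead.\n")
    for domain in domains:
        f.write(f"domain:{domain}\n")
```

You'd then upload the resulting disavow.txt through the Disavow Links tool for the client's site in Webmaster Tools.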
-
Thanks Rand,
I appreciate the feedback. I think our approach to this issue is clearer now - we'll include some documentation to hopefully prevent others from being extorted.
-
Definitely agree with Rand. When you submit your request, send Google a note saying that the person is trying to get you to pay to have the links removed, possibly even including the email/text in which he asked you to pay. I doubt it will take them long to respond. I would NOT pay the person a dime. Submitting the request via the client's webmaster account should take care of the damage.
"That still leaves the issue of returning keyword rankings back to 'normal'. I'm still wondering what effect physically removing the links (and coughing up the cash) would have versus submitting a disavow file for all low quality directories in the client's profile."
Google's disavow tool is made for this. Otherwise, a competitor could submit your site to as many bad places as they wanted, and there wouldn't be anything you could do about it. As long as you submit a complete report of all the links in question, you should be fine.
"We can pay this ninja $800+ to have him/her remove the links from his directory network, and hope it does the trick."
Ninja? More like a clown, lol.
-
Yeah, disavowing should have the same effect as if the links were removed, so you're better off submitting the disavow.
-
Hey William,
Thanks for the reply. The disavow option seems to be pretty popular from what I've gathered so far - I agree with you about the financial part of the process feeling a little extort-y.
That still leaves the issue of returning keyword rankings to 'normal'. I'm still wondering what effect physically removing the links (and coughing up the cash) would have versus submitting a disavow file for all the low-quality directories in the client's profile. Presuming most of the directories have already been algorithmically adjusted to pass almost no SEO value, that seems like another point in favor of the disavow route.
-
I'm in agreement with William. If you proactively submit the disavow file, you should be protected. I'd also think about sending a note via Webmaster Tools to let Google know about the network and that this person is extorting you/your site by demanding payment to remove links. That may help others who might otherwise be penalized in the future because they refuse to pay (and paying it forward like that is a great way to serve the web community and discourage future spam extortionists).
-
Just disavow. Don't let people like this extort you. If you want him to remove the links for free, tell him you're not going to pay, and that instead you're going to submit a disavow file flagging his entire network to Google as unwanted links. You made a good-faith effort by contacting the webmaster, but being extorted goes beyond good faith.