Manual Removal Request Versus Automated Request to Remove Bad Links
-
Our site has several hundred toxic links. We would prefer that the webmaster remove them rather than submitting a disavow file to Google.
Are we better off writing webmasters over and over again to get the links removed? If someone is monitoring the removals and keeps writing the webmasters, will this ultimately get better results than using an automated program like Link Detox to process the requests? Or is this the type of request that will be ignored no matter what we do and how we ask?
I am willing to invest in the manual labor, but only if there is some chance of a favorable outcome.
Does anyone have experience with this? Basically how to get the highest compliance rate for link removal requests?
Thanks, Alan
-
I agree with Moosa here. When we went through this, we used Link Detox to help identify the links we wanted to remove or disavow and RMOOV to send an automated email campaign. The response rate was less than 5% as I recall, and it usually took multiple emails to get a response at all.
That's the nice thing about the tools: they track success for you. It's also a really good idea to use a "throwaway" email address, as many of these messages may be reported by the recipients as spam and get your email account added to spam filters. I think the personal touch is more for outreach; it's not worth the effort here.
Best!
-
Alan, if I were in your place, I would move to a program like Link Detox instead of manual labor, and here are some reasons why:
- You are emailing real people, so no matter what approach you use, there is a chance you will fail, especially if they have decided not to remove the links.
- The removal ratio can increase dramatically if you offer a small payment to remove a link, but again, disavow is a better and easier option that will save you time and money.
- Manual labor on work that might or might not pay off is a bad investment in my opinion; on top of that, manual labor will be much more expensive compared to a tool like Link Detox.
Link Detox will find the bad links, email the webmasters, and give you a list of the bad links pointing to your website. You can take that data, create a disavow file, and submit it to Google.
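For reference, a disavow file is just a UTF-8 plain-text file with one `domain:` rule or URL per line and `#` comment lines. A minimal Python sketch of building one from a list of bad links (the file name, domains, and URLs here are hypothetical examples, not from this thread):

```python
# Build a Google disavow file from lists of bad domains and URLs.
# The example domains/URLs below are made up for illustration.

def build_disavow(bad_domains, bad_urls, note="Removal requests sent, no response"):
    lines = [f"# {note}"]
    # A domain: rule covers every URL on that site.
    for d in sorted(set(bad_domains)):
        lines.append(f"domain:{d}")
    # Individual URLs can be listed as-is.
    for u in sorted(set(bad_urls)):
        lines.append(u)
    return "\n".join(lines) + "\n"

text = build_disavow(
    bad_domains=["spammy-directory.example", "paid-links.example"],
    bad_urls=["http://blog.example/bad-post.html"],
)
# Google expects a plain-text UTF-8 upload, e.g. disavow.txt
with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write(text)
```

With hundreds of toxic links, disavowing at the domain level is usually safer than listing individual URLs, since it also catches pages the tools didn't find.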
All in all, I understand your point, but in my opinion it is not a very good investment.
Hope this helps!
-
Hi Alan
When I pull links, I do so from WMT, Majestic, OSE, and Ahrefs.
Reason being, you're going to see different links from different tools. No one source covers them all, so it's best to get as much data as you can from different places.
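Combining those exports is mostly a matter of normalizing and de-duplicating the linking domains. A rough Python sketch, assuming each tool's export has already been reduced to a plain list of backlink URLs (the variable names and example URLs are made up):

```python
from urllib.parse import urlparse

def linking_domains(url_lists):
    """Merge backlink URL lists from several tools into one sorted set
    of unique linking domains (lowercased, leading 'www.' stripped)."""
    domains = set()
    for urls in url_lists:
        for url in urls:
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host:
                domains.add(host)
    return sorted(domains)

# Hypothetical exports from two of the tools:
wmt    = ["http://www.blog.example/post1", "http://dir.example/listing"]
ahrefs = ["https://blog.example/post2", "http://forum.example/thread"]
print(linking_domains([wmt, ahrefs]))
# → ['blog.example', 'dir.example', 'forum.example']
```

The overlap between tools is usually large, so de-duplicating by domain first keeps the review list manageable.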
I will read into Link Detox and tell you if anything is a red flag to me, but again, your statement from the other question thread sounds like a lot of money for automation and "too good to be true".
Please let me know if you have any more questions or comments - would love to help where I can and see you through! Best of luck!
-
Hi Patrick:
Thanks for your in-depth response!! The expedite tool in Link Detox, Boost, is described here: http://www.linkdetox.com/boost.
But if Google will now process disavow files in a few months, as the Moz blog post you refer to states, I guess there is no point in using Boost.
Our site never received a manual penalty from Google but did drop in ranking after the first Penguin update in April 2012. Recovery since then has been sporadic and uneven despite a major investment in SEO.
I have pretty much followed the procedure you describe. The only deviation is that I compiled the links from Google Webmaster Tools plus the Link Detox database. I wonder if we are missing a significant number of links by not sourcing Ahrefs and Moz. If I can identify 80-90% of the bad links, I think that is sufficient; I don't expect to remove 100% of them.
Thanks again for your assistance!!
Alan
-
Hi there
Based on some previous work I have done, webmasters are substantially more responsive to manual outreach and can definitely tell the difference.
Always include:
- Their name, both in the subject line and the greeting (I like "Attn: (name) / Link Removal Request")
- Their site's domain name
- Links to pages with examples of your link
- Thanks for their time
- A signature with proper contact information
Always respond to emails - good, bad, or indifferent - people respond to a real human being. Thank them for removals, respond kindly to apprehension or irritability, and answer (within reason) any questions they may have. Do not be hostile back. I would usually send three emails:
1. Stating my reason for reaching out and where my link is located.
2. If I didn't hear back, about four days later I would follow up, again letting them know where my link is located.
3. If I didn't hear back, about 3-5 days later I would let them know that this would be my last email before disavowing their link.
Usually, I didn't make it to three. Remember to document and keep records of your outreach in case you somehow get a manual action - you'll need them.
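Keeping those records doesn't need to be complicated. A minimal Python sketch of the kind of follow-up tracking an outreach tool would do for you, with hypothetical contacts and dates:

```python
from datetime import date

# Each record tracks one webmaster contact through the three-email sequence.
outreach = [
    {"domain": "spammy-blog.example", "emails_sent": 2,
     "last_contact": date(2014, 5, 1), "removed": False},
    {"domain": "old-directory.example", "emails_sent": 1,
     "last_contact": date(2014, 5, 5), "removed": True},
]

def due_for_followup(records, today, wait_days=4, max_emails=3):
    """Contacts that haven't removed the link, haven't hit the email
    cap, and whose last email is older than the wait period."""
    return [r["domain"] for r in records
            if not r["removed"]
            and r["emails_sent"] < max_emails
            and (today - r["last_contact"]).days >= wait_days]

print(due_for_followup(outreach, today=date(2014, 5, 6)))
# → ['spammy-blog.example']
```

Anyone who reaches the email cap without responding simply moves to the disavow list, and the log itself is your evidence of a good-faith cleanup effort.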
Here is a great link removal resource: Link Audit Guide for Effective Link Removals & Risk Mitigation (Moz).
Always consider disavow files a tool and a friend - they do work. If you can't get links removed and you fear a manual action, they will be your next line of defense, especially if you are dealing with hundreds of bad links.
Take the time to manually reach out to webmasters if you can - it will pay off. I also want to suggest LinkRisk as another tool to look into for your link audits and outreach. It has been a big help for me.
Hope this helps! Good luck!