Disavow Experts: Here's one for ya ....
-
Not sure how to handle this one. Simply because there are SO MANY .... I want to be careful not to do something stupid ...
Just a quick 3 minute video explanation: https://youtu.be/bVHUWTGH21E
I'm interested in several opinions so if someone replies - please still chime in.
Thanks.
-
No problem at all, happy to help. Unfortunately, the best tools we have to evaluate these, like Open Site Explorer, try to emulate how Google looks at links, but they're imperfect for the very same reason I can't possibly give you a definitive answer: Google doesn't want us to know!
Unfortunately, the only way we can ever know the outcome is to implement the change and see if the rankings get better or worse - welcome to the struggles of SEO!
If you really can't afford to be taking a hit right now but it would be more acceptable in a month or two (e.g. right now is your busiest period) I'd be inclined to wait. Otherwise, it's a tough call but I'd still lean toward having them removed. Don't forget that Google has been promising a Penguin (backlinks) update "very soon" all year! If that damn update finally rolls out tomorrow you may find yourself getting slammed by it... or it could roll out next year... or maybe it'll roll out and you'll be fine. Sigh.
We have had success doing it steadily with one of our larger clients who were in a similar situation, and the results were as good as we could have hoped for, but YMMV. We essentially did the removal in stages: we divided the bad domains into batches, contacted the first batch to request removal, then disavowed whatever was left.
While all this was happening we also got to work building quality links to the site as well so they roughly cancelled each other out. Then we did the same thing with the other batches of bad links until we'd been through the lot.
For us, the end result was a series of fairly marginal peaks and troughs that directly correlated with link removal and link acquisition so the net position at any given time was approximately the same. I must stress though that YMMV here - since I have a total data sample of 2 domains (this client has 2 companies/sites), it's impossible for me to say with absolute certainty that what I saw is the direct result of our process.
-
Thank you so much. So that leaves the most important question: how do I know if these are benefiting me? I really can't afford to lose rankings right now, as we are in this situation due to already-ruined rankings for an unknown reason. There are about 300 of them in total, and we have roughly 2,000 unique domains linking to us, so it's a decent chunk.
Ironically their domain authority is "44" and mine is "45" .... the site has been online 16 years (with nary a design update apparently) ... their Moz domain authority is 37 whereas mine is 38. So ... I'm not sure if these guys are viewed terribly by Google or not...
There must be some way to ascertain what Google thinks of this site and its links... ?
-
The horrible thing about link removal is that it's often hard to give an accurate answer to this question. On one hand, directories, link farms etc are often ignored by search engines so having them may be doing you no harm. On the other hand, it's impossible to know if the specific domains you're looking at are actually being ignored or not.
In these scenarios I tend to lean towards having them removed anyway, just in case they are being counted. As you pointed out, there is a chance that removing them will strip some strength from your site and see you drop in rankings, but since it's impossible to tell the outcome until it's too late, I'd rather risk losing value by removing bad links than risk a penalty for keeping them.
There are a few things you can do to make your life marginally easier here:
- Contact the site and ask them nicely to remove the links. They do have a phone number on the contact page; you'd be surprised how powerful a phone conversation can be versus yet another generic email.
- Export the list of referring domains (rather than individual links) and bulk-categorise in Excel as much as possible. Filter for words like fasthealth, seo, link, directory/directories etc. and highlight them all for removal.
- Disavow by domain rather than by individual link. All you have to change in the disavow file is to add domain: to the beginning of each line, for example: domain:website.com.
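For reference, a minimal domain-level disavow file might look like the sketch below. The domain names and comment text are placeholders; lines starting with # are comments that Google ignores, and plain URLs can be mixed in when you only want to disavow a single page:

```text
# Spammy directories; removal requested by email, no response
domain:spammy-directory.example
domain:bad-links.example

# One-off URL disavow is still possible alongside domain entries
http://www.example.com/spam/page.html
```

You upload this as a plain .txt file through the disavow tool in Search Console; each new upload replaces the previous file entirely, so keep your master copy somewhere safe.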
If you do decide to give them a call or even email them, the best angle I've found is "I'm cleaning up the links in accordance with Google guidelines and have to be very picky with the ones I keep; this is no reflection on the quality of your site but I'd really appreciate it if you can remove them". Far more likely to get results than the attitude some people take of "hey scumbag, your horrible site is ruining my rankings, get rid of these spam links".
Also, the reason I say to export, evaluate and disavow at a domain level is simply a matter of volume. Rather than 20,000 spam links, you may only end up having to sift through 200 referring domains instead; far easier to manage. In my experience it's pretty rare that you'd want to disavow just one link from a site like these so doing it at the domain level disavows them all and protects you if they decide to change their URL structure in the future. A new URL structure would give you a link from a "new page" in the eyes of the search engine.
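The export-and-categorise step can also be scripted rather than done by hand in Excel. Here's a minimal sketch, assuming you've exported your referring domains as a plain list; the keyword list and domain names are illustrative, and anything it flags should still get a manual once-over before it goes in the disavow file, since naive substring matching will also catch legitimate sites:

```python
# Hypothetical sketch: bulk-flag referring domains for disavow review.
# The hint words mirror the examples above; tune them to your own profile.
SPAM_HINTS = ["fasthealth", "seo", "link", "directory", "directories"]

def flag_domains(domains, hints=SPAM_HINTS):
    """Split domains into (suspect, keep) lists via keyword matching."""
    suspect, keep = [], []
    for domain in domains:
        d = domain.strip().lower()
        if any(hint in d for hint in hints):
            suspect.append(d)
        else:
            keep.append(d)
    return suspect, keep

def to_disavow_lines(suspect_domains):
    """Format suspect domains as domain-level disavow entries."""
    return [f"domain:{d}" for d in suspect_domains]

if __name__ == "__main__":
    referrers = ["spammy-seo-links.com", "Example-Directory.NET", "realnews.com"]
    suspect, keep = flag_domains(referrers)
    print("\n".join(to_disavow_lines(suspect)))
```

Even at a few thousand referring domains this runs instantly, and working at the domain level keeps the review list down to hundreds of rows instead of tens of thousands of individual links.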