Is removing poor domain authority backlinks worth it?
-
Hey Moz,
I am working with a client on more advanced SEO tactics. This client has a respectable domain authority of 67 and 50,000+ backlinks.
We want to continue our SEO efforts and stay on top of any bad backlinks that may arise.
Would it be worth asking websites (below 20 domain authority) to remove our links, then using the disavow tool if they do not respond?
Is this a common SEO practice for continued advanced efforts? Also, what would your domain authority benchmark be? I used 20 just as an example.
Thanks so much for your help.
Cole
-
Awesome responses, guys. Anyone else have any other insight?
-
I updated my response while you were writing yours.
I don't doubt your insight. But The Googles doesn't sleep.
When you're doing a local campaign with strictly above-board links, you should move as fast as possible.
-
That would be bad.
You should follow the rough 10-80-10 rule, whether you are building 10 links or 10,000 links. And you should always do it slowly.
I agree there are no specific percentages. You have to look at the big picture over a long period of time.
-
Let's say someone reads this and decides to get their first 10% in the crappy category. That would not be good for them. Further, there aren't any specific percentages that I'm aware of.
Yes, The Googles does have to pick the best of the worst. I'm not in doubt of that.
Yes, sometimes you inherit a mess and it still seems to work. But manual reviews happen.
-
Big picture: What a good "problem" to have!
Without taking a close look at your specific URL...
...my first instinct is that the answer to your question is almost certainly a giant...
**No. DO THE HARD THING: NOTHING!** There is a real danger of overthinking this stuff and neglecting the fundamentals.
I faced the same issue with a DA 72 site for a leading SME in his field who had 450,000+ backlinks... some from major media outlets and universities, but most from "nobodies" in the field. This is good!
What you want is a classic inverted U-shaped curve in terms of DA:
-
- 10% crappy links
- 80% middling links
- 10% super high-quality links
You mess with this at your peril! "Bad" links are not necessarily bad in the grand scheme of the universe. Every credible and authoritative site has some; they are part of a natural link profile.
Getting rid of the sub-20 DA links could hurt... badly.
Focusing excessively on tweaking or sculpting the middling 80% of your links is probably a mistake. You could shoot yourself in the foot.
Less is more.
It might be better to just keep doing what you're doing.
This is hard...and requires great discipline!
-
-
Happy to be contrary. Another good thing about Link Detox is that the service has been trained - mostly for the good - by users manually reviewing the quality of their links. If easylinkseodirectory4u.com has been flagged enough, it's more likely to get caught by the machine.
Once you have uploaded your list and reviewed the links, you will get a pretty accurate risk rating on a scale from low to high. I don't think Link Detox has ever given me a false Toxic rating on an individual link either.
I'm not a client scalper, so if you would like to PM me the domain name, I can take a look.
-
Excellent, quality response. Thanks so much.
I would love to hear from any disavow experts, and maybe even what their services cost (of course, I don't want to break any Moz rules that may apply).
Cole
-
Setting a DA cut-off from the outset is a bit too arbitrary. What if it's a link from a site with low DA and a low PA now, but later the site becomes the next New York Times? You don't want to disavow the next New York Times, but that's what an arbitrary number would have you do.
Further, DA and PA can be gamed to a certain extent. I'm sure Rap Genius has a pretty solid DA, but they were penalized all the same. So it would appear that using DA as a cut-off would be less than ideal.
There's no real easy way to do a disavow. You have to think about characteristics, context and intent. If you have links that pass juice but were obviously paid, those may be candidates. If there's a vast preponderance of links from seemingly low-quality directories with exact-match anchor text, those are candidates for closer scrutiny as well. Dead giveaways are usually 'sponsored' links that pass juice.
Low-quality directories usually let everyone in. You will know them by their viagra and casino anchor text. They're usually a pretty safe disavow candidate.
Does the site have a lot of links from spam blog comments from sites that are obviously unrelated? Has there been some guest blogging on free for all blogs? Those links would require some review as well.
Definitely prioritize your exact-match anchor text links for review.
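If it helps, here is a minimal sketch of that kind of triage in Python. Everything in it is an assumption: it expects a hypothetical export named backlinks.csv with source_url, anchor_text and target_url columns, and the money keywords and spam terms are made up, so swap in whatever your actual export and target terms look like.

```python
import csv

# Made-up money keywords; exact-match anchors on these deserve a closer look.
MONEY_KEYWORDS = {"best blue widgets", "buy blue widgets online"}

# Obvious spam markers that tend to show up in low-quality directory neighbourhoods.
SPAM_TERMS = ("viagra", "casino", "payday")


def review_reasons(row):
    """Return the reasons (if any) a backlink deserves manual review."""
    reasons = []
    anchor = row["anchor_text"].strip().lower()
    source = row["source_url"].lower()

    if anchor in MONEY_KEYWORDS:
        reasons.append("exact-match anchor")
    if any(term in anchor or term in source for term in SPAM_TERMS):
        reasons.append("spam term")
    if "directory" in source:
        reasons.append("possible low-quality directory")
    return reasons


with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        reasons = review_reasons(row)
        if reasons:
            print(f"{row['source_url']} -> {row['target_url']}: {', '.join(reasons)}")
```

None of this replaces the manual review; it just shortens the list you have to eyeball.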
I would suggest you start by gathering link data from numerous sources:
- Google Webmaster Tools
- Bing Webmaster Tools
- Ahrefs
- Majestic SEO
- Etc.
Then filter out the duplicates via spreadsheet voodoo. After that, drop the list into a service like Link Detox. But be careful: it still throws false positives and false negatives, so there's no real way of getting out of a manual review. Link Detox will just speed up the process.
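If the spreadsheet voodoo gets tedious, a few lines of Python will do the same merge-and-dedupe. This is only a sketch: the file names and URL column headings below are guesses, so rename them to match whatever each tool actually exports. It dedupes on the linking domain rather than the individual URL.

```python
import csv
from urllib.parse import urlparse

# Map each export file to the column that holds the linking URL.
# Both file names and column headings are guesses; adjust to your real exports.
EXPORTS = {
    "gwt_links.csv": "Linking page",
    "bing_links.csv": "Source Url",
    "ahrefs_links.csv": "Referring Page URL",
    "majestic_links.csv": "SourceURL",
}

seen_domains = set()
unique_links = []

for path, url_column in EXPORTS.items():
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row.get(url_column, "").strip()
            if not url:
                continue
            # Normalise to the bare linking domain before deduping.
            domain = urlparse(url).netloc.lower().removeprefix("www.")
            if domain and domain not in seen_domains:
                seen_domains.add(domain)
                unique_links.append(url)

with open("combined_links.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["linking_url"])
    writer.writerows([link] for link in unique_links)

print(f"{len(unique_links)} unique linking domains across {len(EXPORTS)} exports")
```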
Are there plenty of disavow services out there? Sure, but I've never used them. I'm far too paranoid. A disavow is a delicate and lengthy process.
Are there some great disavow pros/individuals out there? Definitely. I would be far more likely to trust them. In fact, a couple will likely chime in here. Though they may be a little bit outside the budget. I don't know.
One final, important point: a disavow is not a panacea. They take as long as they take. Though it is good that you appear to be proactive; you never know when the next Penguin filter will land. The site may be right with The Googles now, but it might not be later.
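For what it's worth, the mechanics of the upload itself are the simple part. The file Google's disavow tool accepts is a plain UTF-8 text file, one entry per line, with # for comments; you can disavow a whole domain or a single URL. The domains below are invented, purely to show the two entry types.

```text
# Backlink cleanup - requested removal, no response received
# Disavow an entire domain
domain:easylinkseodirectory4u.com
domain:spammy-article-network.example

# Disavow a single page
http://low-quality-blog.example/post/exact-match-anchor-spam
```

Upload it through the disavow links tool tied to your Webmaster Tools account, and keep a dated copy; you will almost certainly revise it later.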