Backlink quality vs quantity: Should I keep spammy backlinks?
-
Regarding backlinks, I'm wondering which is more advantageous for domain authority and Google reputation:
- Option 1: More backlinks including a lot of spammy links
- Option 2: Fewer backlinks but only reliable, non-spam links
I've researched this topic around the web a bit and understand that the answer is somewhere in the middle, but given my site's specific backlink volume, the answer might lean one way or the other.
For context, my site has a spam score of 2%, and when I did a quick backlink audit, roughly 20% are ones I want to disavow. However, I don't want to eliminate so many backlinks that my DA goes down. As always, we are working to build quality backlinks, but I'm interested in whether eliminating 20% of backlinks will hurt my DA.
Thank you!
-
Backlinks are always about quality, not quantity. Google does not like an excessive number of backlinks, especially spammy ones. I would suggest going with quality backlinks if you want long-term, sustainable results; otherwise there is always a risk of getting penalized by Google if you rely on spammy backlinks.
-
It's a myth that your DA drops because you put links in a disavow file. Disavow is a Google-only (or Bing) tool for situations where, say, you get spammy links from a rogue domain and there's no way you can get them removed.
Moz can't read the disavow file you submit to Google, so I'm not sure how the two are being connected here. Moz, like any other tool, simply counts your incoming followed links and estimates your DA with its own scoring formula. That's all there is to it. Again, PA/DA has nothing in common with Google, since Google maintains its own algorithm.
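To make that concrete, here's a rough sketch (not Moz's actual algorithm; the CSV file name, column names, and spam-score threshold are all assumptions for illustration) of how you could tally followed links from a backlink export and see what share of them you'd actually be disavowing:

```python
import csv

# Hypothetical backlink export from a link-research tool.
# Assumed columns: "source_url", "follow" ("true"/"false"), "spam_score" (0-100).
followed = 0
flagged = 0

with open("backlink_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["follow"].strip().lower() != "true":
            continue  # nofollowed links pass little or no equity, so skip them
        followed += 1
        if int(row["spam_score"]) >= 60:  # arbitrary "spammy" cutoff for this sketch
            flagged += 1

print(f"Followed backlinks: {followed}")
print(f"Flagged as spammy:  {flagged} ({flagged / max(followed, 1):.0%} of followed links)")
```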
-
Hello again,
Thanks for the clarification and the link. I've read through that and a few other sources across the web, but none of them answered my question the way you did, so thanks! Our backlink profile is a pretty balanced mix of spammy and definitely-not-spammy links, so I'm not super concerned about it, but I appreciate the reminder.
-
I should also clarify: these may hurt you if they are your only links. If you have very few quality, equity-passing links, Google and other search engines may falsely flag you as spam. So just be careful and stay on the lookout for especially suspicious spam links. The balanced approach is the best approach: don't worry, but stay aware!
Here is a more technical write-up from Moz that I recommend: https://moz.com/help/link-explorer/link-building/spam-score
-
No problem, Liana.
- That is correct. Google understands that you don't have control over third-party sites, so instead of penalizing you, it minimizes or removes the effect those spam links have.
- Yes, but only sort of. It may or may not increase PA/DA, but according to Google it shouldn't hurt you.
But yeah, that's the gist of it! Instead of spending time investigating and disavowing links, you could spend that time cultivating relationships with other websites and businesses that could give you nice, quality links.
Hope this answer works for you.
-
Hello Advanced Air Ambulance SEO!
Thanks for the quick and thorough response. Please confirm if I understand you correctly:
- I can leave spammy backlinks alone (not spend time disavowing them) _unless_ I see a manual action in Search Console, which would indicate that Google sees an issue and is penalizing my site until I disavow the links. Without this manual action, there's no indication that the spam links are hurting my rankings or DA.
- Leaving spammy backlinks that don't incur a manual action may actually increase DA since leaving them maintains a higher volume of backlinks (albeit some spammy), and backlink quantity is a contributor to DA.
Thank you!
-
Hi Liana,
As far as spammy links go, Google has gotten good at detecting whether or not they are intentional, i.e., black hat. If they aren't, Google does not penalize you for those links, so it's best to leave them alone.
As far as a strategy for generating links to your website, you should always favor high quality over high quantity. High-quality links give you far more return than a large volume of bad links.
I recommend this article from Google on when and how to disavow links:
https://support.google.com/webmasters/answer/2648487?hl=en
In short, you rarely ever need to disavow links, even if they have a high spam score. You are only hurt when Google senses you are gaming the system; if it detects or suspects unethical backlinking, you will be penalized with a "manual action". You can check whether you were penalized, and disavow the flagged backlinks, in Google Search Console.
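If you do end up needing to disavow, the file Google expects is just plain text: one entry per line, a `domain:` prefix to disavow a whole domain, a full URL to disavow a single page, and `#` for comment lines. Here is a minimal sketch of building that file (the domains and URLs are made up) before uploading it through Search Console's disavow tool:

```python
# Hypothetical domains and URLs chosen for disavowal after a manual review.
spammy_domains = ["spam-link-network.example", "rogue-directory.example"]
spammy_urls = ["http://shady-blog.example/cheap-links.html"]

lines = ["# Disavowed after a manual backlink review"]
lines += [f"domain:{d}" for d in spammy_domains]  # disavow entire domains
lines += spammy_urls                              # disavow individual pages

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```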