DA vs. Relevancy - Trade-Off Question
-
Hey Guys
We all know that relevancy largely trumps DA nowadays.
What I am wondering is whether there is a DA 'level' at which relevancy doesn't really matter - a point where you probably still want a backlink from that site regardless...
For example, we probably all want backlinks from sites with a DA of 100.
So where do you draw the line? In other words, for a high-DA 'non-relevant' site, what DA is 'acceptable' enough that you start to disregard relevancy? I'm thinking something like 70 and above, but I'd like some other thoughts...
Obviously you would still be building relevant links too, developing content to do so and all that good stuff. I am just wondering what DA I should focus on for building non-relevant links ALONGSIDE relevant links.
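To put what I mean in concrete terms, here's a rough sketch of the rule I'm imagining - relevant links are always pursued, and non-relevant links only above some DA cutoff. The record fields and the 70 threshold are just placeholders for discussion, not anything Moz defines:

```python
from dataclasses import dataclass

@dataclass
class LinkProspect:
    domain: str
    da: int          # Moz Domain Authority, 0-100
    relevant: bool   # is the linking site niche-relevant to mine?

DA_THRESHOLD = 70  # the hypothetical cutoff proposed above

def worth_pursuing(p: LinkProspect) -> bool:
    """Relevant links are always worth pursuing; non-relevant links
    only when DA clears the cutoff."""
    return p.relevant or p.da >= DA_THRESHOLD

prospects = [
    LinkProspect("niche-blog.example", da=35, relevant=True),    # pursue: relevant
    LinkProspect("big-portal.example", da=85, relevant=False),   # pursue: high DA
    LinkProspect("random-site.example", da=55, relevant=False),  # skip
]
for p in prospects:
    print(f"{p.domain}: {'pursue' if worth_pursuing(p) else 'skip'}")
```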
Thanks
-
I submitted a request through The Guild to find out why my original answer (reposted below) was not delivered to you. Sorry about that.
As to your question about average Moz Domain Authority: DA is built from "MozRank," "MozTrust," and your overall link profile. Of those three, the one we can most readily quantify is MozRank, and Moz says the average MozRank (on its 1-10 scale) for a page is about 3.
Taking just that information, it's easier to see why average Domain Authority sits at the lower end (30-40) rather than at 50 or higher. It also explains why it is easier to move from DA 30 to 40 than from 70 to 80.
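To illustrate why the top of the scale is so much harder to climb, here's a toy model that assumes a roughly exponential cost curve - the base constant is made up purely to show the shape of the curve, not Moz's actual formula:

```python
def toy_link_strength(da: int, base: float = 1.15) -> float:
    """Toy model: cumulative 'link strength' needed to reach a DA score,
    assuming an exponential relationship. The base is an arbitrary
    illustrative constant, NOT Moz's real calculation."""
    return base ** da

for lo, hi in [(30, 40), (70, 80)]:
    effort = toy_link_strength(hi) - toy_link_strength(lo)
    print(f"DA {lo} -> {hi}: relative effort ~{effort:,.0f}")

# Prints roughly:
#   DA 30 -> 40: relative effort ~202
#   DA 70 -> 80: relative effort ~54,015
# The same 10-point jump costs orders of magnitude more near the top.
```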
Good luck with the above - let me know if I can provide any other help.
-
Weird, I didn't get anything over there - I just checked and it's still not on the thread or anywhere else I can see...
I posted the question on a few forums because I wanted to see what the response was like, as a sort of 'test' of which forum I should spend my time on.
Anyway, thanks for the response. I was not aware so many sites below 40 existed - I really had no reference point for what counts as an above-average DA (I would naturally have assumed above 50!).
I will check out your articles - thanks again!
-
Hi Michael,
You submitted this exact same question over at The Guild (SEO Chat). Did you receive that answer?
Here's that answer again, including links to the subscriber-only materials you have access to. Let me know if you can't access anything or if you have any clarifying questions:
First, if you haven't seen this month's article on Domain Authority, I'd recommend reviewing it here:
https://guild.seochat.com/se-news/content/looking-to-boost-your-domain-authority-here-are-some-tips
As for your question itself, honestly, there is no right answer. Speaking as a link builder myself, I'm ALWAYS going to choose a lower-DA link target that is niche-relevant over a higher-DA site that is not - that's a qualified linking target. Just as I would always choose a link on a page that is actually going to get clicked over a link on a higher-DA page that sends me NO traffic.
So there really isn't a quantifiable answer to your question. If you want to use DA 70 as your make-or-break number, awesome. But considering that the VAST majority of sites have a DA score of 40 or less, 70 personally seems awfully high as a preconceived standard for evaluation.
Make sure to read my detailed article on link evaluation here:
When I'm evaluating a link, I honestly don't review DA until the very end. And even then, if a link isn't going to get clicked or send traffic, that is a HUGE deal to me and I devalue the link immediately.
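To make that evaluation order concrete, here's a minimal sketch of the priority I'm describing - relevance and expected traffic are checked first and can disqualify a link outright, and DA is only consulted at the very end as a tie-breaker. The field names, traffic estimates, and scoring weights are purely hypothetical illustrations, not a real tool:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Link:
    url: str
    niche_relevant: bool
    est_monthly_clicks: int  # hypothetical referral-traffic estimate
    da: int                  # Moz Domain Authority - deliberately checked LAST

def evaluate(link: Link) -> Optional[float]:
    """Return a priority score, or None if the link is disqualified.
    Relevance and traffic gate the decision; DA only nudges the ordering."""
    if not link.niche_relevant:
        return None  # not a qualified target, regardless of DA
    if link.est_monthly_clicks == 0:
        return None  # a link nobody clicks gets devalued immediately
    return link.est_monthly_clicks + link.da / 100  # DA as a small tie-breaker

links = [
    Link("https://niche-blog.example/post", True, 40, 35),
    Link("https://big-portal.example/page", False, 500, 92),  # off-topic: disqualified
]
for item in links:
    print(item.url, evaluate(item))
```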
Hope that helps. Thanks for the question, and definitely respond (either here or at The Guild) if you'd like more information.