Suggestion - How to improve OSE metrics for DA & PA
-
I am sure everyone at Moz is aware that although the Moz link metrics (primarily DA and PA) are good, there is a lot of room for improvement, and that there are many areas where the values given to some types of site are well out of whack with what their "real" values should be.
Some examples:

www.somuch.com (Link Directory) - DA 72
www.articlesbase.com (Article Directory) - DA 89
www.ezinearticles.com (Article Directory) - DA 91

I'm sure everyone would agree that links from these domains are not as powerful (if of any value at all) as their DA would suggest, and therefore, by definition of how the Moz metrics work, the scores of sites that have links from these domains are also inflated - throwing the whole link graph out of whack.
I have two suggestions which could be used singly or in conjunction (and obviously alongside the other factors Moz uses to calculate DA and PA) to help move these values closer to what they should more realistically be.
1/. Incorporate rank values.
This effectively uses rank values to reverse engineer the "value" that Google (or another engine) places on a website. It could be achieved (if Moz were not to build the data-gathering system itself) by integrating with a company that already provides this data - e.g. Searchmetrics, SEMrush, etc. As an example, you would take a domain and pull in some rank values, e.g. http://www.semrush.com/info/somuch.com?db=us - where you could use traffic, traffic price, and traffic history as inputs to the overall Moz scoring algorithm. As you can see from my example, according to SEMrush the traffic and traffic price are extremely low for what you would expect of a website with a DA of 72. You will find the same for the other two sites, and likewise for pretty much any other site you test. This works because you are essentially tapping into Google's own ranking factors, and are thereby more in line with the real values (according to Google) for the quality of a website. If you were to incorporate these values, I believe you could improve the Moz metrics.

2/. Social Sharing Value
Another strong indicator of quality is the amount of social sharing of a document, or of a website as a whole, and again you will find, as with my examples, that pages on these sites have low social metrics compared with what you would normally associate with sites of these DA values. Obviously, to do this you would need to pull social metrics for all the pages in your link DB. Or, if that were too technically intensive to achieve, again work with a partner such as Searchmetrics, which provides "Total Social Interactions" at the domain level. Divide this value by the number of Moz-crawled pages and you would have a crude average social score for a page on a given site.

Obviously both of the above have their flaws if looked at in complete isolation; however, in combination they could provide a robust metric to use in any algorithm, and combined with the values Moz currently uses, I believe you could make big strides in improving the overall Moz metrics.
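A minimal sketch of how the two signals above might be blended into an adjusted score. Every field name, weight, and the expected-traffic curve here are illustrative assumptions, not Moz's actual algorithm or any vendor's real API:

```javascript
// Hypothetical sketch: damp DA toward a floor when observed traffic and
// social sharing both fall far short of what the DA would predict.
// All constants and field names are made up for illustration.
function adjustedAuthority(site) {
  // How does observed traffic compare with what a site of this DA
  // "should" attract? (Made-up DA-to-traffic expectation curve.)
  const expectedTraffic = Math.pow(10, site.da / 20);
  const trafficRatio = Math.min(site.traffic / expectedTraffic, 1);

  // Crude per-page social score, as described above.
  const socialPerPage = site.totalSocialInteractions / site.crawledPages;
  const socialRatio = Math.min(socialPerPage / 0.5, 1); // assume 0.5 shares/page is "normal"

  // Weight the two signals equally and damp DA toward a floor.
  const floor = 20;
  const support = 0.5 * trafficRatio + 0.5 * socialRatio;
  return Math.round(floor + (site.da - floor) * support);
}

// A DA 72 directory with very low traffic and social sharing:
const site = { da: 72, traffic: 900, totalSocialInteractions: 4000, crawledPages: 400000 };
console.log(adjustedAuthority(site)); // well below its raw DA of 72
```

The idea is simply that a site whose traffic and sharing both support its DA keeps its score, while a site with neither gets pulled down.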
-
I'm not directly involved in the project, but I think that's actually part of what they're doing - using Google de-indexation and obvious penalties to train the system, but trying to avoid a system that would have to go look up the site on Google every time it needed to make a prediction.
-
Cheers, Pete.
I totally understand the data dependency. One thing you could do, which would not require a long-term data dependency and would also help with the spam detection you're building, is to take a single snapshot of "ranking" and use it as a data set for pattern-matching spam sites. E.g. if you managed to pull, say, hundreds of thousands of ranking scores (say, traffic scores from SEMrush), you could match those against Moz's current scoring for each domain, bucket the sites into groups that have higher or lower ranking scores than DA would predict, and then try to reverse engineer the link patterns (or other signals Moz uses) that are common to each bucket.
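That one-off bucketing step could be sketched roughly like this. The DA-to-traffic expectation curve and the tolerance threshold are made up for illustration; a real model would be fit from the snapshot itself:

```javascript
// Sketch: compare a one-off snapshot of third-party traffic scores against
// DA, and split domains into over-/under-performers for later pattern
// analysis. The expectation curve and thresholds are illustrative only.
function bucketDomains(domains, tolerance = 0.5) {
  const buckets = { overvalued: [], undervalued: [], inline: [] };
  for (const d of domains) {
    // Naive expectation of traffic given DA (made-up curve); a real
    // version would be regressed from the snapshot data.
    const expected = Math.pow(10, d.da / 20);
    const ratio = d.traffic / expected;
    if (ratio < 1 - tolerance) buckets.overvalued.push(d.domain);
    else if (ratio > 1 + tolerance) buckets.undervalued.push(d.domain);
    else buckets.inline.push(d.domain);
  }
  return buckets;
}

const snapshot = [
  { domain: "somuch.com", da: 72, traffic: 900 },  // far below expected traffic
  { domain: "example.com", da: 40, traffic: 120 }, // roughly in line
];
console.log(bucketDomains(snapshot));
```

The "overvalued" bucket is then the training set for reverse engineering which link patterns those domains share.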
-
Thanks - happy to pass that along. We're actually in the middle of a long-term spam detection project to help notify people when a site seems to be suspicious or is likely to be penalized by Google. Eventually, this may find its way into DA/PA. We don't want to use ranking and Google's own numbers, as it creates a bit of a problematic data dependency for us (especially long-term).
Related Questions
-
What does "ffspl" mean in a url-metrics result set?
So I've got this server-side JScript object I've declared, which works nicely for us:

    var mozObj = new MozInteraction()
        .setMethod("GET")
        .setHost("http://lsapi.seomoz.com")
        .setPath("linkscape/url-metrics")
        .setCols(parseInt("11111111111111111111111111111111111111111111111111111", 2))
        .setLines(10)
        .setAccessId(accessId)
        .setSecret(secret)
        .setExpires((new Date()).addDays(1).valueOf())
        .generateSignature();

The setCols call sets the Cols value to the maximum possible: 9007199254740991. The weird thing is that the JSON response contains a whole pile of column names that I can't find descriptions for in the documentation at https://moz.com/help/links-api/making-calls/response-fields, e.g. ffspl1, ffspl2, ffspl3, ..., fuspl0, fuspl1, fuspl2, fuspl3, ..., pfspl0, pfspl1, pfspl2, pfspl3, ..., puspl0, puspl1, puspl2, puspl3, etc.
API | StevePoul
-
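For anyone puzzling over that Cols value in the question above: Cols is a bit field where each response field has a flag value, and you request fields by summing the flags you want. A quick sketch - the two authority flag values are taken from the legacy Links API docs, so double-check them against the current reference:

```javascript
// Cols is a bit field: each response field has a flag value, and you
// request fields by summing the flags. Flag values below are from the
// legacy Links API docs; verify before relying on them.
const COL_TITLE = 1;                       // "ut" - page title
const COL_CANONICAL_URL = 4;               // "uu" - canonical URL
const COL_PAGE_AUTHORITY = 34359738368;    // "upa" - 2^35
const COL_DOMAIN_AUTHORITY = 68719476736;  // "pda" - 2^36

// Small flags can be combined with bitwise OR:
console.log(COL_TITLE | COL_CANONICAL_URL); // 5

// But JavaScript's bitwise operators truncate to 32 bits, so the large
// flags must be added rather than OR'd:
console.log(COL_PAGE_AUTHORITY + COL_DOMAIN_AUTHORITY); // 103079215104

// "Every bit set" across 53 flags is 2^53 - 1, which is what the question's
// setCols call requests - including undocumented columns like ffspl*:
console.log(parseInt("1".repeat(53), 2)); // 9007199254740991
```

The ffspl/fuspl/pfspl/puspl columns are presumably internal, undocumented fields; building the mask only from documented flags avoids receiving them.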
I'm only seeing 3 results for DA/PA, is this a limitation?
I'm checking the DA/PA of the top 10 URLs on Google for a keyword, but I only get DA/PA results for 3 of the domains. I know the other domains in the list would have results. Is this a limitation of the API, or should I be able to get DA/PA for more than 3 domains at a time? Would it help if I checked them one at a time instead of as a larger set?
API | infernodia
-
Why does OSE show old data (Previous update results)?
The Moz API started showing the July 13 update results for my website. I checked it 2 days ago and saw all the newly established links and the updated DA/PA for the July 27 update. But for the last 2 days both the Moz API and the OSE main page have shown the July 13 update results. Is there maintenance going on, or a mismatch between the old and new databases?
API | cozmic
-
How do batched URL metrics work in terms of rows and rate limit?
I am using the free API plan to get URL metrics and batching my calls like this: https://github.com/seomoz/SEOmozAPISamples/blob/master/php/batching_urls_sample.php How does this work in terms of rows and limits? If I do a batch of 10 URLs, does it count as 1 row or as 10? Do I have to wait 10 seconds before calling the next batch?
API | MWS2
-
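Whatever the row accounting turns out to be, the batching pattern in the question above can be sketched like this. The 10-second default pause is an assumption mirroring the question, not a documented limit, and `callBatch` stands in for whatever function performs the actual API request:

```javascript
// Sketch of batching url-metrics requests with a pause between batches.
// callBatch is a placeholder for the real POST to the API; the 10-second
// default delay is an assumption, not a documented rate limit.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function fetchInBatches(urls, batchSize, callBatch, delayMs = 10000) {
  const results = [];
  for (let i = 0; i < urls.length; i += batchSize) {
    const batch = urls.slice(i, i + batchSize);
    results.push(...(await callBatch(batch))); // one request per batch
    if (i + batchSize < urls.length) await delay(delayMs); // respect rate limit
  }
  return results;
}
```

Used with a mock in place of the real API call, `fetchInBatches(["a", "b", "c"], 2, mock, 0)` issues two requests and returns the three results in order.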
Why did the April Index Raise DA?
All of our websites' DAs rose dramatically, including those of the competitors we track. Any idea why this may have happened across the board?
API | Blue_Compass
-
Is Url-Metrics Historical data available via API?
Hi, we're using the URL Metrics API. We were wondering if we could access historical data - for example, by passing a date with the query. This ability is not currently listed in the API docs. Thanks!
API | Haystak
-
How much attention should I pay to Moz's DA/PA?
Hola! I've been optimising a site since October, and our hard work has yielded a sizeable increase in organic traffic, revenue, quality relevant links, and Searchmetrics scoring since commencing the campaign. After yesterday's Moz update, the DA has dropped slightly and a number of pages' PAs have dropped significantly (i.e. from 27 to 17). So here are my questions:

1. My 'white hat' optimisation is clearly working. The site is enjoying a more than 100% year-on-year increase in organic traffic, and we're currently pulling in more organic visitors than ever before. Why is Moz's score not reflecting this?

2. Some of the pages that have seen sizeable PA drops have had their URLs changed since the last Moz update. For example, I've changed a URL from www.mysite.com/cases-covers to www.mysite.com/phone-cases to coincide with search volumes. I've added optimised content to this page too, but the PA has dipped from 27 to 17. A 301 redirect has been correctly added, and this is evident from a PA of 17 rather than zero, which is what a brand-new page would have.

3. Am I paying too much attention to Moz's scores? It's a bit disheartening to see a drop after a lot of hard work. However, I guess the only things that really count are increased search traffic and revenue, right?

Cheers, Lewis
API | PeaSoupDigital
-
Any idea why HubSpot (which uses the MOZ api) can find our backlinks but OSE can not?
Links are showing up in the HubSpot link tool within a couple of days, while OSE shows our site as having zero backlinks. Ahrefs is finding them too, but that makes more sense. Does the API update more frequently than OSE? Any thoughts? Thanks!
API | KLEANTreatmentCenter