Is the anchor text metric available in the free Moz API?
-
I'm trying to access the anchor text metric using a free Moz API account, which lasts for 30 days. Is there any way to get anchor text?
-
Hey Vijay,
Got it.
Thanks!
-
Hi Roger,
As Eli explained, Moz Pro is a different product from the Moz API, and the two should not be confused with each other.
I hope this helps.
Regards,
Vijay
-
I understand.
The initial confusion is because you are referring to our Moz Pro plan (as you mentioned a 30-day trial). We do not offer a 30-day free trial of our API, only of our Moz Pro plans.
All Moz community users have access to our free, limited API (one request every 10 seconds). Unfortunately, a Moz Pro subscription does not increase this limit.
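That one-request-every-10-seconds limit is easy to respect client-side. Here is a minimal throttle sketch; the 10-second interval comes from this thread, while the injectable clock/sleep arguments are just my addition for testability:

```python
import time


class Throttle:
    """Enforce a minimum interval between API calls (free tier: 1 request / 10 s)."""

    def __init__(self, interval=10.0, clock=time.monotonic, sleep=time.sleep):
        self.interval = interval
        self.clock = clock    # injectable for testing
        self.sleep = sleep    # injectable for testing
        self._last = None

    def wait(self):
        """Block until at least `interval` seconds have passed since the last call."""
        now = self.clock()
        if self._last is not None:
            remaining = self.interval - (now - self._last)
            if remaining > 0:
                self.sleep(remaining)
        self._last = self.clock()


# Usage sketch: call throttle.wait() before each API request, e.g.
#   throttle = Throttle()
#   throttle.wait()
#   resp = urllib.request.urlopen(signed_request_url)
```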
However, we do offer paid API plans ranging from $250 a month to $10,000 a month. All the plans come with the same call rate limit of 200 requests per second, but with varying 'rows per month'.
Check out the following link with more information: https://moz.com/products/api/pricing
Please refer to this link for all the metrics we provide: https://moz.com/help/guides/moz-api/mozscape/api-reference/url-metrics (They are the same, regardless of the plan you are on).
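For anyone scripting against the API, a minimal sketch of building a signed URL Metrics request follows. It assumes the Mozscape signed-authentication scheme (HMAC-SHA1 over "AccessID\nExpires"); the ACCESS_ID and SECRET_KEY values are placeholders, and the Cols bit flags for Page and Domain Authority should be double-checked against the reference linked above:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

# Placeholder credentials - substitute your own from your Moz account.
ACCESS_ID = "mozscape-xxxxxxxx"
SECRET_KEY = "your-secret-key"


def signed_url_metrics_url(target, cols, expires_in=300):
    """Build a signed Mozscape URL Metrics request URL (no request is sent here)."""
    expires = int(time.time()) + expires_in
    # Signature is base64(HMAC-SHA1(secret, "AccessID\nExpires")).
    to_sign = f"{ACCESS_ID}\n{expires}"
    signature = base64.b64encode(
        hmac.new(SECRET_KEY.encode(), to_sign.encode(), hashlib.sha1).digest()
    ).decode()
    params = urllib.parse.urlencode({
        "Cols": cols,
        "AccessID": ACCESS_ID,
        "Expires": expires,
        "Signature": signature,
    })
    quoted_target = urllib.parse.quote(target, safe="")
    return f"https://lsapi.seomoz.com/linkscape/url-metrics/{quoted_target}?{params}"


# Example: 103079215104 is assumed here to be the sum of the Page Authority and
# Domain Authority bit flags - verify the current values against the reference.
request_url = signed_url_metrics_url("moz.com", 103079215104)
```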
I hope this helps,
Eli
-
Hi Eli!
I was not asking about Moz Pro accounts; I'm more concerned about which metrics the Moz API makes available, whether on an entry-level account or on the free but limited API account. Does Moz provide the following metrics on the entry-level API?
1. Anchor text
2. Spam score
3. Total discovered and total lost links
4. Page Authority
5. Domain Authority
6. Total active and inactive links
7. Link type and link state
Thanks.
-
Hey there!
Thanks for reaching out to us!
I'm sorry, there may be a bit of confusion here. Our API tool is a separate product from Moz Pro.
We offer a permanently free, limited version of our API - https://moz.com/products/api/pricing - and we also offer a 30-day free trial of Moz Pro (two separate tools).
Check out the metrics you can get with our API here: https://moz.com/help/guides/moz-api/mozscape/api-reference/url-metrics (We do include 'Anchor Text Metrics' within our API).
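For requesting anchor text specifically, a hedged sketch against the anchor-text endpoint follows. The endpoint path, the Scope value, and the Cols flag are my assumptions to verify against the reference linked above; no request is actually sent:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse


def anchor_text_url(target, access_id, secret_key, scope="phrase_to_page", cols=4):
    """Build a signed request URL for the (assumed) Mozscape anchor-text endpoint.

    scope="phrase_to_page" and cols=4 (anchor text term) are assumptions -
    check them against the API reference before use.
    """
    expires = int(time.time()) + 300
    signature = base64.b64encode(
        hmac.new(secret_key.encode(), f"{access_id}\n{expires}".encode(),
                 hashlib.sha1).digest()
    ).decode()
    params = urllib.parse.urlencode({
        "Scope": scope,
        "Cols": cols,
        "AccessID": access_id,
        "Expires": expires,
        "Signature": signature,
    })
    return ("https://lsapi.seomoz.com/linkscape/anchor-text/"
            + urllib.parse.quote(target, safe="") + "?" + params)


# Fetching would then be something like (left commented, as it needs real credentials):
#   import json, urllib.request
#   records = json.load(urllib.request.urlopen(anchor_text_url("moz.com", ID, KEY)))
```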
Let me know if you have any other questions and I'll be more than happy to help! Feel free to reach out to help@moz.com if you would like us to look into your specific account.
Have a great day!
Eli
Related Questions
-
Frequency of Moz page authority updates?
I have some new pages on my site, and Moz gives them a very low PA ranking. I am wondering if these scores are updated monthly or quarterly? I'm not sure how frequently to check back for updated scoring.
-
Is everybody seeing DA/PA-drops after last MOZ-api update?
Hi all, I was wondering what happened with the last Moz update. The Moz health page shows no current errors, and I've checked a lot of websites, probably 50 to 60, from our customers and from our competitors, and everyone seems to have a decrease in their DAs/PAs. It's not that a couple went up and down as would normally happen; all seem to have dropped. Is anyone seeing the same in their campaigns? Greetings,
Niels
-
Can the API Filter Links with Certain Anchor Text?
I am trying to get all links that have certain strings in their anchor text. I am using the Python library: https://github.com/seomoz/SEOmozAPISamples/blob/master/python/lsapi.py

Looking at the documentation, it says I can get the normalized anchor text by using the bit flag 8 for the LinkCols value: https://moz.com/help/guides/moz-api/mozscape/api-reference/link-metrics

So I tried this:

links = l.links('example.com', scope='page_to_domain', sort='domain_authority', filters=['external'], sourceCols=lsapi.UMCols.url, linkCols=8)

But it doesn't return the expected 'lnt' response field or anything similar to the anchor text. How do I get the anchor text on the source URLs? I also tried 10 for the linkCols value, to get all the bit flags in the lf field as well as the anchor text. In both instances (and even with different variations of targetCols and sourceCols), these are all the fields that are returned: 'lrid', 'lsrc', 'luuu', 'uu', 'luupa', 'ltgt'.
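One workaround sketch, once anchor-text records are in hand from the API, is to filter them client-side. The 'apt' field name below is a hypothetical stand-in for whatever field the response actually uses for the normalized term; verify it against the API reference:

```python
def filter_by_anchor(records, needle):
    """Keep anchor-text records whose term contains `needle` (case-insensitive).

    Assumes each record is a dict whose normalized anchor text lives under a
    key such as 'apt' - a placeholder name, not confirmed from the docs.
    """
    needle = needle.lower()
    return [r for r in records if needle in r.get("apt", "").lower()]
```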
-
Keywords API
Does Moz offer any statistics/data about keywords through the API? I didn't see anything in the documentation... Thanks! Justin
-
Lost many links and keyword ranks since moz index update
Hi all, I came back to work today from a week off to find my site has gone from 681 external inbound links to 202. With this, my Domain Authority, MozTrust, and MozRank have all also taken a slip. Compounding this, I am seeing a slip in most of my keyword rankings.

If I try to use Open Site Explorer to explore my links and see what's going on, I get the message "It looks like we haven't discovered link data for this site or URL." If I check the just-discovered links like it suggests, I get "It looks like there's no Just-Discovered Links data for this URL yet." I know these features worked before the index update, as I used them.

Is this all attributable to the Moz index issues that have been noted, or could something have happened to my site? Since I started 2 months ago, I have made many changes, including:

- Updated the sitemap, which was 4 years out of date and included 400 broken URLs
- Removed blank pages and other useless web pages on the site that contained no content (from the previous administrator)
- Edited a few pages' content from keyword-spammy stuff to nicely written and relevant content
- Fixed URL rewrites that caused loops and inaccessible product pages

All these changes should be for the better, but the latest readings have me a little worried. Thanks.
-
Pulling large amounts of data from moz api
Hi, I'm looking to pull large amounts of data from the Moz and SEMrush APIs. I have been using the SeoTools add-on for Excel to extract data, but Excel is slow, sometimes crashes, and is not very reliable. Can anyone recommend any other tools I can use to pull huge amounts of data? Any suggestions would be highly appreciated! Cheers, RM
-
Integrate Piwik data with Moz instead of Google Analytics?
Hello everyone,

The company I work for has had Google Analytics on all of our websites, dating back at least 4 years. Because of our email infrastructure, we accessed Analytics through a temporary account. That account worked fine for 4 years, until some point in the last two weeks when Google deleted it, along with the Analytics connected to it. I've had no response from Google regarding this, but that's by the by.

Anyway, I've made the decision to bring the analytics in house and hold the data ourselves. I've chosen Piwik for this, and it's been running fine for the last few days. However, I know Moz integrates with Google Analytics, and at the moment that integration is useless to me.

Is there a way, or are there any plans, to integrate Piwik with Moz? If not, what benefits will I lose if I discontinue using Google Analytics with Moz?
-
Suggestion - How to improve OSE metrics for DA & PA
I am sure everyone at Moz is aware that although the Moz link metrics (primarily I am talking about DA and PA) are good, there is a lot of room for improvement, and there are a lot of areas where the metric values given to some types of site are well out of whack with what their "real" values should be. Some examples:

www.somuch.com (Link Directory) - DA 72
www.articlesbase.com (Article Directory) - DA 89
www.ezinearticles.com (Article Directory) - DA 91

I'm sure everyone would agree that links from these domains are not as powerful (if of any value at all) as their DA would suggest, and therefore, by definition of how Moz metrics work, the sites that have links from such domains are also inflated - thus they throw the whole link graph out of whack. I have 2 suggestions which could be used singly or in conjunction (and obviously with other factors that Moz uses to calculate DA and PA) to help move these values closer to what they should realistically be.

1/. Incorporate rank values.
This effectively uses rank values to reverse engineer what Google (or other engines) places as a "value" on a website. This could be achieved (if Moz were not to build the data-gathering system itself) by integrating with a company that already provides this data, e.g. Searchmetrics, SEMrush, etc. As an example, you would take a domain and pull in some rank values, e.g. http://www.semrush.com/info/somuch.com?db=us - where you could use traffic, traffic price, and traffic history as part of the overall Moz scoring algorithm. As you can see from my example, according to SEMrush the amount of traffic and the traffic price are extremely low for what you would expect of a website with a DA of 72. Likewise you will find this for the other two sites, and similarly for pretty much any other site you test. This is essentially because you are tapping into Google's own ranking factors, and thereby getting more in line with the real values (according to Google) with respect to the quality of a website. Therefore, if you were to incorporate these values, I believe you could improve the Moz metrics.

2/. Social sharing value.
Another strong indicator of quality is the amount of social sharing of a document or a website as a whole, and again you will find, as with my examples, that pages on these sites have low social metrics in comparison to what you would normally associate with sites of these DA values. Obviously, to do this you would need to pull social metrics for all the pages in your link DB. Or, if this were too tech-intensive to achieve, again work with a partner such as Searchmetrics, which provides "Total Social Interactions" on a domain-level basis. Divide this value by the number of Moz-crawled pages and you would have a crude value of the overall average social shareability of a web page on a given site.

Obviously both of the above have their flaws if you looked at them in complete isolation; however, in combination they could provide a robust metric to use in any algorithm, and in combination with the current Moz values used in the algorithm, I believe you could make big strides toward improving the overall Moz metrics.
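The per-page social score suggested in point 2 is simple enough to sketch. The function name and the guard for zero crawled pages are my additions; the division itself is exactly what the post describes:

```python
def avg_social_shareability(total_interactions, crawled_pages):
    """Crude per-page social score: total domain-level social interactions
    divided by the number of pages Moz has crawled for that domain."""
    if crawled_pages == 0:
        return 0.0  # avoid division by zero for domains with no crawled pages
    return total_interactions / crawled_pages
```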