Mozscape API: Batch URL Limit
-
Guys, there's an example of batching URLs using PHP. What is the maximum number of URLs I can add to that batch?
-
Yes, it's weird. I currently have the Pro plan, and I'm querying 200 URLs at a time with no issues ;). The only limitation is time: I have to make a query and then wait 10 seconds before making another, or I get the error:
"This request exceeds the limit allowed by your current plan."
Thank you Zach, have a good day!
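That wait-10-seconds-between-calls pattern is easy to enforce in a script. A minimal sketch (the `fetch_batch` callable and the 10-second interval are taken from this thread, not from official documentation, so treat both as assumptions):

```python
import time

def query_with_throttle(url_batches, fetch_batch, delay_seconds=10.0):
    """Call fetch_batch() on each batch of URLs, pausing between calls
    so consecutive requests respect the per-plan rate limit."""
    results = []
    for i, batch in enumerate(url_batches):
        if i > 0:
            time.sleep(delay_seconds)  # wait before issuing the next request
        results.append(fetch_batch(batch))
    return results
```

Passing a smaller `delay_seconds` is an easy way to test the loop without actually waiting.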
-
As far as I know, the limit is 10; the article on the API Wiki says the same. I do know, however, that as a premium subscriber the number of batch requests per second is 200.
Quote from the API Wiki:
"You can submit up to 10 URLs for every batch request. Larger batch requests will return an HTTP 400 response."
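To stay under that cap, a longer URL list can be split into batches of at most 10 before each request is sent. A quick sketch:

```python
def chunk_urls(urls, batch_size=10):
    """Split a list of URLs into batches no larger than batch_size
    (10 per the API wiki's stated limit for batch requests)."""
    return [urls[i:i + batch_size] for i in range(0, len(urls), batch_size)]
```

Each resulting batch can then be submitted as its own request.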
I'd just be careful, because if you're not getting a 400 response, they may end up throttling you.
Hope this helps
Zach -
Thanks, Zachary. I ran a test adding a lot of URLs; SEOmoz says the limit is 200 or fewer URLs at a time. So which should I use: the 10-URL limit or the 200?
Currently I'm able to get data for 200 URLs at a time, which is great for me!
-
SEOmoz recommends batch requests of 10 URLs, according to their API wiki (http://apiwiki.seomoz.org/url-metrics), which states that any batch request larger than this will return a 400 error from the server.
Hope that helps!
Zach
Related Questions
-
What is the best way to treat URLs ending in /?s=
Hi community, I'm going through the list of crawl errors visible in my Moz dashboard, and there are a few URLs ending in /?s=. How should I treat these URLs? Redirects? Thanks for any help.
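For WordPress-style search URLs like these, one common approach is to block crawling of the `?s=` parameter in robots.txt, assuming those search-result pages have no value of their own (a sketch of one option, not a universal fix):

```
User-agent: *
Disallow: /*?s=
```

Note that the `*` wildcard in `Disallow` is honored by Google and Bing but is not part of the original robots.txt standard.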
Moz Pro | Easigrass
-
404 error for unknown URL that Moz is finding in our blog
I'm receiving 404 errors on my site crawl for messinastaffing.com. They seem to be generated only from our blog posts, which sit on HubSpot. I've searched high and low and can't identify why our site URL is being appended at the end; I've tried every link in our blog and cannot reproduce the error the crawl is finding. For instance:
Referrer: http://blog.messinastaffing.com/take-charge-career-story-compelling-cover-letter/
404 error: http://blog.messinastaffing.com/take-charge-career-story-compelling-cover-letter/www.messinastaffing.com
I agree that the 404 URL doesn't exist, but I can't identify where Moz is finding it. I have approximately 75 of these errors, one for every blog post on our site. Beth Morley, Vice President, Operations, Messina Group Staffing Solutions
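A likely cause, though this is an inference from the URL shape rather than a confirmed diagnosis: somewhere in the blog template a link is written without a scheme, e.g. `href="www.messinastaffing.com"`. A schemeless href is a *relative* reference, so browsers and crawlers resolve it against the current page. Standard URL resolution reproduces exactly the broken address from the question:

```python
from urllib.parse import urljoin

page = "http://blog.messinastaffing.com/take-charge-career-story-compelling-cover-letter/"
# "www.messinastaffing.com" has no scheme, so it is treated as a relative
# path and appended to the page's path instead of pointing at another site.
broken = urljoin(page, "www.messinastaffing.com")
print(broken)
```

If that is the cause, the fix is to write the link with a scheme in the template, e.g. `http://www.messinastaffing.com`, or as a protocol-relative `//www.messinastaffing.com`.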
Moz Pro | MessinaGroup
(847) 692-0613 www.messinastaffing.com
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder? I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites: I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication.

I am in the process of re-setting up my campaigns to get better data than I have been getting. I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set whether www is included for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what the options are for getting good data in light of the 50,000-page crawl limit.

Here is an example of what I mean. To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence. www.epa.gov has 560,000 pages; if I put www.epa.gov into a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get? www.epa.gov/air has only 1,450 pages, so if I choose this for a campaign, the crawl will cover that sub-folder completely and I get a complete picture of this air-focused sub-folder. But (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true: I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air-quality sites, and maybe those competitors count towards the 50,000-page crawl limit and would get me up to it? How do the competitors you choose figure into the crawl limit?)

Any opinions on what I should do in general in this kind of situation? The small sub-folder vs. the full humongous site, or is there some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad
-
SEOMoz API not working for Scrapebox
I want to import SEOMoz data to list of URLs I have using scrapbox. I added in my credentials according to the API but am getting error 401 as the status of all my links. Any idea why and what I should be doing?
Moz Pro | theLotter
-
Why does it keep displaying br tags and claiming 404 errors on like 4 of my URLs for all my WordPress sites?
Is anyone else having the same issue? These errors don't actually exist, and I think it has something to do with WordPress. How can I fix this?
Moz Pro | MillerPR
-
Looking For URL Anchor Text Metrics Definitions
Running some keyword difficulty reports that are showing some interesting data around URL anchor text metrics. But to fully understand them, I need some definitions, which I cannot find anywhere. So can someone point me to definitions of these terms?

- Exact Anchor Text Links
- % Links w/ Exact Anchor Text
- Linking Root Domains w/ Exact Anchor Text
- % Linking Root Domains w/ Exact Anchor Text
- Partial Anchor Text Links
- % Links w/ Partial Anchor Text
- Partial Anchor Text Root Doms.
- % Linking Root Domains w/ Partial Anchor Text

Also, if, say, Exact Anchor Text Links is bolded purple, that means that URL has more exact anchor text links than any other URL in the report. Is that correct? Thanks, David
Moz Pro | BraveheartDesign
-
Help with SEOmoz API
Hi guys, I'm trying to make API requests from my web server via PHP. I'd like to retrieve data from the SEOmoz URL Metrics API. Unfortunately, I always get the error response "unauthorized", even when I copy and paste the Sample Valid API Signature generated by your system into the browser. Is signed authentication no longer supported? I even tried the sample PHP code SignedAuth.php, but it has the same problem. If signed authentication is no longer available, do you have a code example for basic HTTP authorization? Thanks, Brandon
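For reference, the signed-authentication scheme the Mozscape docs described at the time builds the `Signature` parameter from the Access ID and a Unix expiration timestamp, HMAC-SHA1-signed with the secret key, then base64- and URL-encoded. A sketch in Python rather than PHP, with placeholder credentials; verify the exact string-to-sign against the API wiki before relying on it:

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def mozscape_signature(access_id, secret_key, expires):
    """Signature = urlencode(base64(HMAC-SHA1(access_id + "\n" + expires, secret_key)))."""
    string_to_sign = "%s\n%d" % (access_id, expires)
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(), hashlib.sha1).digest()
    return quote(base64.b64encode(digest).decode(), safe="")

# Placeholder credentials for illustration only; substitute your real
# Access ID, secret key, and a future Unix timestamp.
sig = mozscape_signature("member-abc123", "my-secret-key", 1700000000)
```

If signed auth keeps failing, HTTP Basic authorization, with the Access ID as the username and the secret key as the password, was the other documented option.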
Moz Pro | thegreatpursuit
-
Where do I find "URLs Receiving Entrances Via Search" and "Non-Paid Keywords Sending Search Visits" in Google Analytics?
These are two metrics highlighted in the "Organic Traffic Data" report in the PRO campaigns. Since this report is composed of data pulled from Google Analytics, I bet there's a way to find the same information in GA. So... anyone know how to do that in Google Analytics? I want this information for some long-tail productivity/potential research.
Moz Pro | jcolman