Mozscape API Batching URLs Limit
-
Guys, there's an example of batching URLs using PHP:
What is the maximum number of URLs I can add to that batch?
-
Yes, it's weird. I currently have the Pro plan. I'm running queries with 200 URLs at a time with no issues ;). The only limitation is time: I have to make a query and then wait 10 seconds before performing another, otherwise I get:
"This request exceeds the limit allowed by your current plan."
Thank you Zach, have a good day!
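The 10-second pacing described above can be enforced client-side. Below is a minimal sketch of a throttle that waits out the remainder of the interval between calls; the 10-second value is taken from the post above, and the class name and structure are my own illustration, not part of any Moz library.

```python
import time

MIN_INTERVAL = 10.0  # seconds between requests, per the pacing described above


class Throttle:
    """Enforce a minimum delay between successive API calls."""

    def __init__(self, min_interval):
        self.min_interval = min_interval
        self.last_call = None

    def wait(self):
        # Sleep only for whatever part of the interval hasn't already elapsed.
        if self.last_call is not None:
            elapsed = time.monotonic() - self.last_call
            if elapsed < self.min_interval:
                time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()
```

Call `throttle.wait()` immediately before each API request; back-to-back calls then end up spaced at least `min_interval` seconds apart without you having to hard-code `sleep(10)` everywhere.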
-
As far as I know the limit is 10. The article on the API wiki says the same. I do know, as a premium subscriber, that the number of batch requests per second is 200, however.
Quote from the API Wiki:
"You can submit up to 10 URLs for every batch request. Larger batch requests will return an HTTP 400 response."
I'd just be careful, because if you're not getting a 400 response, they may end up throttling you.
Hope this helps
Zach -
Thanks Zachary. I ran a test adding a lot of URLs. SEOmoz says the limit is 200 URLs or fewer at a time. So which should I use: the 10-URL limit or the 200?
Currently I'm able to get data for 200 URLs at a time, which is great for me!
-
SEOmoz recommends batch requests of 10 URLs. According to their API wiki (http://apiwiki.seomoz.org/url-metrics), any batch request larger than that will return a 400 error from the server.
Hope that helps!
Zach
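For completeness, here is a sketch of how a signed batch request to the url-metrics endpoint could be composed, based on the signed-authentication scheme the Mozscape docs described at the time (AccessID, an Expires timestamp, and an HMAC-SHA1 signature of the two, with the batch of URLs sent as a JSON array in the POST body). The endpoint path and signing details are assumptions from those docs, not verified against the current API; the function only builds the request and performs no network I/O.

```python
import base64
import hashlib
import hmac
import json
import time
import urllib.parse

# Assumed endpoint path from the Mozscape docs of the time.
API_ENDPOINT = "http://lsapi.seomoz.com/linkscape/url-metrics/"


def build_batch_request(urls, access_id, secret_key, expires_in=300):
    """Build (request_url, post_body) for a signed batch request; no network I/O."""
    if len(urls) > 10:
        raise ValueError("batch requests are limited to 10 URLs")
    expires = int(time.time()) + expires_in
    # Assumed signing scheme: HMAC-SHA1 over "AccessID\nExpires", base64-encoded.
    to_sign = "%s\n%d" % (access_id, expires)
    signature = base64.b64encode(
        hmac.new(secret_key.encode(), to_sign.encode(), hashlib.sha1).digest()
    ).decode()
    query = urllib.parse.urlencode(
        {"AccessID": access_id, "Expires": expires, "Signature": signature}
    )
    # The batch form sends a JSON array of URLs in the POST body.
    body = json.dumps(urls)
    return API_ENDPOINT + "?" + query, body
```

Guarding with the `ValueError` keeps you from ever sending an oversized batch and getting the HTTP 400 described above.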
Related Questions
-
New URL, new physical address, new name. 30-point drop in Domain Authority. Yikes.
I have a client who is asking for SEO help after renaming their business, getting a new URL, and somehow having an address change (without moving to a new location... weird... I know). This has set them back big time in terms of their domain authority (they went from a 46 to a 15 in DA). The web developers they work with put a 302 redirect in place from their old URL (home page), which had 10,477 links from 52 root domains, to their new URL's home page. Open Site Explorer shows that they now have 5 links! We can improve some of the local search setbacks from the name and address change with a citation audit and clean-up, but the domain name change is a killer. So here's my question or questions, really: Do we need to manually rebuild links with partner websites? I know there is debate around the actual link juice passed along from a 302 vs a 301 redirect (despite what has been publicly stated by Google). Or is this just a waiting game while old links get recrawled?
Moz Pro | TheKatzMeow1
How do you create tracking URLs in WordPress without creating duplicate pages?
I use WordPress as my CMS, but I want to track click activity to my RFQ page from different products and services on my site. The easiest way to do this is by adding a string to the end of a URL (a la http://www.netrepid.com/request-for-quote/?=colocation). The downside to this, of course, is that when Moz does its crawl diagnostic every week, I get notified that I have multiple pages with the same page title and duplicate content. I'm not a programming expert, but I'm pretty handy with WordPress and know a thing or two about 'href-fing' (yeah, that's a thing). Can someone who tracks click activity in WP with URL variables please enlighten me on how to do this without creating dup pages? Appreciate your expertise. Thanks!
Moz Pro | Netrepid0
Canonical URLs and Duplicate Page Content
My website (a doctor directory) is getting a lot of duplicate page content & duplicate page title warnings from SEOmoz. The pages that are getting the warnings are doctor profiles, which can be accessed at three different URLs. The problem is that this should be handled by the canonical tag on the pages. So in the example below, all three open the same page: https://www.arzttermine.de/arzt/dr-sara-danesh/ https://www.arzttermine.de/arzt/dr-sara-danesh/gkv https://www.arzttermine.de/arzt/dr-sara-danesh/pkv Here's our canonical tag (on line 34): <link rel="canonical" href="http://www.arzttermine.de/arzt/dr-sara-danesh" /> So why is SEOmoz crawling the page? We are getting hundreds of errors from this, and yet Google doesn't have any of the duplicate URLs indexed...
Moz Pro | thomashillard0
Crawl Errors from URL Parameter
Hello, I am having this issue within SEOmoz's Crawl Diagnostics report. There are a lot of crawl errors happening with pages associated with /login. I will see site.com/login?r=http://.... and have several duplicate content issues associated with those URLs. Seeing this, I checked WMT to see if the Google crawler was showing this error as well. It wasn't. So what I ended up doing was going to the robots.txt and disallowing rogerbot. It looks like this: User-agent: rogerbot Disallow:/login However, SEOmoz has crawled again and it's still picking up on those URLs. Any ideas on how to fix this? Thanks!
Moz Pro | WrightIMC0
Finding the source of duplicate content URLs
We have a website that displays a number of products. The products have variations (sizes) and unfortunately every size has its own URL (for now, anyway). Needless to say, this causes duplicate content issues. (And of course, we are looking to change the URLs for our site as soon as possible.) However, even though these duplicate URLs exist, you should not be able to land on them by navigating through the site. In theory, the site should always display the link to the smallest size. It seems that there is a flaw in our system somewhere, as these links are now found in our campaign here on SEOmoz. My question: is there any way to find the crawl path that led to the URLs that shouldn't have been found, so we can locate the problem?
Moz Pro | DocdataCommerce0
SEOmoz API – "Limited access is included ... PRO membership"?
Can someone expand on what you actually get with your Pro membership on the Site Intelligence API? API page. Thanks
Moz Pro | josey0
How do I delete a URL from a keyword campaign
I have a couple of URLs that are associated with the keywords in my campaign. They are no longer valid, so how do I remove them?
Moz Pro | PerriCline0
Why is Open Site Explorer showing: No Data Available for this URL
Hi there, I'm having a few problems getting my site www.incarmotorfactors.co.uk up and running on SEOmoz and I'm not sure what I'm doing wrong. Firstly, SEOmoz shows 2 links for my site, which is wrong; Google shows a lot more. However, the most noticeable problem so far is Open Site Explorer. When I type in the web address it shows "No Data Available for this URL". The site is more than a year old and has a few links. Can anybody tell me what the problem may be?
Moz Pro | Ev840