Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How do you guys/gals define a 'row'?
-
I have a question about calls to the API and how these are measured. I noticed that the URL Metrics calls allow a batch of multiple URLs.
We're in a position where we need link data for multiple websites; can we request a single row of data with link information for multiple URLs, or do we need to request a unique row for each URL?
-
Hi Stephen,
If you imported the information you received from a request to our API into a spreadsheet, you would have rows of information, and the number of rows depends on the request you make. If you ask for 200 links from our Top Back Links API, you'll get 200 rows of information about backlinks. If you submit a single URL to our Page Metrics API, you'll get one row back, containing the page metrics for that URL. If you do a batch request to the Page Metrics API and submit 5,000 URLs, you'll receive 5,000 rows, one for each URL you submitted. In short, the number of rows you get back depends on which API you're using and how much information you ask for in your request, and you only pay for the rows you actually receive.
Thanks,
Joel.
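To make the row accounting concrete, here is a minimal sketch of a batch URL Metrics request in Python. The endpoint path, column bitmask, and signed-authentication parameters are recalled from the legacy Mozscape API and should be treated as assumptions; check the current Moz API documentation for the exact values.

```python
# A minimal sketch of a batch request to the URL Metrics endpoint,
# illustrating how "rows" map to the URLs you submit. Endpoint, bitmask, and
# auth parameters below are assumptions -- verify them against the current
# Moz API documentation.
import requests

ACCESS_ID = "member-xxxxxxxx"        # placeholder credentials
EXPIRES = 1734000000                 # Unix timestamp the signature expires at
SIGNATURE = "base64-hmac-signature"  # signed with your secret key per the API docs

urls = [
    "https://example.com/",
    "https://example.org/pricing",
    "https://example.net/blog/",
]

# A batch request POSTs a JSON array of URLs; each URL comes back as one row.
response = requests.post(
    "https://lsapi.seomoz.com/linkscape/url-metrics/",  # assumed endpoint
    params={
        "Cols": 103079215108,  # bitmask selecting metrics (illustrative value)
        "AccessID": ACCESS_ID,
        "Expires": EXPIRES,
        "Signature": SIGNATURE,
    },
    json=urls,
    timeout=30,
)
rows = response.json()

# One row of metrics per submitted URL -- and the rows are what you pay for.
assert len(rows) == len(urls)
for url, row in zip(urls, rows):
    print(url, row)
```

The point to notice is the final check: a batch of three URLs comes back as three rows, and those three rows are what the request is billed for.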
Related Questions
-
Should I set blog category/tag pages as "noindex"? If so, how do I prevent "meta noindex" Moz crawl errors for those pages?
From what I can tell, SEO experts recommend setting blog category and tag pages (e.g. "http://site.com/blog/tag/some-product") as "noindex, follow" in order to keep the page quality of indexable pages high. However, I just received a slew of critical crawl warnings from Moz for having these pages set to "noindex." Should the pages be indexed? If not, why am I receiving critical crawl warnings from Moz, and how do I prevent this?
Moz Pro | NichGunn
-
GWMT / Search Analytics vs Open Site Explorer
Just had the experience of using OSE data to show what we call "linkrot" to a client -- only to find that GWMT / Search Analytics shows no such thing. Fortunately the client is an old friend and no face was lost, but it was dicey there for a bit, as I have come to rely on and reference OSE again and again and again. OSE showed Domain Authority dropping by about a third in the last 12 months, presumably due to old links getting broken, linking sites changing their architecture, etc. And of course, ranking is tanking, as you would expect. But Google shows many more (and much more spammy-looking!) backlinks. Has anyone had any experience benchmarking the two data sets of backlinks against each other? Dr Pete?
Does one update more frequently than the other? Do you trust one more than the other? If so, why? Thanks!
Moz Pro | seo_plus
-
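One way to benchmark the two data sets the question above mentions, sketched below: export the linking root domains from each tool and compare the sets. The file names and column headers here are assumptions about how the exports happen to be laid out, not a fixed format of either tool.

```python
# Compare linking root domains exported from Open Site Explorer and from
# Google Search Console. File names and column headers are placeholders --
# adjust them to match your actual exports.
import csv

def load_domains(path, column):
    with open(path, newline="") as f:
        return {
            row[column].strip().lower()
            for row in csv.DictReader(f)
            if row.get(column)
        }

ose = load_domains("ose_linking_domains.csv", "Root Domain")       # assumed header
gwmt = load_domains("search_console_links.csv", "Linking domain")  # assumed header

print(f"Only in Open Site Explorer: {len(ose - gwmt)}")
print(f"Only in Search Console:     {len(gwmt - ose)}")
print(f"In both:                    {len(ose & gwmt)}")
```

Spot-checking a few of the differences by hand usually shows whether one index is holding on to dead links or the other is counting domains the first filters out.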
Pages with Temporary Redirects on pages that don't exist!
Hi there, another obvious question to some, I hope. I ran my first report using the Moz crawler and I have a bunch of pages with temporary redirects showing up as a medium-level issue. Trouble is, the pages don't exist, so they are being redirected to my custom 404 page. So for example I have a URL in the report being called up from lord only knows where: www.domain.com/pdf/home.aspx. This doesn't exist; I have only one home.aspx page and it's in the root directory. It is giving a temp redirect to my 404 page, as I would expect, but that then leads to a Moz error as outlined. So basically you could make up any URL and it would give this error, and I am trying to work out how to deal with it before Google starts to notice, or before a competitor starts throwing all kinds of these URLs at my site to generate the errors. Any steering on this would be much appreciated!
Moz Pro | Raptor-crew
-
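A quick way to confirm what the crawler is recording for the made-up URLs in the question above is to request one and print the redirect chain: a 302 hop to an error page that answers 200 is exactly what gets flagged as a temporary redirect, whereas answering 404 (or 410) directly for unknown paths would not. The URL below is the placeholder from the question, and the fix itself depends on the server configuration.

```python
# Print the status chain a crawler would see for a made-up URL. The URL is
# the placeholder from the question above; swap in a real (nonexistent) path
# on your own site.
import requests

def show_status_chain(url):
    response = requests.get(url, allow_redirects=True, timeout=10)
    # response.history holds each redirect hop in order.
    for hop in response.history:
        print(hop.status_code, hop.url)
    print(response.status_code, response.url)

show_status_chain("https://www.domain.com/pdf/home.aspx")
# A 302 -> 200 chain here is what triggers the "temporary redirect" warning;
# a plain 404 on the first response would not.
```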
Text analysis tool: WDF*IDF - Within Document Frequency x Inverse Document Frequency / tools?
Checking keyword density is just too primitive... what is your recommendation on the subject of WDF*IDF? The German SEO tool onpage.org offers an interesting feature to analyse your text, but there are differences between languages and factors like proximity, synonyms, etc. What are your experiences? Which tools? Does Moz develop a tool for this? It would be a nice feature for the On-Page Grader! Best regards, Holger
Moz Pro | inlinear
-
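For readers unfamiliar with the weighting the question above refers to, here is a minimal sketch under one common formulation: within-document frequency (WDF) dampens raw term counts with a logarithm and normalises by document length, and the inverse document frequency (IDF) factor discounts terms that appear in many documents. Exact formulas differ between tools, and commercial implementations add further factors such as term proximity, so treat this as an illustration rather than a reference implementation of any particular product.

```python
# Toy WDF*IDF calculation over a tiny corpus, under one common formulation.
import math
from collections import Counter

def wdf(term, tokens):
    # Within-document frequency: log2(tf + 1) / log2(doc_length + 1)
    tf = Counter(tokens)[term]
    return math.log2(tf + 1) / math.log2(len(tokens) + 1)

def idf(term, tokenised_docs):
    # Inverse document frequency: rarer terms across the corpus score higher.
    containing = sum(1 for tokens in tokenised_docs if term in tokens)
    return math.log((1 + len(tokenised_docs)) / (1 + containing))

docs = [
    "keyword density is a primitive text metric".split(),
    "wdf idf weights terms by frequency and rarity".split(),
    "text analysis tools compare term weights across documents".split(),
]

term = "text"
for i, tokens in enumerate(docs):
    print(f"doc {i}: WDF*IDF({term!r}) = {wdf(term, tokens) * idf(term, docs):.3f}")
```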
I am looking for SEO tips specifically for magazine sites
I have a client who has a website that is based on a magazine. They make their money through advertising. I am primarily an inbound marketer, and I would be very grateful if anyone out there has any tips for a site that has been around for quite a while (over 10 years). We are transforming the site from HTML into WordPress, then hosting it with a fast managed WordPress host using a CDN. I feel the lack of links is an obvious place to start; however, if there's anything specific to magazine-based websites I would be more than grateful to hear your opinions. Thank you all in advance. Sincerely, Thomas von Zickell
Moz Pro | BlueprintMarketing
-
Batch lookup domain authority on a list of URLs?
I found this site that describes how to use Excel to batch look up URLs using the SEOmoz API. The only problem is that the SEOmoz API times out and returns 1 if I try dragging the formula down the cells, which leaves me copying, waiting 5 seconds, and copying again. This is basically as slow as manually looking up each URL. Does anyone know a workaround?
Moz Pro | SirSud
-
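A common workaround for the problem described above is to stop making one API call per spreadsheet cell and instead script the lookups: send the URLs in batches and pause between requests to stay under the rate limit. The fetch_url_metrics() wrapper below is hypothetical (it stands in for the batch URL Metrics call sketched earlier on this page), and the batch size and ten-second pause are assumptions to adjust to the quota documented for your plan.

```python
# Batch the lookups and pause between requests instead of calling the API
# once per spreadsheet cell.
import time

def fetch_url_metrics(batch):
    """Hypothetical wrapper: POST `batch` (a list of URLs) to the URL Metrics
    endpoint and return one row of metrics per URL."""
    raise NotImplementedError("connect this to your API client")

def batch_metrics(urls, batch_size=10, pause_seconds=10):
    rows = []
    for start in range(0, len(urls), batch_size):
        rows.extend(fetch_url_metrics(urls[start:start + batch_size]))
        time.sleep(pause_seconds)  # stay under the per-request rate limit
    return rows

if __name__ == "__main__":
    with open("urls.txt") as f:  # one URL per line
        urls = [line.strip() for line in f if line.strip()]
    for url, row in zip(urls, batch_metrics(urls)):
        print(url, row)
```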
Fetch as Googlebot for sites you don't own?
I've used the "Fetch as Googlebot" tool in Google Webmaster Tools to submit links from my site, but I was wondering if there is any type of tool or submission process like this for submitting links from other sites that you do not own. The reason I ask is that I worked for several months to get a website to accept my link as part of their dealer locator tool. The link to my site was published a few months ago; however, I don't think Google has found it, and the reason could be that you have to type in your zip code to get the link to appear. This is the website that I am referencing: http://www.ranchhand.com/dealers.php?zip=78070&radius=20 (my website is www.rangeroffroad.com). Is there any way for Google to index the link? Any ideas?
Moz Pro | texmeix
-
SEOmoz Spider/Bot Details
Hi all, our website identifies a list of search engine spiders so that it does not show them session IDs when they come to crawl, preventing the search engines from thinking there is duplicate content all over the place. The SEOmoz crawl has brought up over 20k crawl errors on the dashboard due to session IDs. Could someone please give the details for the SEOmoz bot so that we can add it to the list on the website, so that when it does come to crawl it won't be shown session IDs and generate all these crawl errors? Thanks
Moz Pro | blagger
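For context on the bot details being asked for above: Moz's campaign crawler has identified itself with a user-agent containing "rogerbot", and its link-index crawler with "dotbot"; confirm the current tokens in Moz's documentation before relying on them. The sketch below illustrates the kind of user-agent check the question describes, written in Python purely for illustration rather than matching the poster's actual stack.

```python
# Skip appending a session ID to URLs when the visitor is a known crawler,
# so every crawl session does not look like a separate (duplicate) page.
KNOWN_BOT_TOKENS = (
    "googlebot",
    "bingbot",
    "rogerbot",  # Moz campaign crawler (verify the token in Moz's docs)
    "dotbot",    # Moz link-index crawler
)

def is_known_bot(user_agent: str) -> bool:
    ua = (user_agent or "").lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

def url_for_visitor(base_url: str, session_id: str, user_agent: str) -> str:
    if is_known_bot(user_agent):
        return base_url                          # clean URL for crawlers
    return f"{base_url}?sessionid={session_id}"  # session parameter for humans

# Illustrative user-agent string, not the crawler's exact value.
print(url_for_visitor("https://example.com/page", "abc123",
                      "Mozilla/5.0 (compatible; rogerbot)"))
```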