Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Is there a tool to upload multiple URLs and gather statistics and page rank?
-
I was wondering if there is a tool out there where you can compile a list of URL resources, upload them as a CSV, and run a report that gathers statistics on each individual page.
Does anyone know of a tool that can do this or do we need to create one?
-
Thanks, but unfortunately all the URLs are from different domains.
-
Are the URLs all on the same domain? If so, some tools let you enter the root domain and then report on any ranking listings for your keywords on that domain. That makes life a lot easier.
-
No, sorry... we subscribed to a gold plan and are able to upload 300 URLs. That's only seven copy-paste actions.
-
That's great, but I was hoping for something where I can upload 2,000 URLs rather than 20. Do you know of anything with that capacity?
-
You can use majesticseo.com if you want bulk backlink information. The tool also gives you what they call ACRank (something like PageRank) and the Alexa ranking.
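For anyone who ends up scripting this themselves, here is a minimal sketch of the bulk-lookup pattern being asked about: read a CSV of URLs, query a metrics API for each one, and write the results back out as a report. The endpoint, key, and response fields below are placeholders, not a real Majestic or Moz API; substitute your provider's documented calls.

import csv
import requests

API_ENDPOINT = "https://api.example.com/url-metrics"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"

def fetch_metrics(url):
    # Query the (hypothetical) metrics service for a single URL.
    response = requests.get(
        API_ENDPOINT,
        params={"target": url, "key": API_KEY},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

with open("urls.csv", newline="") as infile, \
        open("report.csv", "w", newline="") as outfile:
    writer = csv.writer(outfile)
    writer.writerow(["url", "backlinks", "rank"])
    for row in csv.reader(infile):
        url = row[0].strip()
        data = fetch_metrics(url)
        # Field names are assumptions; adjust them to the provider's schema.
        writer.writerow([url, data.get("backlinks"), data.get("rank")])

This scales to 2,000 URLs as easily as 20; the practical limit is the API's rate limit, so add a pause between calls if the provider requires one.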
Related Questions
-
Is one page with long content better than multiple pages with shorter content?
(Note: the site links are from a sandbox site with very low DA and PA.) If you look at this page, you will see at the bottom a lengthy article detailing all of the properties of the product categories in the links above: http://www.aspensecurityfasteners.com/Screws-s/432.htm
My question is: is there more SEO value in having the one long article on the general product category page, or in breaking up the content and moving the sub-topics into the more specific sub-category pages? e.g.
http://www.aspensecurityfasteners.com/Screws-Button-Head-Socket-s/1579.htm
http://www.aspensecurityfasteners.com/Screws-Cap-Screws-s/331.htm
http://www.aspensecurityfasteners.com/Screws-Captive-Panel-Scre-s/1559.htm
Moz Pro | AspenFasteners
-
Block Moz (or any other robot) from crawling pages with specific URLs
Hello! Moz reports that my site has around 380 duplicate content pages. Most of them come from dynamically generated URLs that carry specific parameters. I have sorted this out for Google in Webmaster Tools (the new Google Search Console) by blocking the pages with these parameters. However, Moz is still reporting the same number of duplicate content pages, and to stop it I know I must use robots.txt. The trick is that I don't want to block every page, just the pages with specific parameters. Among these 380 pages there are some other pages with no parameters (or different parameters) that I need to take care of, so I need to clean this list to be able to use the feature properly in the future. I have read through the Moz forums and found a few related topics, but there is no clear answer on how to block only pages with specific URLs. Therefore, I have done my research and come up with these lines for robots.txt:

User-agent: dotbot
Disallow: /*numberOfStars=0
User-agent: rogerbot
Disallow: /*numberOfStars=0

My questions: 1. Are the above lines correct, and would they block Moz (dotbot and rogerbot) from crawling only pages that have the numberOfStars=0 parameter in their URLs, leaving other pages intact? 2. Do I need an empty line between the two groups, i.e. between "Disallow: /*numberOfStars=0" and "User-agent: rogerbot", or does it even matter? I think this would help many people, as there is no clear answer on how to block crawling of only pages with specific URLs. Moreover, this should be valid for any robot out there. Thank you for your help!
Moz Pro | Blacktie
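One way to sanity-check a wildcard pattern like this before deploying it: a small script that applies Googlebot-style wildcard matching to sample URLs. This is a sketch of the matching logic only, on the assumption that rogerbot and dotbot honour the same '*' and '$' conventions; it is not Moz's actual parser.

import re

def robots_pattern_blocks(pattern, path):
    # Translate robots.txt wildcards into a regex: '*' matches any
    # character sequence, and a trailing '$' anchors the end of the URL.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

pattern = "/*numberOfStars=0"
tests = [
    "/products?numberOfStars=0",         # should be blocked
    "/products?numberOfStars=0&page=2",  # should be blocked (prefix match)
    "/products?numberOfStars=5",         # should stay crawlable
    "/products",                         # should stay crawlable
]
for path in tests:
    verdict = "blocked" if robots_pattern_blocks(pattern, path) else "allowed"
    print(path, "->", verdict)

-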
Tool recommendation for Page Depth?
I'd like to crawl our ecommerce site to see how deep (clicks from home page) pages are. I want to verify that every category, sub-category, and product detail page is within three clicks of the home page for googlebot. Suggestions? Thanks!
Moz Pro | Garmentory
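For a homegrown check, a breadth-first crawl starting from the home page gives each URL's minimum click depth, which is exactly the number being asked about. A minimal sketch using the requests and beautifulsoup4 packages (both assumptions, installable via pip), restricted to same-site links:

from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"  # substitute your home page
MAX_DEPTH = 3

def crawl_depths(start, max_depth):
    # Breadth-first traversal: the first time a URL is reached is, by
    # definition, its minimum click depth from the home page.
    site = urlparse(start).netloc
    depths = {start: 0}
    queue = deque([start])
    while queue:
        url = queue.popleft()
        if depths[url] >= max_depth:
            continue
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == site and link not in depths:
                depths[link] = depths[url] + 1
                queue.append(link)
    return depths

for url, depth in sorted(crawl_depths(START, MAX_DEPTH).items(),
                         key=lambda item: item[1]):
    print(depth, url)

Any category or product page missing from the output was not reachable within MAX_DEPTH clicks and needs attention.

-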
How long for authority to transfer from an old page to a new page via a 301 redirect? (& Moz PA score update?)
Hi, approximately how long does Google take to pass authority via a 301 from an old page to its new replacement page? And does Moz Page Authority reflect this in its score once Google has passed it? All the best,
Dan
Moz Pro | Dan-Lawrence
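While waiting on Google, it is worth confirming that the redirect actually returns a 301 (not a temporary 302) and points directly at the new page, since multi-hop chains can slow the transfer. A quick check with the requests library (the URL is a placeholder):

import requests

OLD_URL = "https://www.example.com/old-page"  # substitute the redirected URL

resp = requests.get(OLD_URL, allow_redirects=False, timeout=10)
print(resp.status_code)               # expect 301 for a permanent redirect
print(resp.headers.get("Location"))   # expect the new page's URL

# Following the chain end-to-end also reveals multi-hop redirects.
final = requests.get(OLD_URL, timeout=10)
print([r.status_code for r in final.history], "->", final.url)

-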
Text analysis tool: WDF*IDF - Within Document Frequency x Inverse Document Frequency / tools?
Checking keyword density is just too primitive... what is your recommendation on the subject of WDF*IDF? The German SEO tool onpage.org offers an interesting feature for analysing your text, but there are differences between languages, and factors like proximities, synonyms, etc. come into play. What are your experiences? Which tools do you use? Is Moz developing a tool for this? It would be a nice feature for the On-Page Grader! Best regards,
Holger
Moz Pro | inlinear
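For experimenting without a dedicated tool, the core WDF*IDF calculation is short enough to script. Formulas vary between tools; the sketch below uses one common formulation (WDF as a log-dampened term frequency, IDF over a small comparison corpus) and ignores the proximity and synonym factors mentioned above. The sample documents are made up.

import math
import re

def tokenize(text):
    return re.findall(r"[a-zäöüß]+", text.lower())

def wdf(term, doc_tokens):
    # Within-document frequency: log2(freq + 1) / log2(doc length),
    # a dampened alternative to raw keyword density.
    freq = doc_tokens.count(term)
    return math.log2(freq + 1) / math.log2(len(doc_tokens))

def idf(term, corpus):
    # Inverse document frequency over the comparison corpus.
    containing = sum(1 for doc in corpus if term in doc)
    return math.log2(1 + len(corpus) / containing) if containing else 0.0

docs = [
    "security fasteners and tamper proof screws for industrial use",
    "screws bolts and other fasteners explained for beginners",
    "a short guide to socket head cap screws and washers",
]
corpus = [tokenize(d) for d in docs]
page = corpus[0]  # the page being analysed

for term in sorted(set(page)):
    print(f"{term:12s} {wdf(term, page) * idf(term, corpus):.3f}")

-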
Page Authority is the same on every page of my site
I'm analyzing a site and the Page Authority is exactly the same for every page on the site. How can this be, since Page Authority is supposed to be unique to each page?
Moz Pro | azjayhawk
-
Duplicate page titles: the same URL listed twice
The system says I have two duplicate page titles. The page titles are exactly the same because the two URLs are exactly the same. These same two identical URLs also show up under Duplicate Page Content, because they are the same. We also have a blog, and there were two tag pages showing identical content. I have now blocked the blog in robots.txt, because the blog is only for writers; I suppose I could have just blocked the tag pages.
Moz Pro | loopyal
-
Use of the tilde in URLs
I just signed up for SEOmoz and sent my site through the first crawl. I use the tilde in my rewritten URLs. This threw my entire site into the Notices section with 301 (permanent redirect), since each page redirects to the exact same URL with the ~ rather than the %7e. I find conflicting information on the web: you can use the tilde under more recent coding guidelines, where you couldn't under the old ones. It would be a huge job to change every page on my site to use an underscore instead of a tilde in the URL. If Google is like SEOmoz and is 301-redirecting every page on the site, then I'll do it, but is it just an SEOmoz thing? I ran my site through Firebug, and all my pages show the 200 response header, not the 301 redirect. Thanks for any help you can provide.
Moz Pro | fdb