Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content, and many posts will still be viewable, we have locked both new posts and new replies.
Unsolved Does Moz Pro include Moz Local?
-
My client has bought about six Moz Local accounts and is pleased with the results. We have not yet used your Moz Pro program. The client might be interested in switching to Moz Pro if those Moz Local accounts can be included in it. Please let me know as soon as possible. Thanks!
-
@Vmaya You guys should consider combining the packages.
-
Moz Pro does not include Moz Local, but it offers a number of features that can be helpful, such as keyword research, link analysis, and competitive intelligence.
-
Well, it does include Moz Local, because I am using it for my site, which provides children's book writing services as well as author website services, and I am using Moz Local to rank in the local SERPs for the US market.
Related Questions
-
Unsolved Can I set up a white-labeled version of Moz Local?
I am wondering whether Moz Pro account users have the option to set up a white-labeled version of the Moz Local Listings Checker, and if so, how? It would be an excellent tool to entice customers to sign up for our services.
Moz Local | | KoreyKorfi0 -
Unsolved Ooops. Our crawlers are unable to access that URL
Hello, I entered my site faroush.com but I got an error: "Ooops. Our crawlers are unable to access that URL - please check to make sure it is correct." What is the problem?
Moz Pro | | ssblawton2533 -
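As general background for errors like this, the usual first checks are whether robots.txt disallows Moz's crawlers (rogerbot for Site Crawl, dotbot for the link index) and whether the server or a CDN blocks unfamiliar user agents with a 4xx/5xx response or a redirect loop. Below is a minimal diagnostic sketch in Python; the user-agent string is simplified and the requests library is assumed, so treat it as an illustration rather than Moz's own tooling.

```python
import requests

SITE = "https://faroush.com"  # the site mentioned in the question

# 1. Look at robots.txt for rules that might block rogerbot or dotbot.
robots = requests.get(f"{SITE}/robots.txt", timeout=10)
print("robots.txt status:", robots.status_code)
for line in robots.text.splitlines():
    if line.strip().lower().startswith(("user-agent", "disallow")):
        print("   ", line.strip())

# 2. Fetch the homepage with a crawler-like user agent and inspect the response.
resp = requests.get(
    SITE,
    headers={"User-Agent": "rogerbot"},  # simplified UA string, for illustration only
    timeout=10,
)
print("homepage status:", resp.status_code)
print("final URL after redirects:", resp.url)
```

If robots.txt disallows the crawler, or the status code is anything other than 200 when a crawler-like user agent is sent, that is usually the place to start.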
Unsolved Is Moz Able to Track Internal Links Per Page?
I am trying to track internal links and identify orphan pages. What is the best way to do this?
Moz Pro | | WebMarkets0 -
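For background on the general technique, independent of any specific tool: orphan pages are usually approximated by comparing the URLs listed in the XML sitemap against the URLs actually reachable by following internal links from the homepage. A rough sketch of that comparison, assuming a hypothetical example.com site, a sitemap at /sitemap.xml, and the requests and beautifulsoup4 packages:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

SITE = "https://www.example.com"      # hypothetical site
SITEMAP = f"{SITE}/sitemap.xml"       # hypothetical sitemap location

def sitemap_urls(sitemap_url: str) -> set[str]:
    """Collect <loc> entries from a (non-index) XML sitemap."""
    xml = requests.get(sitemap_url, timeout=10).text
    root = ET.fromstring(xml)
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}

def crawl_internal_links(start: str, limit: int = 500) -> set[str]:
    """Breadth-first crawl of same-host links, starting from the homepage."""
    seen, queue = set(), [start]
    host = urlparse(start).netloc
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in seen:
                queue.append(link)
    return seen

linked = crawl_internal_links(SITE)
orphans = sitemap_urls(SITEMAP) - linked
print(f"{len(orphans)} URLs are in the sitemap but not internally linked:")
for url in sorted(orphans):
    print("  ", url)
```

The limit parameter keeps the sketch bounded; a real crawl would also need to respect robots.txt and skip non-HTML responses.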
Unsolved New Google SERP Now Lazy Loading
Hello, Google has changed the way it displays its SERPs by lazy loading new listings. Because of this, the Moz Chrome extension no longer shows results past the first set of results. I also often find that when a site is ranked above the Maps results, it doesn't display results. Is there a fix for this?
Moz Bar | | HercMagnus0 -
Unsolved Why is no one at Moz customer service responding? They stole my money
After making payment for Moz Local, I was asked to click the location ID to get started, but it just loops back to the subscription page. All the buttons just loop. I sent 10 emails to Moz and they claim to respond within 24 hours, but it has been longer than that! Is this a scam service, and was I just robbed of my money? How do I call these people, since they are not responding to emails?
Moz Local | | ShopSplendor20 -
Ahrefs vs Moz
Hi! I noticed the Moz DA and the Ahrefs DA are very different. Where https://www.123opzeggen.nl/ has a DA of 10 at Moz, the DA at Ahrefs is 26. Where does this big difference come from? Do you measure in different ways? I hope you can answer this question for me. Thank you in advance!
Moz Pro | | NaomiAdivare2 -
Block Moz (or any other robot) from crawling pages with specific URLs
Hello! Moz reports that my site has around 380 duplicate page content issues. Most of them come from dynamically generated URLs that have some specific parameters. I have sorted this out for Google in Webmaster Tools (the new Google Search Console) by blocking the pages with these parameters. However, Moz is still reporting the same number of duplicate content pages and, to stop it, I know I must use robots.txt. The trick is that I don't want to block every page, just the pages with specific parameters. I want to do this because among these 380 pages there are some other pages with no parameters (or different parameters) that I need to take care of. Basically, I need to clean this list to be able to use the feature properly in the future. I have read through the Moz forums and found a few topics related to this, but there is no clear answer on how to block crawling of only pages with specific URLs. Therefore, I have done my research and come up with these lines for robots.txt:
User-agent: dotbot
Disallow: /*numberOfStars=0
User-agent: rogerbot
Disallow: /*numberOfStars=0
My questions:
1. Are the above lines correct, and would they block Moz (dotbot and rogerbot) from crawling only the pages that have the numberOfStars=0 parameter in their URLs, leaving other pages intact?
2. Do I need an empty line between the two groups (between "Disallow: /*numberOfStars=0" and "User-agent: rogerbot"), or does it even matter?
I think this would help many people, as there is no clear answer on how to block crawling of only pages with specific URLs. Moreover, this should be valid for any robot out there. Thank you for your help!
Moz Pro | | Blacktie
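For readers landing on this archived thread: the * wildcard is not part of the original robots.txt standard, so support varies by crawler (Googlebot honors it; check Moz's documentation for rogerbot and dotbot). Python's built-in urllib.robotparser does not handle * wildcards, so a quick way to sanity-check a rule like this is to translate the Disallow pattern into a regular expression and test it against sample URLs. A minimal sketch, using hypothetical example URLs:

```python
import re

def disallow_to_regex(pattern: str) -> re.Pattern:
    """Convert a robots.txt Disallow pattern with '*' wildcards into a
    prefix-matching regular expression (Googlebot-style semantics)."""
    # Escape regex metacharacters, then turn the escaped '*' back into '.*'.
    escaped = re.escape(pattern).replace(r"\*", ".*")
    # A '$' at the end of the original pattern anchors the match; otherwise
    # the rule matches any path that starts with the pattern.
    if escaped.endswith(r"\$"):
        escaped = escaped[:-2] + "$"
    return re.compile(escaped)

rule = disallow_to_regex("/*numberOfStars=0")

# Hypothetical example paths (matching is done against path + query string).
urls = [
    "/products?numberOfStars=0&page=2",   # should be blocked
    "/products?numberOfStars=5",          # should stay crawlable
    "/about-us",                          # should stay crawlable
]

for url in urls:
    blocked = bool(rule.match(url))
    print(f"{url!r} -> {'blocked' if blocked else 'allowed'}")
```

This only mimics the prefix-plus-wildcard matching that crawlers like Googlebot document; the authoritative check for any given bot is its own documentation or the crawl log after the rule goes live.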