Schedule crawls for 2 subdomains every 24 hours
-
I saw at this link:
http://pro.seomoz.org/tools/crawl-test
"As a PRO member, you can schedule crawls for 2 subdomains every 24 hours, and you'll get up to 3,000 pages crawled per subdomain."
However I am having trouble finding where to schedule this 24 hour crawl in my Pro Dashboard. I did not see the option for this setting in the crawl diagnostics tab or in the campaign settings section from the dashboard home page. Can you help?
thanks!
Michael
-
Generally, I correct the error and then wait until the next crawl if I'm confident that I've fixed the problem.
If I'm unsure, I may run a manual crawl using the crawl test tool; however, for me the format just isn't as easy to read as the weekly crawl diagnostics.
-
Interesting, thank you both for the feedback. I am new to SEOmoz and am just curious how most users use the crawl tool when they fix errors. Do you wait until the next week's crawl to see if the error is resolved, or do you immediately use the crawl test tool to check whether your changes corrected the errors (saving time)? No right or wrong answer here, just curious how most SEOmoz Pro users use the tool. Any tips or tricks from other users on how they use the tool, no matter how big or small, would be greatly appreciated! thanks
-
Shelly is right on here. Send an email to help@seomoz.org and they'll answer your question for you.
-
You can use the crawl test tool to run 2 crawls per day for up to 3,000 pages each; however, as far as I'm aware the SEO Tools are moving more towards campaign setups, where your sites are crawled once per week (or within 24 hours for a new campaign).
It might be worth sending a request to SEOmoz help to get the team to clarify.
Related Questions
-
Moz crawling doesn't show all of my Backlinks
Hello, I'm trying to make an SEO backlinks report on my website. When using the Link Explorer, I see only a few backlinks, while I have many more backlinks on this website. Does anyone have an idea about how to fix this issue? How can I check and correct this? My website is www.signsny.com.
Moz Pro | | signsny1 -
Crawl Diagnostics - 350 Critical errors? But I used rel-canonical links
Hello Mozzers, We launched a new website on Monday and had our first Moz crawl on 01/07/15, which came back with 350+ critical errors. The majority of these were for duplicate content. We had a situation like this for each gym class:
GLOBAL YOGA CLASS (canonical link / master record)
YOGA CLASS BROMLEY
YOGA CLASS OXFORD
YOGA CLASS GLASGOW
etc.
All of these local yoga pages had the canonical link deployed. So why is this regarded as an error by Moz? Should I have added robots NOINDEX instead? Would that help? Very scared our rankings are gonna get affected 😞 Ben
Moz Pro | | Bendall0 -
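A minimal sketch of the canonical setup described in the question above, assuming the local class pages should all point at the global master record (the domain and paths here are illustrative, not from the original post):

```html
<!-- On each local page, e.g. example.com/yoga-class-bromley -->
<head>
  <title>Yoga Class Bromley</title>
  <!-- Points the duplicate at the master record; note that crawl tools
       may still flag such pages as duplicates even when the tag is correct -->
  <link rel="canonical" href="https://example.com/global-yoga-class" />
</head>
```

A canonical tag is a hint rather than a directive, which is one reason crawl tools can still report pages like these as duplicate content.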
How can I set up a campaign to track just directories on a specific subdomain?
I am trying to set up a campaign to track a specific subdomain and all its directories. For example, I want to track example.abc.com/11111 and example.abc.com/22222 and so on. No interest in tracking abc.com itself. Is this possible?
Moz Pro | | BalihooSearch1 -
Is it possible to request another unscheduled crawl?
We have just sorted a couple of issues on the website which threw the crawl into spasm and gave us hundreds of hugely long URLs. We are pretty sure that we have corrected this and do not want to wait another week to check what SEOmoz comes up with. Is there any way that we can request a special crawl of the website so that we can hopefully just be left with any legitimate remaining issues?
Moz Pro | | dmckenzie4560 -
Help with URL parameters in the SEOmoz crawl diagnostics Error report
The crawl diagnostics error report is showing tons of duplicate page titles for my pages that have filtering parameters. These parameters are blocked inside Google and Bing webmaster tools. How do I block them within the SEOmoz crawl diagnostics report?
Moz Pro | | SunshineNYC0 -
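One workaround, not confirmed in the thread and offered as an assumption, is to block the filtering parameters in robots.txt for Moz's crawler (rogerbot), assuming it honors wildcard patterns; the parameter names below are illustrative:

```text
# robots.txt — block crawl of filtered URL variants
User-agent: rogerbot
Disallow: /*?color=
Disallow: /*?sort=
```

This only affects crawlers that obey robots.txt; the parameter handling configured in Google and Bing webmaster tools applies only to those engines' own bots.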
I have a Rel Canonical "notice" in my Crawl Diagnostics report. I'm presuming that means the spider has detected a rel canonical tag and it is working, as opposed to warning about an issue. Is this correct?
I know this seems like a really dumb question, but the site I'm working on is a BigCommerce one and I've been concerned about canonicalisation issues prior to receiving this report (I'm an SEOmoz Pro newbie also!), and I just want to be clear I am reading this notice correctly. I presume this means that the site crawl has detected the rel canonical tag on these pages and it is working correctly. Is this correct?? Any input is much appreciated. Thanks
Moz Pro | | seanpearse0 -
Crawl reports URLs with duplicate content, but it's not the case
Hi guys!
Some hours ago I received my crawl report. I noticed several records of URLs flagged with duplicate content, so I went to open those URLs one by one.
None of those URLs really had duplicate content, but I have a concern, because the website is a product showcase and many articles are just images with an href behind them. Many of those articles use the same images, so maybe that's why the SEOmoz crawler's duplicate content flag is raised. I wonder if Google has a problem with that too. See for yourself how it looks: http://by.vg/NJ97y
http://by.vg/BQypE Those two URLs are flagged as duplicates... please mind the language (Greek) and try to focus on the URLs and content. ps: my example is simplified just for the purpose of my question. The report section in question is "URLs with Duplicate Page Content (up to 5)".
Moz Pro | | MakMour0 -
Does the SEOmoz weekly crawl that highlights missing meta description tags take into account whether there is a meta robots noindex,follow tag on the pages it flags?
The weekly crawl website report is telling me that there are pages that have missing meta description tags, yet I've implemented meta robots tags to 'noindex, follow' those pages, which is visible in those pages' source files. As far as Google is concerned, surely this won't be a problem, since it is being instructed NOT to consider these specific pages for indexing. I am assuming that the weekly SEOmoz website crawl is simply throwing the missing meta description findings into its report without itself observing that the particular URLs contain the meta robots 'noindex,follow' tag? I'd appreciate it if you can clarify whether this is the case. It would help me understand that (at least in terms of my efforts towards Google) your own crawl doesn't observe the meta robots tag instruction, hence the report flagging the discrepancy.
Moz Pro | | callassist0
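For reference, the noindex,follow tag discussed in the question above is a one-line addition to each page's head; this is a sketch, and which pages it belongs on depends on your setup:

```html
<head>
  <!-- Ask search engines not to index this page, but still follow its links -->
  <meta name="robots" content="noindex, follow" />
</head>
```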