How often does seomoz crawl the site? Can you force a crawl at a specific time ?
-
Thanks, sussed now
-
Yep. Depends on exactly when you set up your campaigns. You will get an email when the SEOmoz bot has crawled your site, and shortly after that you will get a ranking update email.
You can't control the crawl rate.
-
Thanks guys, as I suspected. Worth a look at this by the Moz dev team.
-
Yeah, same here. It is done on a recurring weekly basis, on the same day of the week you initially set up the campaign.
-
All my campaigns (different sites) are crawled weekly. As far as I understand, you can't control this.
Related Questions
-
Can we disavow all spammy looking sites in OSE with a spam score of 5 or above?
Hello, We'd like to use OSE to make a disavow list. Can we just go through everything with a spam score of 5 or higher that looks like spam when we visit the site and disavow all of them? I'll be using Moz pro, are there any other free tools that I can utilize? What do I keep in mind? Thanks!
Moz Pro | BobGW
-
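Not part of Moz's tooling, but a minimal sketch of how a disavow file could be assembled from a link export, assuming a CSV-style export with hypothetical `url` and `spam_score` fields (check your actual export headers, which may differ). Google's disavow file format takes one `domain:` line per root domain:

```python
from urllib.parse import urlparse

def build_disavow(rows, threshold=5):
    """Collect root domains whose spam score meets the threshold.

    `rows` is an iterable of dicts with hypothetical keys
    'url' and 'spam_score' -- swap in csv.DictReader over a
    real export file as needed.
    """
    domains = set()
    for row in rows:
        if int(row["spam_score"]) >= threshold:
            domains.add(urlparse(row["url"]).netloc)
    # Google's disavow file format: one "domain:" line per domain
    return "\n".join(f"domain:{d}" for d in sorted(domains))

# Example with inline data standing in for a CSV export:
sample = [
    {"url": "http://spammy.example/page1", "spam_score": "7"},
    {"url": "http://clean.example/home", "spam_score": "2"},
]
print(build_disavow(sample))  # -> domain:spammy.example
```

As the original answers note, a score threshold alone is a blunt instrument; a manual review of each site before it goes in the file is still the sensible step.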
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication.

I am in the process of re-setting up my campaigns to get better data than I have been getting -- I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set them to include www or not for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what the options are for getting good data in light of the 50,000-page crawl limit. Here is an example of what I mean. To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence:

- www.epa.gov has 560,000 pages. If I put www.epa.gov into a campaign, what happens with the site having so many more pages than the 50,000-page crawl limit? What do I miss out on? Can I "trust" what I get?
- www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely, and I am getting a complete picture of this air-focused sub-folder. But (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true: I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air-quality sites, and maybe those competitors count toward the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?)

Any opinions on which I should do in general in this kind of situation? The small sub-folder vs. the full humongous site, or is there some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad
-
How do I run a Moz crawl of my site before waiting for the scheduled weekly crawl?
Greetings: I have just updated my site and would like to run a crawl immediately. How can I do so before waiting for the next Moz crawl? Thanks, Alan
Moz Pro | Kingalan1
-
SEOmoz link report vs. Open Site Explorer
Hi, I run a campaign for one of my new clients. In the links report I see 1,970 external links and can press "see more in Open Site Explorer". When I press the button, Open Site Explorer opens, but with a message that there is no link data for this website. Any advice? Are you familiar with another tool that can help me investigate links to a website? Thank you, SEOwise
Moz Pro | iivgi
-
SEOmoz Campaign Setup: What would you do?
I've been using Moz for nearly a year now. When I initially set up our site (rac.com.au), it was as a single campaign for the rac.com.au domain. However, our business has a number of different product offerings and marketing teams (insurance, finance, travel/tourism, automotive, security, etc.). Would I be better off setting up each of these 'sub' brands as a campaign of its own using the 'folder' campaign setup? For example, a campaign for the insurance business (rac.com.au/insurance) folder, the finance business (rac.com.au/finance), and so forth? The advantage I see to the folder campaign setup is more specific reporting for each of the business units. Are there disadvantages to this? Any recommendations would be greatly appreciated. Cheers, Ryan
Moz Pro | Hutch_e
-
SEOmoz Dashboard Report: Crawl Diagnostic Summary
Hi there, I'm noticing that the total errors for our website have been going up and down drastically almost every other week. Four weeks ago there were over 10,000 errors; two weeks ago there were barely 1,000; today it's back to over 12,000. It says the majority of the errors are duplicate page content and duplicate page titles, but we haven't made any changes to the titles or the content. Some insight and explanation for this would be much appreciated. Thanks, Gemma
Moz Pro | RBA
-
How to remove duplicate content due to URL parameters from SEOmoz Crawl Diagnostics
Hello all. I'm currently getting back over 8,000 crawl errors for duplicate content pages. It's a Joomla site with VirtueMart, and 95% of the errors are for parameters in the URL that the customer can use to filter products. Google is handling them fine under Webmaster Tools' URL parameter settings, but it's pretty hard to find the other duplicate content issues in SEOmoz with all of these in the way. All of the problem parameters start with ?product_type_. Should I try to use robots.txt to stop them from being crawled, and if so, what would be the best way to include them in robots.txt? Any help greatly appreciated.
Moz Pro | dfeg
-
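For reference, the usual way to block such filter parameters is a wildcard Disallow rule. A sketch, assuming the parameter only ever appears in the query string; `*` wildcard matching is honored by Googlebot, but the SEOmoz crawler's wildcard support should be verified before relying on this:

```text
User-agent: *
# product_type_ as the first query parameter
Disallow: /*?product_type_
# product_type_ as a later parameter
Disallow: /*&product_type_
```

The second rule catches URLs where the filter is not the first parameter (e.g. `?page=2&product_type_color=red`), which the first rule alone would miss.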
What the hell...spam on SEOMOZ!
I received this in my Private Messages section: My name is Fatima,i saw your profile at/www.seomoz.org/today and became intrested in you,i will also like to know you the more,and i want you to send an email to my email address so i can give you my picture for you to know whom i am.Here is my email address (fatimababy06@yahoo.com) I believe we can move from here I am waiting for your mail to my email address above.Fatima(Remeber the distance or colour does not matter but love matters alot in life) How can somebody spam like this on protected forum?
Moz Pro | IM_Learner2