Campaign Crawl
-
I have a site with 8,036 pages in my sitemap index, but the Moz bot only crawled 2,169 pages. It's been several months, and each week it crawls roughly the same number of pages. Any idea why I'm not getting fully crawled?
-
The best and most efficient way to resolve any question about Moz crawls or tools is to go directly to Moz support (sometimes there's enough unique behind-the-scenes activity that forum members don't have visibility into it). Their team is great and has helped me a handful of times, quickly and politely.
-
Hi Jack,
Common causes include server errors, redirect loops, pages disallowed in robots.txt, and so on.
Here are some suggestions from www.seomoz.org/help/crawl-diagnostics:
"Why didn’t you crawl all my pages? I only got a one page crawl. Looks like you missed a bunch!
If you suspect you didn’t get a full crawl, or Rogerbot missed some of your pages, there could be several reasons why this happens.
- We only crawl a maximum of 400 links per page. If several pages of your site all have the same 400 links on each page, we may not discover all the pages on your site. Try optimizing your navigation to reduce the number of links.
- Does your navigation rely on JavaScript? Can visitors navigate your site with JavaScript disabled? SEOmoz doesn’t crawl JavaScript, so make sure your links work in all browsing environments.
- Does your site consist of multiple subdomains? Crawls are restricted to the subdomain you set your campaign up on. This means that in general, we don't crawl multiple subdomains. You can solve this by specifying a “Root Domain” crawl in the setup process. (This requires starting a new campaign.)"
If you are certain you don't have any of the above problems, then I would suggest contacting help@seomoz.org
Good luck.
Mike
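The robots.txt cause mentioned above is easy to rule out programmatically. A minimal sketch using Python's standard-library `urllib.robotparser`; the robots.txt content and URLs here are hypothetical examples, not taken from the site in question:

```python
from urllib import robotparser

# Hypothetical robots.txt content for illustration; a real check would
# fetch https://yoursite.com/robots.txt instead.
robots_txt = """
User-agent: rogerbot
Disallow: /private/

User-agent: *
Disallow:
""".strip().splitlines()

rp = robotparser.RobotFileParser()
rp.parse(robots_txt)

# Pages under /private/ are off-limits to rogerbot, so a crawler
# honoring robots.txt will never reach them or anything linked only from them.
print(rp.can_fetch("rogerbot", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("rogerbot", "https://example.com/public/page.html"))     # True
```

Running this against your real robots.txt for a sample of the missing URLs will quickly confirm or eliminate this cause.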
Related Questions
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites; I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication. I am in the process of re-setting up my campaigns to get better data than I have been getting. I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set them to include www or not for what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what the options are for getting good data in light of the 50,000-page crawl limit.

Here is an example of what I mean. To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence. www.epa.gov has 560,000 pages; if I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get? www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely and I'll get a complete picture of this air-focused sub-folder, but (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true: I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air quality sites, and maybe those competitors count toward the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?)

Any opinions on which I should do in general in this kind of situation? The small sub-folder vs. the full humongous site, or is there some other way to go here that I'm not thinking of?
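One way to scope this decision is to count, from the sitemap itself, how many URLs fall under the sub-folder you'd track. A minimal sketch using Python's standard-library `xml.etree.ElementTree`, on a hypothetical sitemap fragment (a real check would fetch the agency's actual sitemap over HTTP):

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap fragment for illustration only.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.epa.gov/air/page1</loc></url>
  <url><loc>https://www.epa.gov/air/page2</loc></url>
  <url><loc>https://www.epa.gov/water/page1</loc></url>
</urlset>"""

# Sitemap elements live in the sitemaps.org namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

# How many of the listed URLs fall under the sub-folder you plan to track?
air_urls = [u for u in urls if u.startswith("https://www.epa.gov/air/")]
print(len(urls), len(air_urls))  # 3 2
```

If the sub-folder's share of the sitemap is well under the crawl limit, tracking it as its own campaign gives you complete coverage of that slice, at the cost of missing on-topic pages elsewhere on the domain.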
Moz Pro | scienceisrad
7,608 High Priority Crawl Diagnostic problems
Hey there, I have an e-commerce site that is showing 7,608 high-priority issues to fix, of which 7,536 are duplicate content. What's the most effective process to start with? I'm open to outsourcing some of the work to an expert; email me at dave@emanbee.com. Thanks for your time, Dave
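For a duplicate-content cleanup at that scale, a common first pass is to group URLs whose bodies are byte-for-byte identical and tackle the largest groups first. A minimal sketch with hypothetical page bodies and URLs (real pages would be fetched, and usually normalized, before hashing):

```python
import hashlib
from collections import defaultdict

# Hypothetical page bodies keyed by URL, for illustration only.
pages = {
    "/shirt?color=red": "<html><body>Blue Shirt product page</body></html>",
    "/shirt?color=blue": "<html><body>Blue Shirt product page</body></html>",
    "/pants": "<html><body>Pants product page</body></html>",
}

# Group URLs by a hash of their body; identical bodies share a digest.
groups = defaultdict(list)
for url, body in pages.items():
    digest = hashlib.sha256(body.encode()).hexdigest()
    groups[digest].append(url)

duplicates = [urls for urls in groups.values() if len(urls) > 1]
print(duplicates)  # [['/shirt?color=red', '/shirt?color=blue']]
```

On e-commerce sites the largest groups are often parameterized variants of one product page, which canonical tags or parameter handling can resolve in bulk.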
Moz Pro | emanbee
1 page crawled - again
Just had to let you know that it happened again. So right now we are at 2 out of the last 4 crawls. Uptime here is 99.8% for the last 30 days, with a small downtime due to an update process on 18/5 from around 2:30 to 4:30 GMT. In relation to: http://moz.com/community/q/1-page-crawled-and-other-errors
Moz Pro | alsvik
Order of urls in SEOMoz crawl report
Is there any rhyme or reason to the order of URLs in the SEOmoz crawl report, or are the URLs just listed in random order?
Moz Pro | LynnMarie
Crawl Diagnostics - Canonical Question
On one of my sites I have 61 notices for Rel Canonical. Is it bad to have these or is this just something that's informative?
Moz Pro | kadesmith
Too Many On-Page Links: Crawl Diag vs On-Page
I've got a site I'm optimizing that has thousands of 'too many links on-page' warnings from the SEOmoz crawl diagnostic. I've been in there and realized that there are indeed too many (the rent is too damned high), and it's due to a header/left/footer category menu that's repeating itself. So I changed these links to nofollow, cutting my total links by about 50 per page. I was too impatient to wait for a new crawl, so I used the On-Page Reports to see if anything would come up on the Internal Link Count/External Link Count factors, and nothing did. However, the crawl (eventually) came back with the same warning. I looked at the link count in the crawl details, and realized that it's basically counting every single `<a href>` on the page. Because of this, I guess my questions are twofold: 1. Is nofollow a valid strategy to reduce link count for a page? (Obviously not for the SEOmoz crawler, but for Google.) 2. What metric does the On-Page Report use to determine if there are too many internal/external links? Apologies if this has been asked; the search didn't seem to come up with anything specific to this.
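The crawl's raw link count behaves like a simple tag counter: every `<a href>` increments it, nofollow or not, which is why nofollow doesn't change the number. A minimal sketch of that counting behavior using Python's standard-library `html.parser`, on a hypothetical page fragment:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a href> tags the way a raw crawl-side link count would."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; count any anchor with an href.
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

# Hypothetical page fragment; note the nofollow link still counts.
html = """
<nav><a href="/cat/1">Cat 1</a><a href="/cat/2" rel="nofollow">Cat 2</a></nav>
<p>No link here.</p>
<footer><a href="/about">About</a></footer>
"""
parser = LinkCounter()
parser.feed(html)
print(parser.count)  # 3
```

To actually lower the count, the links have to be removed from the HTML (e.g. collapsing the repeated header/left/footer menu), not merely marked nofollow.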
Moz Pro | icecarats
Why is the SEOmoz crawler crawling the old version of our website?
Hello, I'm a new SEOmoz member. On Dec. 2nd, after completely redesigning our website, we migrated to a new hosting company by switching our DNS to the new server. The vast majority of the URLs have changed, and we configured redirects of the old URLs to the new ones, although this task is not completed yet. After the migration, I created an account on SEOmoz to be able to track our progress and find the issues to fix to optimize our SEO. For some reason, the SEOmoz reports show the old URLs. Unless the crawler does not actually crawl the pages and only uses the indexed pages to generate its report, I don't understand how this could be possible. Anyone have a clue? When will the new URLs be indexed by SEOmoz and the major search engines? Thanks for your help!
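While finishing the redirect work, one useful sanity check is that every old URL resolves to exactly one final new URL, with no chains or loops that could strand a crawler on old paths. A minimal sketch against a hypothetical redirect map (in practice these rules live in your server config, e.g. .htaccess or nginx):

```python
# Hypothetical old-URL -> new-URL redirect map, for illustration only.
REDIRECTS = {
    "/old/about": "/about-us",
    "/old/contact": "/contact",
    "/loop/a": "/loop/b",
    "/loop/b": "/loop/a",
}

def resolve(url, redirects, max_hops=10):
    """Follow redirect rules to a final URL; return None on a loop or long chain."""
    seen = {url}
    while url in redirects:
        url = redirects[url]
        if url in seen:
            return None  # redirect loop detected
        seen.add(url)
        if len(seen) > max_hops:
            return None  # chain too long; crawlers give up on these
    return url

print(resolve("/old/about", REDIRECTS))  # /about-us
print(resolve("/loop/a", REDIRECTS))     # None (loop detected)
```

Running a check like this over the full list of old URLs, then spot-checking the live redirects return 301s, helps confirm crawlers will converge on the new URLs once they recrawl.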
Moz Pro | Gestisoft-Qc