Crawl Diagnostics only crawling 250 pages, not 10,000
-
My Crawl Diagnostics report has suddenly dropped from 10,000 pages to just 250. I've been tracking and working on an ecommerce website with 102,000 pages (www.heatingreplacementparts.co.uk), and the history for this was showing some great improvements. Suddenly the Crawl Diagnostics report today is showing only 250 pages! What has happened? Not only is this frustrating to work with, as I was chipping away at the errors and warnings, but my graphs for reporting to my client are now all screwed up. I have a Pro plan and nothing has (or should have!) changed.
-
Hey Scott,
I just checked out your campaigns and everything looks good right now. We're really sorry about any inconvenience this may have caused. Let me update you on what happened and what we've done to make sure it doesn't happen again.
Over the weekend our server hosting provider experienced temporary power outages that lasted a few hours. During the outage, some of the databases that contain user membership status went offline, and our crawlers assumed those campaigns had been archived. When the database servers came back online, the crawlers then treated the campaigns as having been unarchived.
In the past our practice has been to kick off a 250-page starter crawl when a campaign is unarchived and schedule the full crawl for 7 days out; your campaign would still have received a full crawl on its next scheduled date. This is much like what happens when you first create a campaign. It isn't ideal, though, for a couple of reasons: one is a scenario like what happened over the weekend, and two, it can skew your historical data by leaving a 250-page crawl stuck in the middle, even when the archiving was intentional.
Moving forward we will be changing this so that when you unarchive a campaign, your full crawl is scheduled and you won't receive a starter crawl. If you need more immediate crawl data, I recommend using our crawl test tool, which can crawl up to 3,000 pages. The only difference is that the results come as a CSV file rather than the pretty web interface.
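If you do pull crawl data from the crawl test tool as a CSV, you can still summarize it for client reports. Here's a minimal sketch of tallying HTTP status codes from such an export; the column name `HTTP Status Code` and the sample rows are assumptions, so check the header row of your actual export and adjust:

```python
import csv
import io
from collections import Counter

def summarize_crawl(csv_text):
    """Tally HTTP status codes from a crawl-export CSV.

    Assumes a header row containing a column named "HTTP Status Code";
    adjust the key below to match your actual export.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["HTTP Status Code"] for row in reader)

# Hypothetical sample resembling a crawl export
sample = """URL,HTTP Status Code
http://www.example.co.uk/,200
http://www.example.co.uk/parts,404
http://www.example.co.uk/boilers,200
"""

counts = summarize_crawl(sample)
print(counts["200"], counts["404"])  # counts of OK pages vs. 404 errors
```

A tally like this lets you track error counts between crawls even when the data arrives as a flat file instead of the web interface.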
Let me know if you have any additional questions. Also, in the future, if you are experiencing any issues with your service, go ahead and let our support team know. If you go to seomoz.org/help you can generate a help ticket quite easily. Once you generate a support ticket, our Help Team will keep you up to date on any issues with your account and work with you to resolve them as quickly as possible.
Again, my sincere apologies for this issue with your crawl.
Have a great day!
Kenny
-
Many thanks Keri
-
Hi Scott,
We have rolled out a fix for this! I'm waiting to hear how long it will take to get through the backlog of crawls, but did want to let you know that your campaign is being worked on.
Keri
-
Thanks Keri. If you could please keep me informed that will help me to explain this to clients.
regards,
Scott.
-
I think we've had a bug, Scott. A couple of SEOmoz staff also got emails that the starter crawl had finished. We're looking into this to figure out what has happened, and we really apologize. I'm assigning this to the help desk, and they'll comment when we have some more information.
-
If you ran the crawler today, then yes, SEOmoz runs a 250-page starter crawl by default, and the crawler then takes up to 7 days to scan all of your website's pages.