Joined yesterday; today crawl errors (incorrectly) show as zero...
-
Hi.
We set up our SEOmoz account yesterday, and the initial crawl showed a number of errors and warnings which we were in the process of reviewing and resolving.
I logged into SEOmoz today and it's showing 0 errors:
Pages Crawled: 0 | Limit: 10,000
Last Crawl Completed: Nov. 27th, 2012 | Next Crawl Starts: Dec. 4th, 2012
Errors, warnings, and notices all show as 0, and the issues found yesterday appear only in the change indicators. Is there no way of getting back to yesterday's results other than waiting a week? We were hoping to continue working through the issues that were found!
-
Hi Neil,
Thanks for getting in touch with us. Since this is one of our known issues, I will open a ticket for you so we can communicate more efficiently. Thanks for the comment, Corey!
Neil, I will talk to you on the ticket.
~Peter
SEOmoz Help Team.
-
You may want to check with support, but I suspect not. I've seen similar things happen with the Moz toolset for our clients all the time. Eventually, the crawler comes back around for another run and it gets sorted.
Related Questions
-
Functionality of SEOmoz crawl page reports
I am trying to find a way to ask SEOmoz staff this question because I think it is a functionality question, so I checked the SEOmoz Pro resources. I have also had no responses to it in the forum, so here it is again. Thanks very much for your consideration! Is it possible to configure the SEOmoz Rogerbot error-finding bot (which generates the crawl diagnostic reports) to obey the instructions in individual page headers and in the http://client.com/robots.txt file? For example, there is a page at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2007 that has, in the header, <meta name="robots" content="noindex">. This themed Quote of the Day page is intentionally duplicated at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2004 and also at http://truthbook.com/quotes/index.cfm?month=5&day=14&year=2010, but all three have <meta name="robots" content="noindex"> in them, so Google should not see them as duplicates, right? Google does not in Webmaster Tools. So the page should not be counted 3 times, but it seems to be. How do we generate a report of the actual pages shown as duplicates so we can check? We do not believe Google sees it as a duplicate page, but Roger appears to. Similarly, there is http://truthbook.com/contemplative_prayer/, where http://truthbook.com/robots.txt also tells Google to stay clear. Yet we are showing thousands of duplicate page content errors when Google Webmaster Tools shows only a few hundred, configured as described. Anyone? Jim
Moz Pro | | jimmyzig0 -
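If it helps to sanity-check the robots.txt side, Python's standard library can parse a robots file and report what a given user agent is allowed to fetch. This is a minimal sketch using a hypothetical Disallow rule standing in for truthbook.com's real robots.txt; whether Rogerbot actually honors such a rule is the question for Moz support.

```python
from urllib import robotparser

# Hypothetical robots.txt content; the site's real file may differ.
robots_lines = [
    "User-agent: *",
    "Disallow: /contemplative_prayer/",
]

rp = robotparser.RobotFileParser()
rp.parse(robots_lines)

# A well-behaved crawler should skip the disallowed path
# but remain free to fetch everything else.
print(rp.can_fetch("rogerbot", "http://truthbook.com/contemplative_prayer/"))
print(rp.can_fetch("rogerbot", "http://truthbook.com/quotes/"))
```

Running this against the live file (via `rp.set_url(...)` and `rp.read()`) would show what any robots.txt-respecting bot is supposed to do with those URLs.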
Crawl test from tools
Hi, I notice that the crawl test from the Research Tools doesn't actually run a new crawl, even though there are 2 crawls per day. It only provides the data already acquired from the crawl diagnostics in my Pro account. There is no point in my getting the same data I already get from my crawl diagnostics, is there? Even if SEOmoz provided more than 2 crawls per day, it would be useless in this case. This whole thing doesn't make sense, as the crawl diagnostics only perform a full crawl once every week, and the crawl test isn't helping me out either.
Moz Pro | | hanzoz0 -
Site Not Indexing & SEOMoz Reporting ZERO On-Page Report Crawls
Any help on this would be MUCH appreciated. One of my sites, aironeairsolutionsinc.com, has recently been rebuilt and its pages tweaked for some basic optimization. Based on my experience, those tweaks (geared toward keywords with relatively low competition locally) usually bump my local sites up into the top 20, or the top 30 at worst. Three weeks later, it seems my site is still not indexed by Google. In addition, I am noticing that the On-Page Reports in SEOmoz are not registering that any pages are being crawled. Again, any help from Moz staff would be awesome! :} Thanks, Ricky
Moz Pro | | RickyShockley0 -
54 new 404 errors on my website?
Hi there. In the latest report I have 54 404 errors, all from the last week; previously I had two 404s, which I fixed. The report says:
Title: 404 : Error
Meta Description: Traceback (most recent call last): File "build/bdist.linux-x86_64/egg/downpour/init.py", line 391, in _error failure.raiseException() File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException raise self.type, self.value, self.tb Error: 404 Not Found
Meta Robots: Not present/empty
Meta Refresh: Not present/empty
Are these normal 404 errors I have to look at and fix? Or is this some script running on my server and causing errors? In general, what should I do to fix this? Thanks, Dean
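One quick way to tell a genuine 404 from a crawler-side failure (the traceback above comes from the crawler's own Python stack, not your server) is to request each reported URL yourself and look at the status code the server actually returns. A minimal sketch using Python's standard library; the example URL is a placeholder, so substitute the URLs from your crawl report.

```python
import urllib.request
import urllib.error

def fetch_status(url, timeout=10):
    """Return the HTTP status code the server sends for url."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # 404s (and other 4xx/5xx) arrive as HTTPError; the code is real.
        return err.code

def is_real_404(status):
    """The server answering 404 is a page to fix; anything else is not."""
    return status == 404

if __name__ == "__main__":
    # Placeholder URL: replace with the URLs listed in the crawl report.
    for url in ["https://example.com/missing-page"]:
        print(url, fetch_status(url))
```

If the URLs come back 200 when you fetch them directly, the 404s are likely transient or on the crawler's side; if they come back 404, they are real broken links to repair or redirect.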
Moz Pro | | Passanger880 -
How does SEOMoz crawl sites? Does it follow the sitemap?
I only ask because my report returned a few "nofollow" links. I was wondering how SEOmoz operates.
Moz Pro | | Pikimal0 -
Sub-domain not crawled
One of our sites was recently redesigned. The home page is a landing page (www.labadieauto.com), and I moved the blog to this domain (labadieauto.com/blog/) and put a link in the bottom left of the home page. Since the change, the SEOmoz campaign overview is showing only 1 page crawled. This is not set up as a sub-domain, so why isn't it showing in the crawl? Help!
Moz Pro | | LabadieAuto0 -
How long does a crawl take?
A crawl of my site started on the 8th of July and is still going on. Is there something wrong?
Moz Pro | | Brian_Worger1 -
Ruling out subfolders in pro tool crawl
Is there a way to "rule out" a subfolder in the Pro dashboard site crawl? We're working on a site that has 500,000+ pages in the forums, but it's the CMS pages we're optimizing, and we don't want to spend the 10k limit on forum pages.
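If the dashboard itself doesn't offer an exclusion setting (worth confirming with support), the usual workaround is a robots.txt rule targeting Moz's crawler user agent, rogerbot, so other crawlers are unaffected. A hypothetical fragment, assuming the forum lives under a `/forums/` path; adjust the path to match the actual site structure, and confirm with Moz that Rogerbot honors per-agent rules.

```
# Hypothetical robots.txt rule: block only Moz's crawler from the
# forum subfolder while leaving Google and others untouched.
User-agent: rogerbot
Disallow: /forums/
```

That would keep the 10k page budget focused on the CMS pages rather than the half-million forum threads.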
Moz Pro | | DeepRipples0