Waiting 3 days for Crawl Test to complete
-
Being new to SEOmoz, I'm not sure I understand the Crawl Test completely. You set up a campaign, enter all your info, rogerbot goes out and crawls your site, and the results show what you're doing right, what is wrong, and what could use a closer look.
So once I get my results, I make edits to my site pages. In my case I'm getting lots of duplicate content and duplicate titles, so I go back, make adjustments, and then submit a crawl test to see the results of the changes.
In other tools I've used in the past, I was able to re-run the crawl immediately and fine-tune results on the fly. The SEOmoz crawl test is still pending after three days. Is this normal? Or is there another way to make changes and run reports to see results instantly?
If you're working on many sites and making changes, having to wait three or more days to see how your changes were received seems like a long time.
-
Hey everyone! Sorry about the odd crawl test activity. If you don't mind shooting us an email at help@seomoz.org with your PRO email address and the name of the domain that is stuck, we can push them through on the back end. I would grab more information here, but we can't since it is a public forum. :S
Looking forward to your emails, and sorry about the wait!
-
You could use a program that crawls your site for the same purpose. There are lots of free apps out there.
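If you want to roll your own check rather than use an app, the duplicate-title part is simple to script. Below is a minimal sketch using only the Python standard library; the `pages` dict is hypothetical sample data standing in for HTML you have already fetched from your own site.

```python
from html.parser import HTMLParser
from collections import defaultdict

class TitleParser(HTMLParser):
    """Collects the text inside the page's <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def find_duplicate_titles(pages):
    """Map each <title> to the URLs that use it; keep only duplicates."""
    by_title = defaultdict(list)
    for url, html in pages.items():
        parser = TitleParser()
        parser.feed(html)
        by_title[parser.title.strip()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical sample pages standing in for fetched HTML.
pages = {
    "/a": "<html><head><title>Widgets</title></head><body>A</body></html>",
    "/b": "<html><head><title>Widgets</title></head><body>B</body></html>",
    "/c": "<html><head><title>Gadgets</title></head><body>C</body></html>",
}
print(find_duplicate_titles(pages))  # {'Widgets': ['/a', '/b']}
```

Point it at the HTML of your own 50-odd pages and you get an instant duplicate-title report, no multi-day wait required.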
-
Thanks alsvik,
I'm kind of surprised as well by how long it takes. My site has maybe 50 pages, so 3-4 days seems like a long time to wait just to see if you cleared out all the title and duplicate content issues or any other errors. Pretty disappointing to have to wait that long. If one issue slips through the cracks, it's make a correction and then wait another week to test and see the results. I just can't see how this is practical in today's fast-paced, constantly evolving SEO world.
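For a 50-page site, the duplicate-content side of the report can also be spot-checked locally in seconds. One simple approach is to strip the markup, normalize whitespace, and hash the remaining text; pages sharing a hash are exact duplicates. A minimal sketch, again with hypothetical inline HTML standing in for fetched pages:

```python
import hashlib
import re
from collections import defaultdict

def content_fingerprint(html):
    """Strip tags, collapse whitespace, and hash the remaining text."""
    text = re.sub(r"<[^>]+>", " ", html)           # crude tag removal
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def find_duplicate_content(pages):
    """Group URLs whose visible text hashes to the same fingerprint."""
    by_hash = defaultdict(list)
    for url, html in pages.items():
        by_hash[content_fingerprint(html)].append(url)
    return [urls for urls in by_hash.values() if len(urls) > 1]

# Hypothetical sample pages.
pages = {
    "/page-1": "<html><body><p>Same   text here.</p></body></html>",
    "/page-2": "<html><body><p>Same text here.</p></body></html>",
    "/page-3": "<html><body><p>Different text.</p></body></html>",
}
print(find_duplicate_content(pages))  # [['/page-1', '/page-2']]
```

Note this only catches exact (post-normalization) duplicates; near-duplicates, which crawlers like rogerbot may also flag, would need fuzzier matching.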
-
Hi Anthony.
It surprised me as well that it takes 3-4 days to crawl my site, but I have 10,000+ pages, so I guess it's OK.
SEOmoz is crawling at a very low speed.
Actually, SEOmoz crawls your site every week if you set up a campaign. Mine does! Every Friday or Saturday I get a crawl report.
As for the timeframe, try using Google Webmaster Tools (GWT) for meta description and title errors. GWT is updated monthly in terms of pages with identical titles/meta descriptions. And if you make changes, you shouldn't expect to see them have an impact on the SERPs within the first month or two ...