I ran a crawl on my site via SEOmoz and it has come back saying only 1 page has been crawled. It has been over a week now. Can anyone help?
-
Crawling the pages of a site
-
Thanks Keri,
Very much appreciated...
-
Hi Jay,
Here's a page the help desk wrote about why your site may only have one page crawled. Your best bet is to read that, then if you're still having problems, open up a ticket with the help desk either via the web or by sending an email to help@seomoz.org.
http://seomoz.zendesk.com/entries/409821-why-isn-t-my-site-being-crawled-you-only-crawled-one-page
Thanks!
Related Questions
-
Why did Moz crawl our development site?
In our Moz Pro account we have one campaign set up to track our main domain. This week Moz threw up around 400 new crawl errors, 99% of which were meta noindex issues. What happened was that somehow Moz found the development/staging site and decided to crawl that. I have no idea how it was able to do this - the robots.txt is set to disallow all and there is password protection on the site. It looks like Moz ignored the robots.txt, but I still don't have any idea how it was able to do a crawl - it should have received a 401 Unauthorized and not gone any further. How do I a) clean this up without going through and manually ignoring each issue, and b) stop this from happening again? Thanks!
Moz Pro | MultiTimeMachine
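For reference, a disallow-all robots.txt should stop any well-behaved crawler that honors it, including rogerbot. A minimal sketch using Python's standard-library robots.txt parser (the staging hostname is hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A disallow-all robots.txt, as described in the question above
rules = """User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A crawler that honors these rules should refuse to fetch any URL
blocked = not rp.can_fetch("rogerbot", "https://staging.example.com/any-page")
print(blocked)  # True
```

If the staging site is nonetheless showing up in reports, it is worth checking that the robots.txt is actually being served (not blocked by the password protection itself) and that the campaign settings don't include the staging subdomain.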
How Can I Solve a Spam Score Of 33% On A Site
Good day. I'm seeing a spam score of 33% on my blog, Naijazeal.com. I checked in Google Webmaster Tools and analyzed all the links there, and I see a few with a high spam score. But when I check my spam score on Moz, it shows external links, and most of the links it shows are not in my Search Console. Please help.
Moz Pro | DAMIDEZ
Site Crawl 4xx Errors?
Hello! When I check our website's critical crawler issues with Moz Site Crawler, I'm seeing over 1000 pages with a 4xx error. All of the pages that are showing to have a 4xx error appear to be the brand and product pages we have on our website, but with /URL at the end of each permalink. For example, we have a page on our site for a brand called Davinci. The URL is https://kannakart.com/davinci/. In the site crawler, I'm seeing the 4xx for this URL: https://kannakart.com/davinci/URL. Could this be a plugin on our site that is generating these URLs? If they're going to be an issue, I'd like to remove them. However, I'm not sure exactly where to begin. Thanks in advance for the help, -Andrew
Moz Pro | mostcg
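One common cause of this pattern is a template or plugin that leaves a literal, unfilled placeholder such as href="URL" in the page markup; crawlers then resolve that relative link against the page it appears on. A small sketch of how that resolution works, using the URL from the question:

```python
from urllib.parse import urljoin

# If a template leaves a literal placeholder like href="URL",
# a crawler resolves it relative to the page it appears on:
base = "https://kannakart.com/davinci/"
link = urljoin(base, "URL")
print(link)  # https://kannakart.com/davinci/URL
```

Searching the page source (or the theme/plugin templates) for `href="URL"` would be one way to confirm or rule out this cause; this is an assumption about the mechanism, not a confirmed diagnosis.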
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication.

I am in the process of re-setting up my campaigns to get better data than I have been getting. I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set whether www or not counts for rankings, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what all the options are for getting good data in light of the 50,000-page crawl limit.

Here is an example of what I mean. To see how EPA is doing in searches related to air quality, ideally I'd track all of EPA's web presence. www.epa.gov has 560,000 pages. If I put in www.epa.gov for a campaign, what happens with the site having so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get?

www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely and I'll get a complete picture of it. But (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I'd have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true: I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air quality sites, and maybe those competitors count toward the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?)

Any opinions on what I should do in general in this kind of situation? The small sub-folder vs. the full humongous site, or is there some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad
Only a few pages (308 of 1,000+) have been crawled and diagnosed in 4 days. How many days until the entire website is crawled completely?
I set up the campaign about 4-5 days ago, and yesterday rogerbot said 308 pages were crawled and the diagnostics were provided. This website has over 1,000 pages, and I would like to know how long it would take for Roger to crawl the entire website and provide diagnostics. Thanks!
Moz Pro | TejaswiNaidu
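A rough back-of-the-envelope estimate from the numbers in the question, assuming the crawl rate stays constant (which it may not; crawl speed varies):

```python
# Rough estimate only, assuming a constant crawl rate
pages_crawled = 308
days_elapsed = 4
total_pages = 1000

rate = pages_crawled / days_elapsed   # ~77 pages per day
estimated_days = total_pages / rate   # total days for the full site
print(round(estimated_days))  # 13
```

So at the observed pace, the full crawl would take roughly two weeks; the actual schedule depends on how Moz batches its weekly crawls.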
Has SEOMoz considered making it possible to follow a specific Q & A thread via Twitter and/or Email?
I think it would be great if there was a way to follow a particular Q & A thread without having to post a comment. There are so many great topics and discussions in here and sometimes I lose track of some that I'd like to continue following as people post, without having to post in the thread myself. Have you guys (Mozzers) ever considered adding a "Follow this thread on Twitter" or "Follow this thread by email" function? Or is there already a way to do this (without posting), that I just haven't found?
Moz Pro | | danatanseo1 -
How do I find the corresponding duplicate content pages from my SEOmoz report?
Once I have run my report and the duplicate content pages come up, is there a way to find out which pages have the duplicate content on them? I have one URL, but where can I find the duplicate content that corresponds to it? Thanks, Barry
Moz Pro | MrBarrytg
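Conceptually, a crawler flags duplicate content by grouping pages whose bodies match, so every URL in a group is a "corresponding" duplicate of the others. A minimal sketch of that grouping idea, with hypothetical URLs and page bodies (not how Moz's report is actually built):

```python
import hashlib

# Hypothetical crawled pages: URL -> page body text
pages = {
    "/widgets": "same body text",
    "/widgets?sort=price": "same body text",
    "/about": "unique body text",
}

# Group URLs by a hash of their body; identical bodies share a hash
groups = {}
for url, body in pages.items():
    digest = hashlib.md5(body.encode()).hexdigest()
    groups.setdefault(digest, []).append(url)

# Groups with more than one URL form a duplicate-content set
duplicates = [urls for urls in groups.values() if len(urls) > 1]
print(duplicates)  # [['/widgets', '/widgets?sort=price']]
```

In other words, the URL flagged in the report belongs to such a set, and the other members of the set are the pages that duplicate it.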
Why is the SEOmoz crawler crawling the old version of our website?
Hello, I'm a new SEOmoz member. On Dec. 2nd, after completely redesigning our website, we migrated to a new hosting company by switching our DNS to the new server. The vast majority of the URLs have changed, and we configured redirects of the old URLs to the new ones, although this task is not completed yet. After the migration, I created an account on SEOmoz to be able to track our progress and find the issues to fix to optimize our SEO. For some reason, it is the old URLs that show up in the SEOmoz reports. Unless the crawler does not actually crawl the pages and only uses the indexed pages to generate its report, I don't understand how this could be possible. Does anyone have a clue? When will the new URLs be indexed by SEOmoz and the major search engines? Thanks for your help!
Moz Pro | Gestisoft-Qc
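For the redirect work mentioned in the question, one common approach is to keep a single mapping of old to new paths and generate server redirect rules from it. A small sketch with hypothetical URLs, emitting Apache mod_alias `Redirect 301` lines (other servers have equivalent directives):

```python
# Hypothetical old -> new URL mapping from the site redesign
redirects = {
    "/old-services.html": "/services/",
    "/old-contact.html": "/contact/",
}

# Emit one Apache mod_alias rule per mapping; a 301 tells crawlers
# the move is permanent, so indexes update to the new URLs over time
rules = [f"Redirect 301 {old} {new}" for old, new in redirects.items()]
for rule in rules:
    print(rule)
```

Once the 301s are complete, crawlers (SEOmoz's and the search engines') will follow them and gradually replace the old URLs with the new ones on subsequent crawls.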