I ran a crawl on my site via SEOmoz and it has come back saying only 1 page has been crawled. It has been over a week now; can anyone help?
-
Crawling the pages of a site
-
Thanks Keri,
Very much appreciated...
-
Hi Jay,
Here's a page the help desk wrote about why your site may only have one page crawled. Your best bet is to read that, then if you're still having problems, open up a ticket with the help desk either via the web or by sending an email to help@seomoz.org.
http://seomoz.zendesk.com/entries/409821-why-isn-t-my-site-being-crawled-you-only-crawled-one-page
Thanks!
Related Questions
-
Block Moz (or any other robot) from crawling pages with specific URLs
Hello! Moz reports that my site has around 380 duplicate content pages. Most of them come from dynamically generated URLs that have some specific parameters. I have sorted this out for Google in Webmaster Tools (the new Google Search Console) by blocking the pages with these parameters. However, Moz is still reporting the same number of duplicate content pages and, to stop it, I know I must use robots.txt. The trick is that I don't want to block every page, just the pages with specific parameters. I want to do this because among these 380 pages there are some other pages with no parameters (or different parameters) that I need to take care of. Basically, I need to clean this list to be able to use the feature properly in the future. I have read through the Moz forums and found a few topics related to this, but there is no clear answer on how to block crawling of only pages with specific URLs. Therefore, I have done my research and come up with these lines for robots.txt:

User-agent: dotbot
Disallow: /*numberOfStars=0

User-agent: rogerbot
Disallow: /*numberOfStars=0

My questions: 1. Are the above lines correct, and would they block Moz (dotbot and rogerbot) from crawling only pages that have the numberOfStars=0 parameter in their URLs, leaving other pages intact? 2. Do I need to have an empty line between the two groups (I mean between "Disallow: /*numberOfStars=0" and "User-agent: rogerbot"), or does it even matter? I think this would help many people, as there is no clear answer on how to block crawling of only pages with specific URLs. Moreover, this should be valid for any robot out there. Thank you for your help!
Moz Pro | | Blacktie0 -
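One way to sanity-check a wildcard Disallow rule like the one above is to emulate Google-style robots.txt pattern matching, where `*` matches any run of characters and a trailing `$` anchors the end of the URL. The sketch below is a minimal illustration, not a real crawler: the function name and the `/hotels` example paths are hypothetical, and actual bots may differ in edge cases.

```python
import re

def robots_pattern_blocks(pattern: str, url_path: str) -> bool:
    """Return True if a Google-style Disallow pattern matches the URL path.

    '*' matches any character run; a trailing '$' anchors the end of the URL.
    """
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    # Translate the robots pattern into a regex anchored at the path start.
    regex = "^" + "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    if anchored:
        regex += "$"
    return re.search(regex, url_path) is not None

# The rule from the question blocks only URLs carrying numberOfStars=0.
rule = "/*numberOfStars=0"
print(robots_pattern_blocks(rule, "/hotels?numberOfStars=0"))  # True: blocked
print(robots_pattern_blocks(rule, "/hotels?numberOfStars=4"))  # False: still crawled
print(robots_pattern_blocks(rule, "/hotels"))                  # False: still crawled
```

Under this matching model, only URLs containing `numberOfStars=0` are disallowed and everything else stays crawlable, which is the behavior the question is after.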
2 canonical links on 1 page, 1 for print version
Our developer has added a second canonical link for the "print" version of our page. I read in another post that this appears not to be the correct way to do this. Is there a better way? Here is an example of the code:
Moz Pro | | foodsleuth0 -
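For reference, the usual approach to the print-version question above is for each page to carry exactly one canonical tag, with the print URL pointing back at the primary version rather than declaring a second canonical. A minimal sketch, using hypothetical example.com URLs:

```html
<!-- On the print version (e.g. https://www.example.com/article/print),
     a single canonical pointing at the primary page: -->
<link rel="canonical" href="https://www.example.com/article" />
```

When a page lists two canonical links, search engines may ignore both, so consolidating to one tag per page is the safer pattern.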
How can a site not indexed on Google still have 40+ Page Authority on Open Site Explorer?
Hey guys, I was revisiting the backlinks to my site and found a few that were not indexed on Google anymore. I confirmed this by typing site:"SiteAddress" into Google, and it returned 0 results. Yet when I searched the same site on OSE (Open Site Explorer), it yielded a PA of over 40. I used to research sites on OSE to see if they were worth pursuing for a backlink, but I am second-guessing this because of this recent finding. Can someone please shed some light on this? Thanks!
Moz Pro | | MH-Seonoob0 -
Can I add another user to my SEOMoz Pro campaign?
I have a couple of campaigns on my SEOmoz Pro account and I would like to give access to one of these campaigns to an SEO consultant. Is that possible, or do I need to give access to my full SEOmoz account?
Moz Pro | | bernardovailati3 -
When will the 250-page crawl limit be eliminated?
Hi, I signed up yesterday for an SEOmoz Pro account and would like to know, please, when the 250-page crawl limit will be eliminated? 🙂 Thanks in advance for your help!
Moz Pro | | Andarilho0 -
How do I find the corresponding duplicate content pages from my SEOmoz report?
Once I have run my report and the duplicate content pages come up, is there a way to find out which pages have the duplicate content on them? I have one URL, but where can I find the duplicate content that corresponds to it? Thanks, Barry
Moz Pro | | MrBarrytg0 -
SEOmoz on-page analysis: how strict to be?
Hello, In a competitive niche, how important is it to be strict with the SEOmoz on-page analysis? If it gives a page/keyword an A, am I good to go, or do I need to be stricter than that? We've had some competitors move above us and we want to make sure our on-site optimization is strong. site: nlpca(dot)com Thanks.
Moz Pro | | BobGW0