Roger bot taking a long time to crawl site
-
Hi all, I've noticed Roger bot is taking a long time to crawl my new site. It started on 28th February 2013 and is still going. There aren't many pages at the moment. Any ideas please?
thanks a lot, Mark.
-
Hi Peter,
thanks for your reply. The crawl has now completed and given me some more areas to work on, it's a great tool.
I was so preoccupied with 'hiding' the site over the last couple of months with the easy code:
```
User-agent: *
Disallow: /
```
I hadn't thought beyond this.
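(A quick way to sanity-check what a blanket rule like that blocks is Python's built-in urllib.robotparser; the URL below is just a placeholder.)

```python
from urllib.robotparser import RobotFileParser

# The blanket "hide everything" rule from above.
rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Every crawler that obeys robots.txt is blocked from every URL.
for agent in ("rogerbot", "Googlebot"):
    print(agent, parser.can_fetch(agent, "http://www.example.com/any-page"))
# rogerbot False
# Googlebot False
```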
I've noticed Google has now recognised the new robots.txt, which has allowed the sitemap to be accepted.
I'll look at your notes, thank you, and work out my next move. I'll let you know how I get on too.
I know (well, think) I have to set noindex, follow on 'sorted' category pages...
all the best, Mark.
-
Hi Mike,
The crawl has now completed, thank you. I think the results will keep me occupied!
all the best, Mark.
-
Hi Mark,
Sorry it's taking a while to crawl your new site.
While I'm not exactly sure what's causing the delay, one possible culprit is your robots.txt. Here's a short snippet of what I see in it:
```
# Crawlers Setup
User-agent: *
Crawl-delay: 30

# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/
Allow: /media/

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /errors/
Disallow: /includes/
Disallow: /js/
Disallow: /lib/
Disallow: /magento/
Disallow: /pkginfo/
Disallow: /report/
```
One thing worth flagging right away: the Crawl-delay: 30 line asks crawlers that honor it to wait 30 seconds between requests, which on its own will make any crawl slow. Beyond that, the formatting looks a little awkward. What's going on is that you're telling Roger bot to look only at these:
```
# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/
Allow: /media/
```
While the syntax is OK, not every crawler out there will follow the Allow directive. Here's an example of something you can use:
```
# Crawlers Setup
User-agent: *
Crawl-delay: 30

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /errors/
Disallow: /includes/
Disallow: /js/
```
From here you're telling the crawler to disallow nothing except these directories. Please let us know once you implement this change whether it actually fixes the crawl. Thanks for reaching out!
Best,
Peter Li
SEOmoz Help Team
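P.S. If you want to double-check what a file like this permits, here's a rough sketch using Python's built-in urllib.robotparser; the paths below are made up for illustration.

```python
from urllib.robotparser import RobotFileParser

def parser_for(text):
    """Build a parser from an in-memory robots.txt string."""
    p = RobotFileParser()
    p.parse(text.splitlines())
    return p

# The Disallow-only approach suggested above (shortened).
robots = parser_for("""\
User-agent: *
Crawl-delay: 30
Disallow: /404/
Disallow: /app/
Disallow: /js/
""")

print(robots.can_fetch("rogerbot", "/index.php/blog/"))  # True: not listed
print(robots.can_fetch("rogerbot", "/app/etc/config"))   # False: under /app/

# crawl_delay() (Python 3.6+) reads the Crawl-delay value: a 30-second
# pause between requests is a likely reason the crawl feels slow.
print(robots.crawl_delay("rogerbot"))                    # 30
```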
-
Hi Mark,
This sounds like a bug or issue with the SEOmoz software.
Contact help@seomoz.org and ask one of the help associates to look into this for you.
If you do not have many pages, it definitely shouldn't take that long.
The help team responds extremely quickly!
Good luck.
Mike