Working out exactly how Google is crawling my site if I have loooots of pages
-
I am trying to work out exactly how Google is crawling my site, including its entry points and the path it takes from there. The site has millions of pages, with hundreds of thousands indexed. I have simple log files with a timestamp and the URL Googlebot requested. Unfortunately there are hundreds of thousands of entries even for a single day, and as it is a massive site I am finding it hard to work out the spider's paths. Is there any way, using the log files and Excel or other tools, to work this out simply?
Also, I was expecting the bot to go through each level almost instantaneously, e.g. main page --> category page --> subcategory page (so I expected the same timestamp on each), but this does not appear to be the case. Does the bot follow a path right through to the deepest level it can reach (or is allowed to) on that crawl, and then return to the higher-level category pages at a later time? Any help would be appreciated.
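To make the question concrete, here is the kind of analysis I have in mind - a minimal Python sketch that assumes each log line is just an ISO timestamp and a URL separated by a space (my real log format and file name may differ):

```python
from datetime import datetime
from urllib.parse import urlparse

def parse_line(line):
    # Assumed format: "2011-05-14T10:32:07 /category/subcategory/page"
    ts, url = line.strip().split(" ", 1)
    return datetime.fromisoformat(ts), url

def depth(url):
    # Crude depth measure: the number of non-empty path segments.
    return len([seg for seg in urlparse(url).path.split("/") if seg])

with open("googlebot.log") as f:
    hits = sorted(parse_line(line) for line in f if line.strip())

# One row per hit, ordered by time, to make it easy to see whether the
# bot walks main page -> category -> subcategory or hops between levels.
for ts, url in hits:
    print(ts.isoformat(), "depth", depth(url), url)
```

Even just scanning the depth column in time order should show whether the bot descends level by level or jumps around.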
Cheers
-
Can you explain how you built your sitemap for this, please?
-
I've run into the same issue on a site with 40k+ pages - nowhere near your overall page count, but the pattern is probably the same.
The site I was working on had a structure about 5 levels deep. Some of the areas on the last level were out of reach and didn't get indexed. More than that, even a few areas on level 2 were not present in the Google index, and Googlebot didn't visit those either.
I created a large XML sitemap and a dynamic HTML sitemap with all the pages on the site and submitted it via Webmaster Tools (the XML sitemap, that is), but that didn't solve the issue: the same areas stayed out of the index and didn't get hit. The huge HTML sitemap was impossible to follow from a user's point of view anyway, so I didn't keep it online for long, and I'm sure it can't work that way either.
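One practical note in case you keep the XML sitemap at your scale: the sitemap protocol allows at most 50,000 URLs per file, so a site with millions of pages needs many sitemap files tied together by a sitemap index, and the index is what you submit in Webmaster Tools. A rough sketch of the kind of generator I mean - the base URL and file names are placeholders:

```python
from xml.sax.saxutils import escape

SITEMAP_LIMIT = 50000  # the sitemap protocol caps each file at 50,000 URLs

def write_sitemaps(urls, base="https://www.example.com"):
    """Split a large URL list into numbered sitemap files plus one index."""
    chunks = [urls[i:i + SITEMAP_LIMIT] for i in range(0, len(urls), SITEMAP_LIMIT)]
    for n, chunk in enumerate(chunks, 1):
        with open(f"sitemap-{n}.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write('</urlset>\n')
    # The index file is the single URL you submit to the search engines.
    with open("sitemap-index.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in range(1, len(chunks) + 1):
            f.write(f"  <sitemap><loc>{base}/sitemap-{n}.xml</loc></sitemap>\n")
        f.write('</sitemapindex>\n')
```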
What finally solved the issue was to spot the exact areas that were being left out and identify the "heads" of those areas - the handful of pages that act as gateways to the entire module. I then built a few outside links pointing directly at those gateway pages, and a few more pointing at the main internal pages of the modules that were being skipped.
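Spotting the left-out areas doesn't have to be complicated: diff the full list of URLs from your CMS against the URLs Googlebot actually requested in the logs, then group the misses by section. A quick sketch, assuming two plain-text input files with one URL per line (file names made up):

```python
from collections import Counter
from urllib.parse import urlparse

# Assumed inputs: all-urls.txt (every URL on the site, e.g. exported
# from the CMS) and googlebot-urls.txt (URLs pulled from the access logs).
with open("all-urls.txt") as f:
    all_urls = {line.strip() for line in f if line.strip()}
with open("googlebot-urls.txt") as f:
    crawled = {line.strip() for line in f if line.strip()}

never_crawled = all_urls - crawled

def top_section(url):
    # First non-empty path segment, e.g. "/shop/lamps/1" -> "shop".
    segs = [s for s in urlparse(url).path.split("/") if s]
    return segs[0] if segs else "(root)"

# Group the misses by top-level section to reveal whole modules the
# bot is skipping, not just scattered individual pages.
sections = Counter(top_section(u) for u in never_crawled)
for section, count in sections.most_common(20):
    print(f"/{section}/: {count} uncrawled pages")
```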
Those pages gained authority fast, and within only a few days we spotted Googlebot staying overnight. All pages are now indexed and even ranking well.
If you can spot some entry pages that can lead the spider to the rest, you can try this approach - it should work for you too.
As far as links go, I started with social network links, a few posts with links on the site's blog (so internal links), and only a couple of outside links - articles with in-content links to those pages. Overall I think we are talking about 20-25 social network links (Twitter, Facebook, Digg, StumbleUpon and Delicious), about 10 blog posts published over a 2-3 day span, and about 10 articles in outside sources.
Since you have far more pages, you will probably need more gateways, and that means more links - but overall it's not a very time-consuming exercise and it can solve your issue... hopefully.
Related Questions
-
Site structure for location + services pages
We are in the process of restructuring our site and are trying to figure out Google's preference for location pages and services. Let's say we are an auto repair company with lots of locations, and each one of them offers some unique services, while other services are offered by all or most locations. Should we have one global page for each service, with links to the location pages of the shops that offer it? Or should we build a unique page about each service for every location, as a subfolder of each location (essentially creating a LOT of sub-pages, because each location has 15-20 services)? Which will rank better?
Intermediate & Advanced SEO | MJTrevens1
-
External resources page (AKA a satellite site) - is it a good idea?
So the general view on satellite sites is that they're not worth it because of their low authority and the small amount of link juice they pass. However, I have an idea that differs slightly from the standard satellite site model. A client's website is in a particular niche, but a lot of the websites I have identified for potential links are not interested in linking to it because it belongs to a private commercial company. Many are only interested in linking to charities or simple resource pages. I created a resource section on the website, but many are still unwilling to link to it, as it is still part of a commercial website. The website is performing well and is banging on the door of page one for some really competitive keywords; a few more links would make a massive difference. One idea I have is to create a standalone resource website that links to our client's website. It would be easy to get links to this from sites that would flat out refuse to link to the main website, which would increase the authority of the resource and result in more link juice flowing to the primary website. Now I know that the link juice from this website will not be as good as links pointing directly at the primary website, but would it still be a good idea? Or would my time be better spent trying to get a handful of links directly to the client's website? Alternatively, I could set up the resource on a sub-domain, but I'm not sure that this would be as successful.
Intermediate & Advanced SEO | maxweb0
-
Google Manual Penalty - Unnatural Links FROM My Site - Where?
Hi Mozzers, I've just received a manual penalty for one of my websites. The penalty is for 'unnatural links from my site', which I find disturbing because I can't see anything really wrong with it. The website is www.lighting-tips.co.uk - it's a pretty new blog (only 6-7 posts), and whilst I've allowed guest posting, I've been very careful that the content is relevant and good quality. I'm only allowing 1-2 links and very few with proper anchor text, so I'm wondering what I've done so wrong to earn this manual penalty. Am I missing something here? Thanks in advance. Aaron
Intermediate & Advanced SEO | AaronGro0
-
Google+ pages and SEO results...
Hi, can anyone give me insight into how people are getting away with naming their business after the SEO search term, creating a BS Google+ page, and then having that page rank high in the search results? I am speaking specifically about the results you get when you Google "Los Angeles DUI Lawyer". As you can see from my attached screenshot (I'm doing the search in Los Angeles), the FIRST listing is a Google+ business. Strangely, the phone number listed doesn't actually take you to a DUI attorney, but rather to some marketing group that never answers the phone. Can anyone give me insight into why Google even allows this? I just find it odd that Google cares so much about the user experience, yet the first result is something completely misleading. I know it sounds like I'm just jealous (which I am, a little), but I find it disheartening that we work so hard on SEO, and someone takes the top spot with an obvious BS page.
Intermediate & Advanced SEO | mrodriguez14400
-
Do search engines crawl links on 404 pages?
I'm currently in the process of redesigning my site's 404 page. I know there are all sorts of best practices from a UX standpoint, but what about search engines? Since these pages are roadblocks in the crawl process, I was wondering if there's a way to help the search engine continue its crawl. Does putting links to "recent posts" or something along those lines allow the bot to continue on its way, or does the crawl stop at that point because the 404 HTTP status code is returned in the header response?
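As a sanity check, a few lines of stdlib Python confirm whether a custom error template really returns a 404 status rather than a "soft 404" (a 200 with an error-page body); the URL here is a placeholder:

```python
from urllib.error import HTTPError
from urllib.request import urlopen

# Request a URL that should not exist and inspect the real status code.
try:
    resp = urlopen("https://www.example.com/this-page-should-not-exist")
    print("Got", resp.status, "- looks like a soft 404")
except HTTPError as e:
    print("Got", e.code)  # a proper 404 lands here
```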
Intermediate & Advanced SEO | brad-causes0
-
Urgent Site Migration Help: 301 redirect from legacy to new if legacy pages are NOT indexed but have links and domain/page authority of 50+?
Sorry for the long title, but that's the whole question. Notes:
- New site is on the same domain, but URLs will change because the old URL structure was horrible
- Old site has awful SEO. Like, real bad.
- Canonical tags point to a dev subdomain (which is still accessible and has a robots.txt, so the end result is that the old site IS NOT INDEXED by Google)
- Old site has links and domain/page authority north of 50. I suspect some shady links, but there have to be good links as well.
My guess is that since there are likely some legitimate incoming links, I should still attempt to 301 to the corresponding pages on the new site (note: the content on the new site will be different, but in general each page will be about the same thing as the old one, just much improved and more relevant). So yeah, I guess that's it. Even though the old site's pages are not indexed, if the new site is set up properly the 301s won't pass along the 'non-indexed' status, correct? Thanks in advance for any quick answers!
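To illustrate the behaviour meant by the 301s, here is a toy sketch (stdlib Python, made-up URLs; in reality the rules would live in the web server or application config):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical old-URL -> new-URL map; the real one would come from
# matching each legacy page to its improved equivalent on the new site.
REDIRECTS = {
    "/old/horrible/url-structure": "/new/clean/url",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            # 301 marks the move as permanent, so equity from legitimate
            # incoming links to the legacy URL should be passed along.
            self.send_response(301)
            self.send_header("Location", target)
        else:
            self.send_response(404)
        self.end_headers()

HTTPServer(("", 8000), RedirectHandler).serve_forever()
```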
Intermediate & Advanced SEO | JDMcNamara0
-
When I try creating a sitemap, it doesn't crawl my entire site.
We just launched a new Ruby app (it used to be a WordPress blog) at http://www.thesquarefoot.com. We have not had time to create an auto-generated sitemap, so I went to a few different websites with free sitemap generation tools. Most of them index up to 100 or 500 URLs. Our site has over 1,000 individual listings and 3 landing pages, so when I put our URL into a sitemap creator it should find all of these pages. However, that is not happening; only 4 pages seem to be seen by the crawlers:
http://www.thesquarefoot.com/
http://www.thesquarefoot.com/users/sign_in
http://www.thesquarefoot.com/search
http://www.thesquarefoot.com/renters/sign_up
This worries me that when Google comes to crawl our site, these are the only pages it will see as well. Our robots.txt is blank, so there should be nothing stopping the crawlers from going through the entire site. Here is an example of one of the thousands of pages not being crawled: http://www.thesquarefoot.com/listings/Houston/TX/77098/Central_Houston/3910_Kirby_Dr/Suite_204
Any help would be much appreciated!
Intermediate & Advanced SEO | TheSquareFoot0
-
How can we get a site reconsidered for Google indexing?
We recently completed a redesign for a site and are having trouble getting it indexed. The site may have been penalized previously - they were having issues getting it ranked, and the old design was horrible. Any advice on how to get the new site reconsidered so the rankings get to where they should be? (Yes, Webmaster Tools is all set up with the sitemap linked.) Many thanks for any help with this one!
Intermediate & Advanced SEO | d25kart0