Getting More Pages Indexed
-
We have a large e-commerce site (Magento-based) and have submitted sitemap files for several million pages within Webmaster Tools. The number of indexed pages fluctuates, but currently fewer than 300,000 pages are indexed out of 4 million submitted. How can we get the number of indexed pages higher? Changing the crawl-rate settings and resubmitting sitemaps doesn't seem to have any effect on the number of pages indexed.
Am I correct in assuming that most individual product pages just don't carry enough link juice yet to be considered important enough by Google to be indexed? Let me know if there are any suggestions or tips for getting more pages indexed.
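As a first diagnostic, it can help to confirm what the sitemap files actually contain before worrying about why Google isn't indexing them. Below is a minimal sketch in Python; the sitemap content is a made-up example, but real Magento sitemaps follow the same sitemaps.org schema, so the same element names apply:

```python
import xml.etree.ElementTree as ET

def count_sitemap_urls(xml_text):
    """Count <url> entries in a standard sitemaps.org sitemap."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return len(root.findall("sm:url", ns))

# Hypothetical two-entry sitemap for illustration
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/product-1</loc></url>
  <url><loc>https://example.com/product-2</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # 2
```

Running this across each submitted sitemap file and summing the counts confirms the 4 million figure really reflects what Google was given.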
-
I think that is what did it! lol
-
Yes, you will need internal links to establish your site navigation, and then external links if you don't have enough PR flowing from within your site.
Some powerful sites can support millions of pages with internal links alone. If you have a site like that, congratulations!
-
Thanks, this is helpful. I will work on funneling spiders. I assume I'll need a healthy dose of both internal and external links pointing deep into the site in order to get the spiders to start chewing in there? Thanks.
-
Thanks! You enjoyed the chewing spiders?
-
Wow... that was a powerful answer, EGOL. Thanks for adding the long answer. The way you worded it created a visualization that was very helpful to my understanding as a novice. Thanks.
-
The short answer...
Link deep into the site at multiple points with heavy PR.
The long answer...
If you have a really big site, you need a lot of link juice to get the entire site indexed and keep it in the index. You also need a good site structure so that spiders can crawl through the site and find every page.
If you have several million pages, my guess is that you will need hundreds of links of at least PR5 or PR6 linking into the site. I would direct each of those links to a deep category page. That will funnel the spiders deep into your site and force them to chew their way out while indexing your pages.
All of those links must be held permanently in place, because if you pull them the flow of spiders will stop, and Google will slowly forget about pages that are not visited by spiders on a regular basis.
If you have weak links, or not enough links, your site will not be thoroughly crawled, and Google will forget about your pages as fast as they are discovered.
Big sites require a PR resource.
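EGOL's point about structure and deep links can be made concrete with a toy click-depth calculation; the site graph and page names below are hypothetical. A breadth-first search from the pages spiders enter through gives each page's minimum click depth, and adding a deep entry point (the kind of deep external link described above) pulls buried pages much closer to the crawl frontier:

```python
from collections import deque

def crawl_depths(graph, entry_points):
    """BFS from the entry pages; returns each page's minimum click depth."""
    depths = {page: 0 for page in entry_points}
    queue = deque(entry_points)
    while queue:
        page = queue.popleft()
        for linked in graph.get(page, []):
            if linked not in depths:
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

# Hypothetical structure: home -> category -> subcategory -> product
graph = {
    "home": ["cat"],
    "cat": ["subcat"],
    "subcat": ["product"],
}

print(crawl_depths(graph, ["home"]))            # product sits 3 clicks deep
print(crawl_depths(graph, ["home", "subcat"]))  # a deep link pulls it up to depth 1
```

On a site with millions of pages the same effect is what the deep category-page links above are buying you: every deep entry point shortens the path spiders must chew through to reach the long tail.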
Related Questions
-
Use Internal Search pages as Landing Pages?
Hi all. Just a general discussion question about internal search pages and using them for SEO. I've been looking at "noindex, follow"-ing them, but a lot of the search pages are actually driving significant traffic and revenue. I have over 9,000 search pages indexed that I was going to remove, but after reading this article (https://www.oncrawl.com/technical-seo/seo-internal-search-results/) I was wondering if any of you have had success using these pages for SEO, for example with auto-generated content. Any success stories about using "noindex, follow" are welcome too. Thanks!
Technical SEO | Frankie-BTDublin -
How do I prevent duplicate page title errors from being generated by my multiple shop pages?
Our e-commerce shop has numerous pages within the main shop page. Users navigate through the shop via typical pagination, so while there may be six pages of products, it's all still under the main shop page. Moz keeps flagging my shop pages as having duplicate titles (e.g. shop page 2), but they're all the same page. Users aren't loading unique pages each time they go to the next page of products, and they aren't pages I can edit. I'm not sure how to prevent this issue from popping up in my reports.
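One common way to quiet these warnings is to give each paginated page a distinct title, and (in this era) rel="prev"/"next" annotations so search engines understand the pages form a sequence. The markup below is a hypothetical sketch for page 2 of a shop; the domain and URL parameter are made up, and the exact mechanics depend on what your platform lets you template:

```html
<!-- In the <head> of page 2 of the shop listing -->
<title>Shop – Page 2 | Example Store</title>
<link rel="prev" href="https://www.example.com/shop?p=1" />
<link rel="next" href="https://www.example.com/shop?p=3" />
```

Even when the page template can't be edited directly, most e-commerce platforms expose the page number to the title template, which is all a unique title needs.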
Technical SEO | NiteSkirm -
3,511 Pages Indexed and 3,331 Pages Blocked by Robots
Morning. So I checked our site's index status in WMT, and I'm being told that Google is indexing 3,511 pages and the robots are blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries:
1. Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages? Or does it mean that it's blocking 3,331 pages of the 3,511 indexed?
2. As there are only 24 URLs being disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted?
3. Currently, we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once submitted, will this help?
4. I think I know the answer to this, but is there any way to ascertain which pages are being blocked?
Thanks in advance! Lewis
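On the "only 24 rules" puzzle: a Disallow line matches every URL that starts with that prefix, so 24 rules can easily cover thousands of URLs. You can check specific URLs against your rules locally with Python's standard-library robots.txt parser; the rules below are made-up stand-ins for the real file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules; substitute the lines from the real robots.txt
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /checkout/",
    "Disallow: /search",
])

print(rp.can_fetch("Googlebot", "https://example.com/checkout/step1"))   # False
print(rp.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
```

Running candidate URLs (category filters, session-ID variants, and so on) through this is a quick way to see which patterns a single short rule is sweeping up.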
Technical SEO | PeaSoupDigital -
Mobile site not getting indexed
My site is www.findyogi.com, a shopping comparison site. The mobile site is hosted at m.findyogi.com. I fixed my sitemap and the attribution to the mobile site in the last week of May, and my mobile site pages have been getting de-indexed since then. Website - www.findyogi.com/mobiles/motorola/motorola-moto-g-16gb-b95ef8/price - indexed. Mobile - m.findyogi.com/mobiles/motorola/motorola-moto-g-16gb-b95ef8/price - not indexed. Google is crawling my website and mobile site normally. What am I doing wrong?
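For reference, Google's guidance for separate mobile URLs at the time was a bidirectional annotation: rel="alternate" on the desktop page pointing to the mobile URL, and rel="canonical" on the mobile page pointing back. A sketch using the URLs from the question (the max-width value is the commonly cited example from Google's documentation, not a requirement); if the attribution that was "fixed" doesn't match this pattern on both sides, that's the first thing to check:

```html
<!-- In the <head> of the desktop page (www.findyogi.com/...): -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="http://m.findyogi.com/mobiles/motorola/motorola-moto-g-16gb-b95ef8/price" />

<!-- In the <head> of the mobile page (m.findyogi.com/...): -->
<link rel="canonical"
      href="http://www.findyogi.com/mobiles/motorola/motorola-moto-g-16gb-b95ef8/price" />
```

Note that with this pattern the mobile URLs dropping out of the index is partly expected behavior: the canonical consolidates them under the desktop URL, and Google serves the mobile URL to mobile searchers from that annotation.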
Technical SEO | namansr -
The number of pages indexed on Bing DROPPED significantly.
I haven't signed in to Bing Webmaster Tools for a while, and I found that Bing is not indexing my site properly all of a sudden. IT DROPPED SIGNIFICANTLY. Any idea why it is behaving this way? (Please check the attachment.) INg1o.png
Technical SEO | joony2008 -
Getting Rid of Duplicate Page Titles After URL Structure Change
I've had all sorts of issues with Google since they just dropped us on our head a few weeks ago. Google is crawling again after I made some changes, but they're still not ranking our content like they were, so I have a few questions. I changed our URL structure from /year/month/date/post-title to just /post-title and 301 redirected the old link structure to the new. When I look, I see over 3,000 duplicate title errors listing both versions of the URL.
1. How do I get Google to crawl the old URL structure, recognize the 301 redirect, and update the index?
2. Google is crawling the site again, but they're not ranking us like they were before. We're in a highly competitive category and I'm aware of that, but we've always been an authority in our niche. We have plenty of quality backlinks, and often we're the originators of content which is then rewritten by a trillion websites everywhere. We're not the best at writing and titles, but we're working on it, and this did not matter much to Google previously, as it was ranking us pretty highly on the front page and certainly over many sites that are ranking above us today. Some backlinks: http://www.alexa.com/site/linksin/dajaz1.com. A few examples: if you google "twista gucci louis prada" you'll see many of the sites who trackbacked to us since we premiered the song rank much higher than us; 3 weeks ago we were ranking above them. http://dajaz1.com/twista-gucci-louis-prada/ Google "jadakiss consignment mixtape": 3 weeks ago we were ranking higher than all 4 sites ranking above us. The sites ranking above us even link to us or mention us, yet they rank above us now. Original content here: http://dajaz1.com/watch-jadakiss-confirms-cosignment-mixtape-2012-schedule/ I could throw out a ton of examples like this. How do we get Google to rank us again? It should be noted that I'm not using any SEO plugins on the site. I hand-coded what's in there, and I know I can probably do it better, so any tips or ideas are welcome.
I'm pretty sure that our issues were caused by the Yoast SEO plugin, as when I search site:dajaz1.com the pages and topics that display were all indexed while the plugin was active. I've since removed it and all calls to it in the database, but I'm pretty nervous about plugins right now. Which brings me to my third and final question: how do I get rid of the category and topic pages that were indexed and seem to be ranking higher than the rest of our content? I lied, one more: for category URLs I've set it to remove the category base, so the URL is dajaz1.com/news or dajaz1.com/music. Is that preferable, or is this causing me issues? Any feedback is appreciated. Also, Google is crawling again (see attached image), but the kilobytes downloaded per day hasn't recovered. Should I be concerned about this? Gd9i6
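On the first question: the 301s get picked up as Google recrawls the old URLs, which simply takes time; beyond keeping the redirects permanently in place and submitting a sitemap of the new URLs, there's no way to force it. The rewrite described maps cleanly to a one-line regex; a sketch below, where the /YYYY/MM/DD/ date format is assumed from the /year/month/date/post-title description:

```python
import re

# Strip a leading /YYYY/MM/DD/ date prefix, leaving /post-title
DATE_PREFIX = re.compile(r"^/\d{4}/\d{2}/\d{2}/")

def new_url(old_path):
    """Map the old /year/month/date/post-title path to /post-title."""
    return DATE_PREFIX.sub("/", old_path)

print(new_url("/2012/05/14/twista-gucci-louis-prada/"))
# -> /twista-gucci-louis-prada/
```

The same pattern is what the server-side redirect rule needs to implement; checking that a handful of old URLs run through it produce the exact live new URLs (no trailing-slash or case mismatches) rules out redirect chains as a cause of the duplicate-title reports.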
Technical SEO | malady -
HTTPS pages indexed though a "noindex, nofollow" tag has been added
Hi, The HTTPS pages of our booking section are being indexed by Google. We added a "noindex, nofollow" tag, but the pages are still being indexed. What can I do to exclude these URLs from the Google index? Thank you very much in advance! Kind regards, Dennis Overbeek, ACSI Publishing | dennis@acsi.eu
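For reference, the tag described in the title looks like the fragment below. Two common reasons it appears not to work: the page is also blocked in robots.txt, so Googlebot never fetches it and never sees the tag (the page must stay crawlable for noindex to take effect), and already-indexed URLs only drop out after being recrawled, which can take weeks:

```html
<!-- In the <head> of each HTTPS booking page to be excluded -->
<meta name="robots" content="noindex, nofollow">
```

An equivalent option is sending the directive as an X-Robots-Tag HTTP response header, which also works for non-HTML resources.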
Technical SEO | SEO_ACSI -
Rel canonical or 301 the Index Page?
Still a bit confused on best practice for /index.php showing up as a duplicate of www.mysite.com. What do I need to do, and how?
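The usual fix is a 301 from /index.php to the root URL, with a canonical link on the page itself as a belt-and-braces measure. A hypothetical fragment, with mysite.com standing in for the real domain as in the question:

```html
<!-- In the <head> of the page served at /index.php: -->
<link rel="canonical" href="http://www.mysite.com/" />
```

The 301 consolidates link equity and removes the duplicate URL from circulation, while the canonical covers any internal or external links to /index.php that linger; how the redirect is configured depends on the server (e.g. a rewrite rule on Apache).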
Technical SEO | bozzie311