Why extreme drop in number of pages indexed via GWMT sitemaps?
-
Any tips on why the number of pages indexed from our GWMT sitemaps dropped to 27% of total submitted entries (2,290 pages submitted, 622 indexed)? We have already checked the obvious: tested the sitemap, validated the URLs, etc. We had typically been at 95% of submitted pages getting indexed.
-
Thanks, that covers it!
-
Yes, this is the norm. An XML sitemap usually carries two hints per URL: a `<priority>` value between 0.1 and 1.0, and a `<changefreq>` value that suggests how often the page is updated. Googlebot generally treats these as guidelines and tends to recrawl pages around the schedule you suggest. If all of your pages are set to the same values (which they shouldn't be), Google will simply pace itself through them. Google also only crawls a certain amount of a site on a given visit, so even after a fix it works back through the sitemap over several crawls rather than all at once. A slow increase in indexed pages is therefore the norm.
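For reference, here is roughly what those per-URL hints look like in a sitemap entry. The URL and values below are hypothetical examples, not from the site in question:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/blog/some-post</loc>
    <!-- How often the page tends to change; a hint, not a command -->
    <changefreq>weekly</changefreq>
    <!-- Relative importance within your own site, 0.0 to 1.0 -->
    <priority>0.6</priority>
  </url>
</urlset>
```

Both fields come from the sitemaps.org protocol; Google is free to crawl more or less often than they suggest.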
-
Yes, looking back at the change logs was helpful. Canonical tags were the culprit! We found a bug: the canonical tags were being truncated at 8 characters. The number of pages indexed has started to increase rather than decrease, so it appears the issue is resolved. But I would have thought the entire sitemap would get indexed once the issue was fixed, rather than in small increments each day. Does a slow climb back to normal seem correct, rather than getting back to nearly 100% indexed overnight?
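A truncation bug like that is easy to catch with a small script. Here is a minimal sketch (standard library only; the HTML and URLs are made-up examples) that pulls the canonical tag out of a page and compares it against the URL you expect:

```python
from html.parser import HTMLParser

class CanonicalExtractor(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            d = dict(attrs)
            if (d.get("rel") or "").lower() == "canonical":
                self.canonical = d.get("href")

def check_canonical(html, expected_url):
    """Return (canonical_href, looks_wrong) for a page's HTML."""
    parser = CanonicalExtractor()
    parser.feed(html)
    href = parser.canonical
    # A truncated tag (like the 8-character bug above) won't match the
    # full URL, and usually isn't even a valid absolute URL.
    looks_wrong = href is not None and href != expected_url
    return href, looks_wrong

# Example: a canonical URL truncated to 8 characters, as described above.
buggy = '<head><link rel="canonical" href="http://e"></head>'
print(check_canonical(buggy, "http://example.com/widgets"))  # → ('http://e', True)
```

Run against a handful of real pages (fetched with `urllib.request`), this kind of check would have flagged the truncation immediately.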
-
Do you have the date of the change? Try to pin down when it happened, because we might be able to figure it out that way too.
WMT > sitemaps > webpages tab
Once you find the date you may be able to go through your notes and see if you've done anything around that date or if Google had any sort of update (PageRank just updated).
I have had sites where pages dropped out of the index and then got reindexed a few crawls later. I just looked at 20 sites in our WMT, and all of our domains look good as far as the percentage of submitted vs. indexed pages.
The only other things I can think of are to check for duplicate content, canonical tags, noindex tags, and pages with little or no value (thin content). One more idea (I've done this before): keep your current sitemap structure but add an additional sitemap with all of your pages and posts in it. Don't break it down; just put everything in one sitemap. I've had that work for a similar issue, but that was back in 2010. Multiple sitemaps never seemed to work out for that site; having it all in one did the trick. The site was only about 4,000 pages at the time, but I thought I would mention it. I haven't been able to duplicate the error, and no other site has had that problem, but it did do the trick.
Definitely keep an eye on it over the next few crawls. Please let us know what the results are and what you've tried so we can help troubleshoot.
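The one-big-sitemap idea above can be sketched in a few lines of Python. This is a hypothetical generator, not anyone's actual tooling; note that the sitemaps.org protocol caps a single file at 50,000 URLs, so the approach fits a ~4,000-page site like the one mentioned:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build one flat sitemap covering every URL (max 50,000 per file
    under the sitemaps.org protocol)."""
    if len(urls) > 50000:
        raise ValueError("split into multiple sitemaps past 50,000 URLs")
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

# Hypothetical page list; a real one would come from your CMS or a crawl.
pages = ["http://example.com/", "http://example.com/about"]
print(build_sitemap(pages))
```

A real generator would also add `<lastmod>` and the other optional fields per URL.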
-
We use multiple sitemaps.
Thanks, I had not thought about page load speed, but it checked out okay. I had already considered your other suggestions. Will keep digging. Appreciate your feedback.
-
Not sure why the drop but are you using just one sitemap or do you have multiple ones?
Check the sizes of your pages and the rate at which Google is crawling your site. If Google is having trouble with how long your pages take to download, it will start to crawl and index fewer of them. You can check your crawl stats in WMT under Crawl > Crawl Stats. See if you notice any delays in the numbers.
Also, make sure that your robots.txt isn't blocking anything.
Have you checked your site with a site: search?
This is all pretty basic stuff, but let us know what you've looked into so we can help you more. Thanks.
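The robots.txt check mentioned above can be automated with Python's standard-library `urllib.robotparser`. A minimal sketch follows; the rules and URLs here are made-up examples, and for a live check you would point the parser at your real robots.txt instead:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; for a live check, use
# parser.set_url("http://yoursite.com/robots.txt") and parser.read().
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Every URL your sitemap submits should come back True here;
# a False means robots.txt is blocking a submitted page.
print(parser.can_fetch("Googlebot", "http://example.com/products/widget"))  # True
print(parser.can_fetch("Googlebot", "http://example.com/admin/login"))      # False
```

Looping this over the `<loc>` entries in your sitemap gives a quick list of submitted-but-blocked URLs.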