Why extreme drop in number of pages indexed via GWMT sitemaps?
-
Any tips on why our GWMT Sitemaps indexed-page count dropped to 27% of total submitted entries (2,290 pages submitted, 622 indexed)? We've already checked the obvious: Test Sitemap, valid URLs, etc. We had typically been at 95% of submitted pages getting indexed.
-
Thanks, that covers it!
-
Yes, this is the norm. Your XML sitemap will generally contain a mix of crawl hints for each URL: a priority value from 0.1 to 1.0, and often a changefreq value suggesting how often the page is updated. Googlebot will generally treat these as guidelines and tend to recrawl pages on roughly the schedule you suggest. If all of your pages are set to the same values, which they shouldn't be, Google will still only crawl a certain amount of your site in a given crawl. So a slow increase in indexed pages is the norm.
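To make the hints concrete, here is a minimal sketch (assuming a hypothetical `example.com` site and stdlib `xml.etree.ElementTree`) of how sitemap entries carry the per-URL `<changefreq>` and `<priority>` values described above:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def sitemap_entry(loc, changefreq="weekly", priority="0.5"):
    """Build one <url> element carrying the optional crawl hints."""
    url = ET.Element("url")
    ET.SubElement(url, "loc").text = loc
    # <changefreq> suggests how often the page changes; <priority>
    # (0.1-1.0) ranks it relative to other pages on the same site.
    # Both are hints only -- Googlebot may ignore them.
    ET.SubElement(url, "changefreq").text = changefreq
    ET.SubElement(url, "priority").text = priority
    return url

urlset = ET.Element("urlset", xmlns=NS)
urlset.append(sitemap_entry("https://www.example.com/", "daily", "1.0"))
urlset.append(sitemap_entry("https://www.example.com/archive/", "monthly", "0.3"))
print(ET.tostring(urlset, encoding="unicode"))
```

Varying these values across the site, as suggested above, gives Googlebot a reason to spread its crawl budget sensibly instead of treating every page the same.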
-
Yes, looking back at the change logs was helpful. Canonical tags were the culprit! We found a bug: the canonical tags were being truncated at 8 characters. The number of indexed pages has started to increase rather than decrease, so the issue appears to be resolved. But I would have thought the entire sitemap would get re-indexed once the issue was fixed, rather than increasing by small amounts each day. Does a slow climb back to normal seem correct, rather than getting back to nearly 100% indexed overnight?
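A truncation bug like the one described is easy to spot in an automated sweep. Below is an illustrative sketch, using only the stdlib `html.parser`, that extracts a page's canonical URL and flags one that looks like a truncated prefix of the real page URL (the page HTML and URLs are hypothetical):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of any rel="canonical" link tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def check_canonical(html, page_url):
    """Return (canonical_href, looks_truncated) for one page's HTML.

    A canonical that is a strict prefix of the page URL -- like the
    8-character truncation described above -- is flagged.
    """
    parser = CanonicalFinder()
    parser.feed(html)
    c = parser.canonical
    truncated = c is not None and c != page_url and page_url.startswith(c)
    return c, truncated

html = '<html><head><link rel="canonical" href="https://"></head></html>'
print(check_canonical(html, "https://www.example.com/page/"))
```

Run against a crawl of the sitemap URLs, a check like this would have caught the bug as soon as it shipped.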
-
Do you have the date of the change? Try to pin down when the drop happened, because we might be able to figure it out that way too.
WMT > sitemaps > webpages tab
Once you find the date you may be able to go through your notes and see if you've done anything around that date or if Google had any sort of update (PageRank just updated).
I have had sites where pages dropped out of the index and were reindexed a few crawls later. I just looked at 20 sites in our WMT and all of our domains look good as far as the percentage of submitted vs. indexed.
The only other things I can think of are to check for duplicate content, canonical tags, noindex tags, and pages with little or no value (thin content). One more option (I've done this before): keep your current sitemap structure but add an additional sitemap with all of your pages and posts in it. Don't break it down, just put it all in one sitemap. I've had that work for a similar issue, but that was back in 2010. Multiple sitemaps for that site never seemed to work out; having it all in one did the trick. The site was only about 4,000 pages at the time, but I thought I would mention it. I haven't been able to duplicate the error and no other site has had that problem, but that did do the trick.
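The "everything in one sitemap" workaround described above can be sketched in a few lines. This is an illustrative stdlib-only example (the URL lists and domain are hypothetical); note the sitemap protocol caps a single file at 50,000 URLs, so the approach only works for sites under that size:

```python
import xml.etree.ElementTree as ET

MAX_URLS = 50000  # sitemap protocol limit per file

def build_single_sitemap(urls):
    """Combine every page and post URL into one <urlset> document."""
    if len(urls) > MAX_URLS:
        raise ValueError("over the 50,000-URL-per-file limit; split it up")
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for u in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

pages = ["https://www.example.com/", "https://www.example.com/about/"]
posts = ["https://www.example.com/blog/first-post/"]
print(build_single_sitemap(pages + posts))
```

For a ~4,000-page site like the one mentioned, a single file fits comfortably under the limit.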
Definitely keep an eye on it over the next few crawls. Please let us know what the results are and what you've tried so we can help troubleshoot.
-
We use multiple sitemaps.
Thanks, I had not thought about page load speed, but it turned up okay. I had already considered your other suggestions. Will keep digging. Appreciate your feedback.
-
Not sure why the drop but are you using just one sitemap or do you have multiple ones?
Check the sizes of your pages and the rate at which Google is crawling your site. If Google is having trouble with the time it takes to crawl your site, it may start reducing the number of pages it indexes. You can check your crawl stats by navigating to WMT, Crawl > Crawl Stats. See whether you notice any delays in the numbers.
Also, make sure that your robots.txt isn't blocking anything.
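The robots.txt check is easy to script against your sitemap's URL list. Here is a minimal sketch using the stdlib `urllib.robotparser` (the rules and URLs shown are hypothetical examples):

```python
from urllib.robotparser import RobotFileParser

def blocked_urls(robots_txt, urls, agent="Googlebot"):
    """Return the subset of URLs that robots.txt disallows for the agent."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [u for u in urls if not rp.can_fetch(agent, u)]

robots = """\
User-agent: *
Disallow: /private/
"""
urls = [
    "https://www.example.com/products/",
    "https://www.example.com/private/draft/",
]
print(blocked_urls(robots, urls))
```

Feeding every submitted sitemap URL through a check like this quickly confirms whether a stray `Disallow` rule explains the indexing drop.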
Have you checked your site with a site: search?
This is pretty basic stuff, but let us know what you've looked into so we can help you more. Thanks.