Changes in Sitemap Indexation in GWT?
-
I've noticed some significant changes in the number and percentage of indexed URLs for the sitemaps we've been submitting to Google. I've been tracking these numbers directly from Google Webmaster Tools>Site Configuration>Sitemaps. We've made some changes that could be causing the changes we're seeing, but I want to confirm that this wasn't just a change in the way Google reports the indexation.
Has anyone else noticed major changes, greater than a 30% change, in the indexation of your sitemaps in the past week?
Thanks,
Joe
-
Hey Joe,
I noticed the same thing (a large drop in the indexed number/percentage of pages) a few days ago and thought it was related to a ranking drop I had experienced. All I did was resubmit all (tag, category, post) sitemaps via GWT, and the next day Google had almost 100% of my submitted URLs back in the index. However, that didn't fix my apparent Google penalty issue, so I'm still searching for a solution:
http://www.seomoz.org/q/help-with-diagnosing-google-penalty
Oh, this was with a wordpress site utilizing Yoast's WP SEO plugin to create XML sitemaps.
-
I use the same manual approach, dumping into Excel on a regular basis. I'll usually increase the frequency if we're making changes that should affect indexation.
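If it helps, the spreadsheet dump can be partly automated. GWT doesn't expose the "indexed" number programmatically here, but you can at least snapshot the submitted-URL count from the sitemap itself on a schedule and keep a dated history. A rough Python sketch (the sitemap URL and CSV path are placeholders, not anything from this thread):

```python
import csv
import datetime
import urllib.request
import xml.etree.ElementTree as ET

# Sitemap files use this XML namespace, so queries must be namespace-qualified.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def count_sitemap_urls(xml_text):
    """Count <loc> entries in a sitemap (or child sitemaps in a sitemap index)."""
    root = ET.fromstring(xml_text)
    return len(root.findall(f".//{SITEMAP_NS}loc"))

def log_count(sitemap_url, csv_path="indexation_log.csv"):
    """Fetch the sitemap and append today's date, URL, and count to a CSV history."""
    with urllib.request.urlopen(sitemap_url) as resp:
        total = count_sitemap_urls(resp.read())
    with open(csv_path, "a", newline="") as f:
        csv.writer(f).writerow([datetime.date.today().isoformat(), sitemap_url, total])
    return total
```

Run `log_count("http://www.example.com/sitemap.xml")` from a daily cron job and you get a historical record without the manual Excel step; the indexed count from GWT still has to be keyed in by hand.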
-
Just the past week, I'm not too sure. Out of curiosity, how do you track your indexed pages? Right now I dump the numbers manually into Excel for historical records and tracking, which means I don't do it more than every month or two. Are you just eyeballing weekly and noticing changes?
-
Sorry I forgot to specify that this has been over the past week.
Thanks for the reply, Ryan.
-
What time frame are you talking here?
Edit: I manage about 25 different sites, and I have seen large fluctuations (increases for the most part) since April/May, upwards of 50%. Most of my fluctuation, I think, was due to some good site/URL structure and SEO work I did a while ago, and I gather that Panda & Google have liked some of the changes (about freakin' time). It has not been due to increased content.
Related Questions
-
PDFs With No Index Contribute To Page Ranks?
I have a question I'm hoping you can help me with. If I upload a PDF and add a noindex directive so that the PDF doesn't appear in search results, does it still contribute to my site traffic/ranking when I send people the link to it? Basically we are deciding whether to put some PDFs with pricing options etc. onto our website or on a Google Drive. We will be sending the links to potential clients. If visitors clicking the link would still help increase traffic and our Google rank (without the PDF showing in results), we thought this might be the best solution.
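One wrinkle worth noting: a PDF has no HTML head, so it can't carry a meta robots tag at all; the noindex has to travel as an HTTP response header instead. A hypothetical Apache .htaccess sketch (assuming mod_headers is enabled; the pattern is mine, not from the question):

```apache
# Send a noindex header for every PDF the server delivers.
# PDFs can't carry a <meta name="robots"> tag, so the directive
# must be sent in the HTTP response instead.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

With that header in place, Google can still crawl and count links to the PDF, but should drop it from the results.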
Algorithm Updates | whiterabbitnz
-
Have you ever seen or experienced a page indexed which is actually from a website which is blocked by robots.txt?
Hi all, We use the robots.txt file and meta robots tags to block bots from crawling a website or individual pages. Usually robots.txt is used site-wide, with the expectation that none of the pages will get indexed. But any page from a blocked site can still be indexed by Google even when the site is blocked in robots.txt, because the crawler may find a link to the page somewhere else on the internet, as stated in the last paragraph here. I wonder if this is really how some web pages have gotten indexed. And if we use meta tags at the page level, do we still need to block in robots.txt? Can we use both techniques at the same time? Thanks
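For what it's worth, the two techniques can work against each other: a page-level noindex only works if the bot is allowed to fetch the page and see the tag, so disallowing the same URL in robots.txt can keep the noindex from ever being honored. A minimal illustration of the page-level tag:

```html
<!-- Page-level noindex: Google must be able to CRAWL this page to see
     the tag, so the same URL must NOT also be disallowed in robots.txt. -->
<meta name="robots" content="noindex">
```

So the usual advice is to pick one: robots.txt to save crawl budget, or meta noindex to keep a crawlable page out of the results, but not both on the same URL.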
Algorithm Updates | vtmoz
-
Google indexing https sites by default now, where's the Moz blog about it!
Hello and good morning / happy Friday! Last night an article from, of all places, VentureBeat titled "Google Search starts indexing and letting users stream Android apps without matching web content" was sent to me, and as I read it I got a bit giddy, since we had just implemented a full sitewide HTTPS cert rather than a cart-only SSL. I then quickly searched for other sources to see if this was indeed true, and the writing on the walls seems to indicate so:

Google Webmaster Blog - http://googlewebmastercentral.blogspot.in/2015/12/indexing-https-pages-by-default.html
http://www.searchenginejournal.com/google-to-prioritize-the-indexing-of-https-pages/147179/
http://www.tomshardware.com/news/google-indexing-https-by-default,30781.html
https://hacked.com/google-will-begin-indexing-httpsencrypted-pages-default/
https://www.seroundtable.com/google-app-indexing-documentation-updated-21345.html

I found it a bit ironic to read about this on mostly unsecured sites. I wanted to hear about the 8 key rules Google will factor in when ranking/indexing HTTPS pages from now on, and see what you all felt about this. Google will now begin to index HTTPS equivalents of HTTP web pages, even when the former don't have any links to them. However, Google will only index an HTTPS URL if it meets these conditions:

1. It doesn't contain insecure dependencies.
2. It isn't blocked from crawling by robots.txt.
3. It doesn't redirect users to or through an insecure HTTP page.
4. It doesn't have a rel="canonical" link to the HTTP page.
5. It doesn't contain a noindex robots meta tag.
6. It doesn't have on-host outlinks to HTTP URLs.
7. The sitemap lists the HTTPS URL, or doesn't list the HTTP version of the URL.
8. The server has a valid TLS certificate.

One rule that confuses me a bit is: "It doesn't redirect users to or through an insecure HTTP page." Does this mean that if you just moved over to HTTPS from HTTP your site won't pick up the HTTPS boost, since most sites have HTTP redirects to HTTPS? Thank you!
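Several of those conditions are on-page signals you could sanity-check yourself before waiting on Google. A rough, regex-based Python sketch covering three of the eight (a real check should use a proper HTML parser; the function name and host are mine, not from any tool mentioned here):

```python
import re

def https_index_blockers(html, host="example.com"):
    """Return a list of on-page signals that would disqualify an HTTPS URL
    from being indexed, per rules 4, 5, and 6 of the list above."""
    blockers = []
    # Rule 5: a noindex robots meta tag.
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I):
        blockers.append("noindex meta tag")
    # Rule 4: a rel="canonical" link pointing at the insecure HTTP page.
    if re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']http://', html, re.I):
        blockers.append("canonical points to HTTP")
    # Rule 6: on-host outlinks that still use plain HTTP.
    if re.search(r'href=["\']http://' + re.escape(host), html, re.I):
        blockers.append("on-host HTTP outlink")
    return blockers
```

Redirect chains, mixed-content dependencies, robots.txt, sitemaps, and the TLS cert (the other five rules) need live HTTP requests rather than page-source inspection, so they're out of scope for a snippet like this.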
Algorithm Updates | Deacyde
-
A client asked: "Are you guys aware of any recent changes to Google noquery traffic? I am seeing some chatter around this." Is he referring to "not provided" traffic?
I'm not sure what my client means by this question. I assume he's talking about "not provided" traffic. Is there something I'm missing? Thanks for reading!
Algorithm Updates | DA2013
-
Are xml sitemaps a thing of the past?
We had an internal debate about the importance of having a sitemap.xml on your website. Basically, there is Google documentation indicating that a sitemap.xml is due diligence: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156184 while other authoritative forums, blog posts, etc. indicate that sitemap creation and maintenance is a waste of your time, e.g. http://webmasters.stackexchange.com/questions/4803/the-sitemap-paradox/ A bigger question is: are there cases in which not having a sitemap.xml actually became detrimental or risky? Thanks in advance!
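For reference, the maintenance burden doesn't have to be large: a valid sitemap is just a short XML file. A minimal example (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-06-01</lastmod>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints, so a generator can emit the bare minimum and still be valid.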
Algorithm Updates | HZseo
-
Google indexing my website's Search Results pages. Should I block this?
After running the SEOmoz crawl test, I have a spreadsheet of 11,000 URLs, of which 6,381 are search results pages from our website that have been indexed. I know I've read that /search should be blocked from the engines, but can't seem to find that information at this point. Does anyone have facts behind why they should be blocked? Or not blocked?
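If you do decide to block them, a minimal robots.txt sketch (assuming the search results really do live under a /search path, which is an assumption about this site):

```
User-agent: *
Disallow: /search
```

Keep in mind robots.txt stops future crawling, not indexing: the 6,381 URLs already in the index may need a meta noindex tag (while still crawlable) to actually drop out before you block the path.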
Algorithm Updates | Jenny1
-
Any ideas why our category pages got de-indexed?
Hi all, I work for evenues, a directory website that provides listings of meeting rooms and event spaces. Things seemed to be chugging along nicely with our link building effort (mostly through guest blogging using a variety of anchor text). Woke up on Monday morning to find that our City pages have been de-indexed. This page: http://www.evenues.com/Meeting-Spaces/Seattle/Washington used to be at the top of page #2 in the SERPs for the keyword "Meeting Rooms in Seattle" I doubt that we got de-indexed because of our link building efforts, as it was only a few blog posts and links from profile pages on community websites. My guess is that when we did a recent 2.0 release of the site, there are now several "filters" or subcategory pages with latitude and longitude parameters in the URL + different page titles based on the categories like: "Meeting Rooms and Event Spaces in Seattle" --Main Page "Meeting Rooms in Seattle" "Classroom Venues in Seattle" "Party Venues in Seattle" There was a bit of pushback when I suggested that we do a rel="canonical" on these babies because ideally we'd like to rank for all 4 queries (Meeting Rooms, Party Venues, Classrooms, in City). These are new changes, and I have a sneaking suspicion this is why we got de-indexed. We're presenting generally the same content. Thoughts?
Algorithm Updates | eVenuesSEO
-
Why bing is not indexing our website?
We have been up for almost six months already, and Google has indexed 46,900 pages. We have decent traffic and a lot of real external links to us. Not a single page has been indexed by Bing or Yahoo. I submitted the sitemap to Bing's Webmaster Tools two weeks ago and it is still in the Pending stage. Here is our address: www.showme.com and here is the sitemap: http://www.showme.com/sitemapxml.php What can be the reason for that? Thanks for your help. Karen Bdoyan
Algorithm Updates | showme