Google Webmaster Tools: sitemap.xml not processed every day
-
Hi,
We have multiple sites under our Google Webmaster Tools account, each with a sitemap.xml submitted.
Each site's sitemap.xml status (attached below) shows it is processed every day:
Sitemap: /sitemap.xml. This Sitemap was submitted Jan 10, 2012, and processed Oct 14, 2013.
The exception is one site (coed.com), whose sitemap.xml is processed only on the day it is submitted; we have to manually resubmit it every day to get it processed. Any idea why that might be?
thank you
-
My initial reaction is that this is more likely a technical issue than something Google is doing; checking the load time is a good idea. Make sure the sitemap validates and that there's nothing odd about it. If you manually resubmit it, does it seem to take?
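To check that the sitemap validates, a minimal well-formedness and structure check can be scripted. This is a sketch using only the Python 3 standard library; it checks the basics (parseable XML, the `urlset` root, a `<loc>` in every `<url>`), not the full sitemap schema:

```python
import xml.etree.ElementTree as ET

# Sitemap protocol namespace, per sitemaps.org
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def basic_sitemap_check(xml_text):
    """Return a list of problems found; an empty list means the basics pass."""
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return [f"not well-formed XML: {exc}"]
    problems = []
    if root.tag != SITEMAP_NS + "urlset":
        problems.append(f"unexpected root element: {root.tag}")
    for url in root.findall(SITEMAP_NS + "url"):
        loc = url.find(SITEMAP_NS + "loc")
        if loc is None or not (loc.text or "").strip():
            problems.append("<url> entry missing <loc>")
    return problems
```

Run it against the fetched sitemap body; anything it reports is worth fixing before assuming the problem is on Google's side.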
-
Just so I am clear, you have been waiting and finally Google processed it, or was it sitting there and someone took an action which caused Google to process it?
I am surprised that nothing happened for nearly two years. Has the site had traffic, etc.? Any warnings, manual actions, etc.?
Thanks
-
Thanks Paul,
On the custom crawl settings: I just verified them, and they remain the same across all our sites.
Yes, the sitemap is dynamic, but it is rendered via a cache that refreshes when new content gets published. I will check on the load time.
thank you
-
Thanks Robert,
All our sites are getting indexed for sure, but for one site (coed.com), GWT says the sitemap.xml was processed only on the day it was submitted, while the sitemap.xml for our other sites (collegecandy.com and bustedcoverage.com) gets processed every day.
The sitemap.xml on all our sites updates automatically when new content gets published, so I believe it needs to be processed every day.
-
Any chance a custom crawl setting has been added to your Google Webmaster Tools, Robert? Some devs set one during development, or it can happen by accident.
Also, your sitemap takes a ridiculously long time to load: well over 15 seconds for me, and over 18 seconds using webpagetest.org. It could be that Google simply isn't waiting for the page to load when it tries to visit. If the sitemap is being generated dynamically, you may have a rendering problem. Otherwise, something is borked when a 50 KB file takes that long.
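A rough way to reproduce that load-time measurement from a script, as a sketch using only the Python 3 standard library (the `opener` parameter is a hypothetical hook so the timing logic isn't tied to live network access):

```python
import time
import urllib.request

def time_fetch(url, opener=urllib.request.urlopen, timeout=30):
    """Return (elapsed_seconds, bytes_fetched) for a single GET of url."""
    start = time.monotonic()
    with opener(url, timeout=timeout) as resp:
        body = resp.read()
    return time.monotonic() - start, len(body)
```

Run against the live sitemap, elapsed times consistently in the double digits would point at the rendering problem described above rather than anything on Google's end.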
You might also want to try submitting it through Bing Webmaster Tools, for comparison, to see whether they are able to process it consistently.
Bit of a head-scratcher. Hope that gives you a starting point.
Paul
-
Robert,
There is no screenshot attached, but I am unaware of sitemaps being processed daily by search engines. What are you trying to achieve by continuously resubmitting the sitemap?
The site is indexed, correct? And when you look at the crawl stats, is it showing the site being crawled on some semi-regular basis? Google does not process your sitemap every day.
Hope that helps,
Robert