Why might Google still be crawling the old sitemap, when the new one has been submitted and verified?
-
We have recently relaunched Scoutzie.com and re-submitted our new sitemap to Google. In Webmaster Tools, the new sitemap shows as submitted just fine, but at the same time Google is finding a lot of 404s when crawling the site. My understanding is that it is still crawling the old links, which no longer exist. How can I tell Google to refresh its index and stop looking at all the old links?
-
Yes, it should. However, as Alan mentioned below, if you still have links pointing to those 404 pages, Google will keep attempting to crawl them and will keep reporting the errors to you.
If you do have external links to those 404 pages, you can 301 redirect them to an appropriate page using .htaccess. This way you'll keep the link value and also get rid of the Webmaster Tools error.
If you don't have any links to them, then yes, Google will eventually stop trying to crawl them.
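For example, a 301 redirect in .htaccess might look like the following sketch. The paths here are hypothetical placeholders, not Scoutzie's actual URLs:

```apache
# Redirect a single removed page to its new equivalent (301 = permanent,
# so search engines transfer the old URL's link value to the new one)
Redirect 301 /old-portfolio-page /designers

# Or, if a whole section of the site moved, map the old paths in one
# mod_rewrite rule instead of listing every URL individually
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

`Redirect` (from mod_alias) is simpler for one-off URLs; mod_rewrite is worth the extra setup only when a pattern covers many old URLs at once.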
-
It's very likely that we do. Given that I can't track down 1,000+ links that now 404, will they eventually fall out by themselves, or do I have to tell Google that everything that 404s should be dropped from the crawl index? Thanks!
-
What if I simply pushed the new sitemap over the old one? In other words, scoutzie.com/sitemap stays the same URL, except now it contains the new map. That should be okay, right?
-
You may still have links pointing to those 404 pages, either on your site or externally. If not, then they will eventually fall out of the index.
-
Hey scoutzie,
This is actually covered pretty well in Joe Robison's blog post on fixing Webmaster Tools crawl errors: http://moz.com/blog/how-to-fix-crawl-errors-in-google-webmaster-tools
I'll quote the related info:
"One frustrating thing that Google does is it will continually crawl old sitemaps that you have since deleted to check that the sitemap and URLs are in fact dead. If you have an old sitemap that you have removed from Webmaster Tools, and you don’t want being crawled, make sure you let that sitemap 404 and that you are not redirecting the sitemap to your current sitemap."
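In .htaccess terms, the advice in that quote means the old sitemap URL itself should return a 404, not a redirect. A sketch of what to avoid (the filenames are hypothetical):

```apache
# If an earlier rule like this exists, remove it -- redirecting the dead
# sitemap to the live one keeps Google re-crawling the old URL:
#   Redirect 301 /old-sitemap.xml /sitemap.xml

# With the old sitemap file deleted and no redirect rule for it, requests
# for /old-sitemap.xml fall through and Apache returns a plain 404,
# which is the signal that lets Google drop the old sitemap for good.
```

Note this is the opposite of the advice for regular content pages earlier in the thread: old *pages* with inbound links should be 301-redirected, but the old *sitemap file* should be allowed to 404.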
Hope this helps, good luck!