Crawl errors for pages that no longer exist
-
Hey folks,
I've been working on a site recently where I took a bunch of old, outdated pages down. In the Google Search Console "Crawl Errors" section, I've started seeing a bunch of "Not Found" errors for those pages. That makes perfect sense.
The thing that I'm confused about is that the "Linked From" list only shows a sitemap that I ALSO took down. Alternatively, some of them list other old, removed pages in the "Linked From" list.
Is there a reason that Google is trying to inform me that pages/sitemaps that don't exist are somehow still linking to other pages that don't exist? And is this ultimately something I should be concerned about?
Thanks!
-
Thanks for the question, this can definitely be annoying for webmasters!
Unfortunately, bots can't do everything in parallel. They have to work in steps...
Step 1. Take List #1 of links.
Step 2. Crawl those links and build List #2.
Step 3. Crawl List #2 and build List #3...
Now, sometimes it doesn't follow that same order. Let's say that in Step 3 it finds a bunch of pages with unique content. Maybe the next time around, it goes and checks some of those links from Step 3 without first checking whether they are still linked. Why start the crawl all the way from the beginning again when you already have a big list of URLs?
But this creates a problem. When some of those links it crawled in Step 3 aren't there any more, Google will tell you they aren't there, and it will tell you how it originally found them (which happened to be from a page in List #1). But what if Google hasn't checked that link in List #1 recently? What if you just removed it too?
Well, for a little while, at least, you will end up with errors.
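The staleness described above can be sketched in a few lines of Python. This is a hypothetical simplification, not how Googlebot actually works: the crawler remembers where it first discovered each URL, then recrawls from that stored list without re-verifying that the referring page still exists or still links there.

```python
# Hypothetical sketch of list-based crawling and stale "Linked From" reporting.

# Current state of the site: URL -> outgoing links. The sitemap and the old
# pages have all been removed, so only the homepage remains.
site = {
    "/": [],
}

# What the crawler remembers from an earlier pass, before the removals:
# URL -> the page it was first discovered on.
discovered_from = {
    "/old-page-1": "/sitemap.xml",   # found via the (now deleted) sitemap
    "/old-page-2": "/old-page-1",    # found via another (now deleted) page
}

def recrawl(url):
    """Fetch a URL from the stored discovery list, not from a fresh crawl."""
    if url not in site:
        # The error report cites the *original* discovery source, even though
        # that source may itself be gone by now.
        return {"url": url, "status": 404, "linked_from": discovered_from[url]}
    return {"url": url, "status": 200}

errors = [r for r in map(recrawl, discovered_from) if r["status"] == 404]
# Both removed pages are reported as "Linked From" a page that no longer exists.
```

Until the crawler revisits `/sitemap.xml` and `/old-page-1` and updates its discovery list, the report keeps pointing at referrers that are already gone.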
Now, here comes the real rub - how long will it take for Google to find and correct that message it left you in the crawl report? Days? Weeks? Months? Who knows. Your best bet is to mark them as fixed and force Google to keep rechecking. Eventually, they will figure it out.
TL;DR: it's a data freshness and reporting issue that isn't your fault and isn't worth your time.
-
No - Google is just showing how slow it is at updating data in Webmaster Tools.
Don't worry - if you wait long enough, they'll go away. You could also mark them as fixed (do this only if you are sure that there are no links pointing to these pages - to check that your internal linking is OK, Screaming Frog is a great tool).
Dirk