Webmaster Tools keeps showing old 404 errors but doesn't show a "Linked From" URL. Why is that?
-
Hello Moz Community.
I have a question about 404 crawl errors in Webmaster Tools. A while ago we had an internal linking problem: some links were being formed in the wrong way (a loop was generating links on the fly). The issue was identified and fixed back then, but before the fix Google got to index lots of those malformed pages. Recently we have noticed in our Webmaster Tools account that some of these links still appear as 404s, even though we no longer have that issue or any internal link pointing to any of those URLs. What confuses us even more is that Webmaster Tools doesn't show anything in the "Linked From" tab, where it usually does for this type of error. So we are wondering what this means. Could it be that they are still in Google's cache or memory? We are not really sure.
If anyone has an idea of what these errors showing up now might mean, we would really appreciate the help. Thanks.
-
Hi Jane, thanks for the follow-up. Every time we see errors showing up in WMT (mainly 404s), we remove the URLs right away, and indeed we see the errors going down every 4-5 days (under HTML Improvements).
I am just surprised at how long it would take Google to actually remove these 404s from their index if we did not use the URL removal tool. I know that the higher the PR, the more often they are likely to crawl and the faster they remove these 404s, I guess, but still.
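One thing we have been considering (just an idea, and the pattern below is purely hypothetical): serving a 410 Gone instead of a 404 for URLs we know are permanently dead, since Google is generally said to drop 410s from its index a bit faster than 404s. A minimal Apache .htaccess sketch, assuming the malformed URLs share a recognizable prefix:

```
# Hypothetical prefix for the old loop-generated URLs --
# adjust the pattern to match the actual malformed paths.
RewriteEngine On
RewriteRule ^old-loop-path/ - [G,L]  # the [G] flag returns "410 Gone"
```

Either way, Google keeps retrying URLs it has seen in the past, so some lag after a fix seems normal.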
-
Hi again,
Four months seems abnormally long, but it could have something to do with how many 404s there are - 400 is pretty high. Is this number at least going down every few weeks in WMT?
Cheers,
Jane
-
Hi Jane, we solved the cause of these errors more than four months ago at this point. There is no path to these URLs anymore, but they keep showing up, so it is taking Google pretty long to clean up. Our estimate is that there are about 400 more of these 404 errors, so we still have some time to go, I guess.
-
Hi,
How long have these errors been appearing since you fixed the issue? It could be a case of Google looking for URLs on the site that it has seen in the past, even though there is no path to them anymore. With the pathway gone, it should stop looking eventually, but I'm curious how long ago the issue was fixed.
-
I hate to speculate on anything involving SEO, but I've always taken those 404s as visits Google has been able to grab data for. If Webmaster Tools is able to catch the data for a visit to a 404, it'll let you know about it.
What led me to this assumption (cringe) was how similar those 404s were to existing pages, as if someone tried to type in a URL and got it wrong, or deleted part of it and hit "enter".
Take the info for what it's worth - it isn't fact, just an idea to get you rolling.
-
I've had those too and they are quite annoying (I love seeing 0 errors, hehe). I just mark them as fixed and hope they don't show up again (they usually stop appearing after doing that once or twice).
If anyone has any other insight into this, please share!
Related Questions
-
Keep getting "/feed" broken links in Google Search Console
Hey guys, I've been having an issue for the past few months: I keep getting "/feed" broken links in Google Search Console (screenshot attached). The site is a WordPress site using the Yoast SEO plugin for on-page SEO and the sitemap. Has anyone else experienced this issue? Did you fix it? How should I redirect these links?
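One common fix, assuming the feeds are not actually used by readers (WordPress generates a /feed variant of most URLs by default): 301 them back to their parent pages. A minimal .htaccess sketch - skip it if you do want working RSS feeds:

```
# 301 any URL ending in /feed or /feed/ back to its parent page.
# Do not use this if readers rely on your RSS feeds.
RewriteEngine On
RewriteRule ^(.*)/feed/?$ /$1/ [R=301,L]
```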
Technical SEO | Extima-Christian
-
Soft 404 errors
Google Webmaster Tools is telling me I have 8 "soft 404s". They are all like this page:
http://www.seadwellers.com/search/page/8/
All 8 pages are the same except the number at the end. I just can't figure this out; any insight at all is appreciated. Do I need to correct this somehow?
Technical SEO | sdwellers
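Paginated internal search results like the /search/page/N URLs above are thin and near-identical, which is a classic soft-404 trigger. One common approach, assuming you want these pages to stay reachable but out of the index, is a noindex header. A sketch for Apache 2.4 with mod_headers (the /search/ prefix is taken from the URL above):

```
# Mark internal search pagination as noindex (Apache 2.4+ expression syntax).
<IfModule mod_headers.c>
  <If "%{REQUEST_URI} =~ m#^/search/#">
    Header set X-Robots-Tag "noindex, follow"
  </If>
</IfModule>
```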
-
URL not indexed but shows in results?
We are working on a site that has a whole section that is not indexed (well, a few pages are). There is also a problem where two directories contain the same content, and it is the incorrect directory that has the indexed URLs. The odd part: if I do a search in Google to find a URL - typically location + term - I get the URL (from the wrong directory) in the top 5. However, do a site: search for that URL and it is not indexed! What could be going on here? There is nothing in robots.txt or the source, and the GWT fetch works fine.
Technical SEO | MickEdwards
-
"Extremely high number of URLs" warning for robots.txt blocked pages
I have a section of my site that is exclusively for tracking redirects for paid ads. All URLs under this path do a 302 redirect through our ad tracking system:
http://www.mysite.com/trackingredirect/blue-widgets?ad_id=1234567 --302--> http://www.mysite.com/blue-widgets
This path of the site is blocked by our robots.txt, and none of the pages show up for a site: search.
User-agent: *
Disallow: /trackingredirect
However, I keep receiving messages in Google Webmaster Tools about an "extremely high number of URLs", and the URLs listed are in my redirect directory, which is ostensibly not indexed. If not by robots.txt, how can I keep Googlebot from wasting crawl time on these millions of /trackingredirect/ links?
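Worth noting: because /trackingredirect/ is blocked in robots.txt, Googlebot is not actually crawling these URLs, only discovering the links, so the warning is largely informational. If the goal is also to keep them out of reports, one alternative people use is removing the Disallow and serving a noindex header on the redirects instead, since Google cannot see headers on URLs it is forbidden to fetch. A sketch for Apache 2.4, assuming mod_headers ("always" attaches the header to the 302 response):

```
# Only effective if /trackingredirect/ is NOT blocked in robots.txt,
# because Google must fetch a URL to see its response headers.
<IfModule mod_headers.c>
  <If "%{REQUEST_URI} =~ m#^/trackingredirect/#">
    Header always set X-Robots-Tag "noindex, nofollow"
  </If>
</IfModule>
```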
Technical SEO | EhrenReilly
-
"noindex" internal search result urls
Hi, would applying "noindex" to any page (say, internal search pages), or blocking it via robots.txt, skew the internal site search stats in Google Analytics? Thanks!
Technical SEO | RaksG
-
My 404 page shows in the report as an error.
How can I make my actual 404 page not show up as a 404 error in the report?
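A hedged guess at the usual cause: the 404 template lives at its own crawlable URL and gets fetched directly, or missing pages return a 200 with the template instead of a real 404 status. A minimal Apache sketch (the /404.html path is an assumption, not your actual setup):

```
# Serve the custom template with a real 404 status for missing pages.
ErrorDocument 404 /404.html

# If the template is requested directly, keep it out of the index
# (hypothetical filename -- match your actual template).
<Files "404.html">
  <IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex"
  </IfModule>
</Files>
```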
Technical SEO | LindseyNewman
-
Impact of "restricted by robots" crawler error in WT
I have been wondering about this for a while now with regard to several of my sites. I am getting a list of pages that I have blocked in the robots.txt file. If I restrict Google from crawling them, then how can they consider their existence an error? In one case, I have even removed the URLs from the index. Do you have any idea of the negative impact associated with these errors, and how do you suggest I remedy the situation? Thanks for the help.
Technical SEO | phogan
-
Duplicate content error - same URL
Hi, one of my sites is reporting a duplicate content and duplicate page title error. But it is the same page - and the home page at that. The only difference in the error report is a trailing slash.
www.{mysite}.co.uk
www.{mysite}.co.uk/
Is this an easy htaccess fix? Many thanks, TT
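For the homepage specifically, www.{mysite}.co.uk and www.{mysite}.co.uk/ resolve to the same request (browsers always ask for "/" at the root), so this report is often a tool artifact, and a rel="canonical" on the home page usually clears it. For non-root URLs, trailing-slash canonicalization in .htaccess is the standard fix. A minimal sketch, assuming you want the slashed form to win:

```
# 301 URLs that are not real files and lack a trailing slash
# to the slashed version, so only one form can be indexed.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```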
Technical SEO | TheTub