Webmaster Tools crawl errors
-
Hi there,
I've been tracking my Webmaster Tools crawl errors for a while now (about six months), and I'm noticing that some pages that have long been 404 are still popping up in the crawl errors report. Those pages show no XML sitemap data, and the external links reported are from pages that are themselves long-gone 404s.
Those pages return a 404 error page plus a redirect to the homepage, and Google still shows them with old cached content.
Does anyone have a clue why this is happening?
-
Thank you very much, Dana, for the superb answer!
Any clue how critical these errors are for my website's SEO? (Is this problem worth fixing?)
-
Hi!
Have you verified that there is a proper 301 redirect from the old URLs? If, for example, there is a 302 instead, I wouldn't be surprised if Google kept the old info in the index, since you have essentially said "I'll be back soon at this old address, just wait a minute (or a year)".
Do you have an example URL for us that we could take a look at?
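If it helps, here's a quick, self-contained sketch of the 301-vs-302 difference. It spins up a throwaway local server (the `/old-page` and `/new-page` URLs are made up for the demo, not anyone's real site) and makes a HEAD request that reports the raw status code instead of silently following the redirect, which is roughly what you'd want to check against your own old URLs:

```python
# Minimal demo: a local test server answering with a 302, and a HEAD
# request that surfaces the raw status code instead of following it.
import http.server
import threading
import urllib.request
import urllib.error

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        # 302 = "temporary": Google may keep the old URL in its index.
        # Change this to 301 to signal a permanent move.
        self.send_response(302)
        self.send_header("Location", "/new-page")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # don't follow the redirect; we want the raw code

opener = urllib.request.build_opener(NoRedirect)
req = urllib.request.Request(f"http://127.0.0.1:{port}/old-page", method="HEAD")
try:
    status = opener.open(req).status
except urllib.error.HTTPError as e:
    status = e.code  # raised because we refused to follow the redirect

print(status)  # 302 here; a proper permanent redirect would print 301
server.shutdown()
```

Against a live site, `curl -I` on the old URL gives you the same status line without any code.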
-
Hi Dana,
If you're looking for someone to confirm what Dana said, she is 100% right.
If you're using a CMS like WordPress, it is possible for it to hold old pages and keep serving them. Make sure your host knows about the problem if you are using a CMS.
Hope this helps,
Thomas
-
Yes, I understand this issue very well and have seen it many times. Most often, it happens because another of your pages that is being indexed still references those 404 pages. Please forgive Google; it is but a humble, not-so-intelligent bot (despite what the world would have you think). If you keep referencing a page that doesn't exist, Googlebot will say to itself "but it does exist! it does!" and keep indexing it despite the 404 error. For all Google knows, it's your intention to bring that page back, and maybe you just screwed up and it's 404-ing, you know?
Here's the remedy:
Do a content audit. Here's a great post on how to do that: http://www.distilled.net/blog/seo/how-to-perform-a-content-audit/
You will discover many things, without a doubt, including pages that are linking to these 404 pages. Decide what to do with those pages, i.e., nix them, rebuild them, whatever. If you have pages in Google's index that you really do want removed, and those same pages are 404-ing, simply go to your Google Webmaster Tools account, select "Google Index" in the left nav, and then "Remove URLs." Click "Create a new removal request" and enter the desired URL into that box. Provided your page meets the requirements (and if it's producing a 404 error, it does), Google will prioritize its removal. Under no circumstances should this be confused with the disavow tool, just to make that clear.
I hope this helps. Any questions and I'm happy to help further if I can.
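As a rough sketch of the audit step above, here's one way to spot internal links pointing at known-404 URLs. This is an illustrative assumption, not a full crawler: the `page_html` sample and the `known_404s` set (including `/old-page` and `/retired-promo`) are made up, and in practice you'd feed in each indexed page's HTML and your crawl-errors list:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# In practice you'd fetch each indexed page; here we use an inline sample.
page_html = """
<ul>
  <li><a href="/products/widget">Widget</a></li>
  <li><a href="/old-page">Old page</a></li>
</ul>
"""

# URLs your crawl errors report says are 404 (hypothetical examples).
known_404s = {"/old-page", "/retired-promo"}

parser = LinkExtractor()
parser.feed(page_html)
dead_links = [href for href in parser.links if href in known_404s]
print(dead_links)  # these internal links are feeding Googlebot dead URLs
```

Any URL that lands in `dead_links` is an internal reference you'd want to fix or remove so Googlebot stops rediscovering the 404s.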