Increased 404 and Blocked URL Notifications in Webmaster Tools
-
Over the last 45 days, I have been receiving an increasing number of 404 alerts in Google Webmaster Tools.
When I audit the notifications, they are not "new" broken links; they are all links that have been pointing to non-existent pages for years, and for some reason Google is only now flagging them. This has also coincided with roughly a 30% drop in organic traffic from late April to early May.
The site is www.petersons.com. It's been around for a while and attracts a fair number of natural links, so in the two years I've managed the campaign I've done very little link building.
I'm in the process of setting up redirects for these URLs, but why is Google only now notifying me about years-old broken links, and could that be one of the reasons for my drop in traffic?
My second issue: I am being notified that I am blocking over 8,000 URLs in my robots.txt file when I am not. Here is a link to a screenshot: http://i.imgur.com/ncoERgV.jpg
-
I doubt very much that an increase in old 404s resulted in a 30% organic traffic drop. I'd look closely at your backlink profile, your competition, and page quality to diagnose why you saw that drop in traffic.
As for the 404s, I'd fix those that are fixable and 301 redirect the rest to relevant pages (or the home page). If the number is extremely large, put a high priority on fixing this; otherwise, I haven't met a site that Google couldn't find a 404 error on. And yes, they keep telling you about the same ones!
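If you end up with a large mapping of old URLs to new destinations, it can help to generate the redirect rules rather than hand-write 28,000 of them. A rough sketch in Python, assuming Apache-style `Redirect 301` directives in an .htaccess file; the paths below are made-up examples, not real petersons.com URLs:

```python
def redirect_directives(mapping):
    """Turn a dict of {old path: new destination} into one
    'Redirect 301 <old> <new>' line per entry, sorted so the
    generated .htaccess file stays stable between runs."""
    return [f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())]

# Hypothetical old URLs mapped to their closest live replacements.
rules = redirect_directives({
    "/old-guide.html": "https://www.example.com/guides/",
    "/retired-page": "https://www.example.com/",
})
for line in rules:
    print(line)
```

On other servers the same mapping could just as easily be emitted as nginx `rewrite` rules; the point is to keep the old-to-new mapping in one reviewable place.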
Hope that helps!
Jacob
-
Hi!
As Lynn points out, there could be some issues with your perceived uptime. Do you see a lot of 404 errors reported in Analytics as well? If so, perhaps your hosting provider (or IT department) should have a look at this?
Also, adding the redirects seems like a good idea, as Google could be re-indexing pages that link to the old, deleted URLs.
Do you have a custom crawl frequency set up in Google Webmaster Tools? It's worth checking whether Googlebot is slowing down your site.
Good luck.
Anders
-
The Moz scan is not showing the same errors, and we haven't made any technological changes. These are incoming links pointing to pages that don't exist anymore. It looks like it's been that way for years; I only just started getting notified about them, and I'm wondering if it is somehow hurting the site.
As for the robots.txt file, I just don't know. I've decided to make it blank and reassess in a few days.
-
Hi,
A bit difficult to say without more details. Some of it might be outdated information; see http://moz.com/blog/how-to-fix-crawl-errors-in-google-webmaster-tools for a rundown on how to check it, if you haven't already. Which URLs is it flagging from the robots.txt? Are they still valid URLs? Regarding the 404s, 28,000 is quite a lot. Has your system been changed or updated recently? Maybe a systemic fault is creating these errors? Is the Moz scan flagging the same errors?
It is tough to say whether the errors have any connection to the drop in visits, but it is certainly something you want to get to the bottom of. I threw your site into Xenu (http://home.snafu.de/tilman/xenulink.html) and it was timing out on quite a few of the pages. Is it possible the site is timing out under heavy load? That might also account for the drop in organic visits...
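One way to sanity-check which URLs a robots.txt actually blocks, without waiting for Webmaster Tools to re-crawl, is to parse it locally with Python's standard library. A minimal sketch; the rules and URLs below are invented for illustration, not taken from the site in question:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt to test against.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether specific URLs would be blocked for Googlebot.
for url in ["http://www.example.com/admin/login",
            "http://www.example.com/programs/"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "allowed" if allowed else "blocked")
```

Running the real robots.txt through something like this against a sample of the 8,000 flagged URLs would tell you whether the file genuinely blocks them or whether Webmaster Tools is reporting stale data.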
Lots of questions, not many answers!