Increased 404 and Blocked URL Notifications in Webmaster Tools
-
Over the last 45 days, I've been receiving an increasing number of 404 alerts in Google Webmaster Tools.
When I audit the notifications, they aren't "new" broken links; they are all links that have pointed to non-existent pages for years, and for some reason Google is only notifying me about them now. This has also coincided with roughly a 30% drop in organic traffic from late April to early May.
The site is www.petersons.com. It's been around for a while and attracts a fair amount of natural links, so in the two years I've managed the campaign I've done very little link building.
I'm in the process of setting up redirects for these URLs, but why is Google only now notifying me about years-old broken links, and could that be one of the reasons for my drop in traffic?
My second issue: I am being notified that I am blocking over 8,000 URLs in my robots.txt file when I am not. Here is a link to a screenshot: http://i.imgur.com/ncoERgV.jpg
-
I doubt very much that an increase in old 404s caused a 30% organic traffic drop. I'd look closely at your backlink profile, your competition, and your page quality to diagnose why you saw that drop in traffic.
As for the 404s, I'd fix those that are fixable and 301 redirect the rest to relevant pages (or the home page). If the number is extremely large, put a high priority on fixing this. Otherwise, I haven't met a site that Google couldn't find a 404 error on. And yeah, they keep telling you about the same ones!
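Once you've decided which current page each dead URL should point to, generating the redirect rules is easy to script. Here's a minimal sketch in Python that emits Apache `mod_alias` lines for an .htaccess file; the paths in the mapping are made up for illustration, not real petersons.com URLs, and your server may use nginx or something else entirely:

```python
# Hypothetical mapping of dead URLs to their best current equivalents.
# Build this by matching the GWT crawl-errors export against the live sitemap.
redirect_map = {
    "/old-college-search.html": "/college-search/",
    "/scholarships-2009.html": "/scholarships/",
    "/gre-prep-old/": "/test-prep/gre/",
}

def build_redirects(mapping):
    """Return one 'Redirect 301 old new' line per entry, for an .htaccess file."""
    return ["Redirect 301 {} {}".format(old, new)
            for old, new in sorted(mapping.items())]

for rule in build_redirects(redirect_map):
    print(rule)
```

With a few thousand URLs, pattern-based rules (`RedirectMatch` with a regex) are usually a better fit than one line per URL, but the mapping exercise is the same.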
Hope that helps!
Jacob
-
Hi!
As Lynn points out, there could be some issues with your uptime. Do you see a lot of 404 errors reported in Analytics as well? If so, perhaps your hosting provider (or IT department) should have a look at this?
Also, adding the redirects seems like a good idea, as Google could be re-indexing some sites/pages linking to the old, deleted URLs.
Do you have a custom crawl rate set up in Google Webmaster Tools? It's worth checking whether Googlebot is slowing down your site.
Good luck!
Anders
-
The Moz scan is not showing the same errors, and we haven't made any technological changes. These are incoming links pointing to pages that no longer exist. It looks like it's been that way for years; I just started getting notified of these, and I'm wondering if it is somehow hurting the site.
As for the robots.txt file, I just don't know. I've decided to make it blank and reassess in a few days.
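Before blanking the file entirely, it's worth testing the reported URLs against the current robots.txt locally, since GWT's count can lag the live file. A rough sketch using Python's standard-library `urllib.robotparser`; the robots.txt rules and URLs below are hypothetical stand-ins, not the site's actual file:

```python
from urllib import robotparser

# Hypothetical robots.txt content -- paste in the live file from
# www.petersons.com/robots.txt when checking the real site.
robots_lines = """
User-agent: *
Disallow: /private/
Disallow: /tmp/
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(robots_lines)

# URLs Webmaster Tools claims are blocked -- verify each one locally.
urls = [
    "http://www.example.com/private/page.html",
    "http://www.example.com/tmp/cache.html",
    "http://www.example.com/programs/overview.html",
]

for url in urls:
    allowed = parser.can_fetch("Googlebot", url)
    print("{} -> {}".format(url, "allowed" if allowed else "BLOCKED"))
```

If the parser says the URLs are allowed but GWT still reports them as blocked, the report is probably based on an older cached copy of the file and should clear on its own after a recrawl.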
-
Hi,
A bit difficult to say without some more details. Some of it might be outdated information; see http://moz.com/blog/how-to-fix-crawl-errors-in-google-webmaster-tools for a rundown on how to check it if you haven't already. What URLs is it flagging from the robots.txt? Are they still valid URLs? As for the 404s, 28,000 is quite a lot. Has your system changed or been updated recently? Maybe there is a systemic fault creating these errors? Is the Moz scan flagging the same errors?
It's tough to say whether the errors have any connection to the drop in visits, but it's certainly something you want to get to the bottom of. I threw your site into Xenu (http://home.snafu.de/tilman/xenulink.html) and it was timing out on quite a few of the pages. Is it possible the site is timing out under heavy load? That might also account for the drop in organic visits...
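The timeout check is easy to reproduce without Xenu: fetch each page with a hard deadline and flag anything that misses it. A sketch using only Python's standard library; the URLs you feed it would come from your own sitemap or the Xenu report:

```python
import socket
from urllib import error, request

TIMEOUT_SECONDS = 10.0

def check_url(url, timeout=TIMEOUT_SECONDS):
    """Return (url, status): an HTTP status code, 'TIMEOUT', or 'ERROR'."""
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return (url, resp.status)
    except socket.timeout:
        return (url, "TIMEOUT")
    except error.HTTPError as exc:
        return (url, exc.code)   # 404s land here, not in the error bucket
    except Exception:
        return (url, "ERROR")    # DNS failure, connection refused, etc.

# Example call (uncomment to run against a live page):
# print(check_url("http://www.petersons.com/"))
```

Run it over a few hundred representative pages at different times of day; a cluster of `TIMEOUT` results during peak hours would support the heavy-load theory.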
Lots of questions, not many answers!