Increased 404 and Blocked URL Notifications in Webmaster Tools
-
Over the last 45 days, I have been receiving an increasing number of 404 alerts in Google Webmaster Tools.
When I audit the notifications, these are not "new" broken links; they are links that have been pointing to non-existent pages for years, and for some reason Google is only now notifying me about them. This has also coincided with roughly a 30% drop in organic traffic from late April to early May.
The site is www.petersons.com. It's been around for a while and attracts a fair amount of natural links, so in the two years I've managed the campaign I've done very little link building.
I'm in the process of setting up redirects for these URLs, but why is Google only now notifying me of years-old broken links, and could that be one of the reasons for my drop in traffic?
My second issue is that I am being notified that I am blocking over 8,000 URLs in my robots.txt file when I am not.
Here is a link to a screenshot: http://i.imgur.com/ncoERgV.jpg
-
I doubt very much that an increase in old 404s resulted in a 30% organic traffic drop. I'd look closely at your backlink profile, competition, and page quality to try to diagnose why you saw that drop in traffic.
As for the 404s, I'd fix those that are fixable and 301 redirect the rest to relevant pages (or the home page). If the number is extremely large, then you should put a high priority on fixing this. Otherwise, I haven't met a site that Google couldn't find a 404 error on. And yes, they keep telling you about the same ones!
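If you're redirecting a large batch of old URLs, writing the rules by hand gets error-prone. A small script can turn a two-column CSV export (old path, new URL) into Apache `RewriteRule` 301 lines. This is just a sketch under assumptions: it presumes an Apache host with mod_rewrite, and the example mapping is illustrative, not petersons.com's actual URLs.

```python
import csv
import io

def rewrite_rules(csv_text):
    """Turn rows of old_path,new_url into Apache mod_rewrite 301 rules."""
    rules = []
    for old, new in csv.reader(io.StringIO(csv_text)):
        # Anchor the old path and escape literal dots so /old.html matches exactly.
        pattern = "^" + old.strip("/").replace(".", r"\.") + "$"
        rules.append(f"RewriteRule {pattern} {new} [R=301,L]")
    return "\n".join(rules)

# Hypothetical mapping exported from your crawl-error report:
mapping = "old-page.html,https://www.example.com/new-page\n"
print(rewrite_rules(mapping))
```

On nginx or IIS the same mapping would feed a `return 301` block or a rewrite map instead; the point is to generate the rules from one reviewed list rather than typing thousands by hand.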
Hope that helps!
Jacob
-
Hi!
As Lynn points out, there could be some issues with regard to your perceived uptime. Do you see a lot of 404 errors reported in Analytics as well? If so, perhaps your hosting provider (or IT department) should have a look at this?
Also, adding the redirects seems like a good idea, as Google could be reindexing some sites/pages linking to the old, deleted URLs.
Do you have a custom crawl frequency set up in Google Webmaster Tools? It's worth looking into whether Googlebot is slowing down your site.
Good luck!
Anders
-
The Moz scan is not showing the same errors, and we haven't made any technological changes. These are incoming links pointing to pages that don't exist anymore. It looks like it's been that way for years; I just started getting notified of these, and I'm wondering if somehow it is hurting the site.
About the robots.txt file, I just don't know. I've decided to make it blank and reassess in a few days.
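For reference, a robots.txt that blocks nothing does not need to be blank; an explicit allow-all file (an empty Disallow value permits everything) looks like this:

```
User-agent: *
Disallow:
```

Either way, the Webmaster Tools robots.txt tester is worth checking against a few of the reportedly blocked URLs to see which rule, if any, Google thinks is matching them.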
-
Hi,
A bit difficult to say without some more details. Some of it might be outdated information. See http://moz.com/blog/how-to-fix-crawl-errors-in-google-webmaster-tools for a rundown on how to check it, if you haven't already. What URLs is it flagging from the robots.txt? Are they still valid URLs? Regarding the 404s, 28,000 is quite a lot. Has your system changed or been updated recently? Maybe there is a systemic fault that is creating these errors? Is the Moz scan flagging the same errors?
It is tough to say if the errors have any connection to the drop in visits, but it is certainly something you want to get to the bottom of. I threw your site into Xenu (http://home.snafu.de/tilman/xenulink.html) and it was timing out on quite a few of the pages. Is it possible the site is timing out under heavy load? That might account for the drop in organic visits as well...
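One way to sanity-check what a robots.txt actually blocks is Python's standard `urllib.robotparser`, which can evaluate sample URLs against the file's rules offline. A sketch, with illustrative rules and paths rather than petersons.com's real file:

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
# parse() accepts the file's lines directly, so rules can be tested
# locally before (or instead of) fetching the live robots.txt.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("Googlebot", "https://www.example.com/private/page"))  # blocked
print(rp.can_fetch("Googlebot", "https://www.example.com/public/page"))   # allowed
```

Running a handful of the URLs Webmaster Tools claims are blocked through this would quickly show whether the live file really matches them or whether Google is working from a stale copy.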
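The timeout theory is also easy to test yourself. A minimal sketch, assuming Python's standard library only; the 10-second threshold and the sample figures are illustrative:

```python
import time
import urllib.request

def flag_slow(timings, threshold=10.0):
    """Given {url: seconds, or None for a failed request}, return the problem URLs."""
    return sorted(url for url, t in timings.items() if t is None or t > threshold)

def measure(urls, timeout=15):
    """Time a first-byte fetch of each URL; None means it errored or timed out."""
    timings = {}
    for url in urls:
        start = time.monotonic()
        try:
            urllib.request.urlopen(url, timeout=timeout).read(1)
            timings[url] = time.monotonic() - start
        except Exception:
            timings[url] = None
    return timings

# Example with pre-measured numbers (no network needed):
print(flag_slow({"/a": 0.4, "/b": 12.3, "/c": None}))
```

Run `measure()` over a sample of the 404-reported URLs at a busy hour and a quiet one; if the slow list grows under load, the hosting setup is the first place to look.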
Lots of questions, not many answers!