Huge spike in 404s and 500 errors
-
I'm curious what might cause an inordinate number of 404s in the reporting on SEOmoz's dashboard.
I'm exploring links that are marked as 404s, and they are (for the most part) working. I talked with the sysadmin, and there were no outages this weekend. We also had a number of 500 errors reported in Webmaster Tools, but everything seems to be up.
Any ideas?
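One quick way to sanity-check a report like this is to re-fetch the flagged URLs yourself with both a browser-like and a crawler-like User-Agent, since bot blocking or flaky load balancing can return errors only to crawlers. A minimal sketch (the URLs are placeholders, and the rogerbot UA string is an assumption):

```python
import urllib.request
import urllib.error

def fetch_status(url, user_agent):
    """Return the HTTP status the server actually sends for this URL/UA pair,
    or None if the request fails at the network level."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx responses land here, not in the happy path
    except urllib.error.URLError:
        return None  # DNS failure, timeout, connection refused, etc.

# Hypothetical sample of URLs the crawl report flagged as 404.
flagged = [
    "https://www.example.com/products/widget-123",
    "https://www.example.com/rentals/boston",
]

for url in flagged:
    # Compare a browser-like UA with a crawler-like one; if they disagree,
    # the crawler is being served something different than visitors see.
    print(url, fetch_status(url, "Mozilla/5.0"), fetch_status(url, "rogerbot/1.0"))
```

If the two columns disagree for the same URL, the "false positives" are real responses that only the crawler is receiving.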
-
Maybe submit a support ticket to SEOmoz to see if the 404s might have been false positives.
-
The interesting thing is that SEOmoz flagged a bunch of URLs as 404s that, for the most part, are not 404s, whereas Webmaster Tools is showing discontinued products, which makes sense. We didn't see any outages this weekend, so I'm a bit confused.
-
If SEOmoz and Google Webmaster Tools are both reporting errors, I would guess there actually was a problem with your site.
Related Questions
-
Disavow 401, 403, 410, 500, 502, 503
Dear people, I am cleaning my backlink profile and I am not sure if I should disavow links that resolve to a 401, 403, 410, 500, 502, or 503. I do understand that since the last Penguin update it shouldn't be necessary, but I would like to be sure about it. Any hints out there? Thanks in advance 🙂
Intermediate & Advanced SEO | Marta_King_ruiz
-
I have 6 URL errors in GSC showing a 500 error code. How do I fix them?
I am not sure how to fix some errors that are popping up in Google Search Console. The response codes shown are all 500. I need some advice on how to fix these. What are my options?
Intermediate & Advanced SEO | pmull
-
Multiple Sitemaps Vs One Sitemap and Why 500 URLs?
I have a large website with rental listings in 14 markets; listings are added and taken off weekly, if not daily. There are hundreds of listings in each market, and all have their own landing page with a few pages associated. What is the best process here? I could run one sitemap and give each market's landing page a 0.8 priority in the sitemap, or make 14 sitemaps, one per market, and then have one sitemap for the general and static pages. From there, what would be the better way to structure it? Should I keep all the big main landing pages in the general static sitemap, or have them be at the top of the market-segmented sitemaps? Also, I have over 5,000 URLs; what is the best way to generate a sitemap over 500 URLs? Is it necessary?
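For what it's worth, the sitemap protocol caps each file at 50,000 URLs (and 50 MB uncompressed), so a 5,000-URL site fits comfortably in one file; the usual pattern for per-market files is a sitemap index that points at the individual sitemaps, and you submit only the index. A rough sketch of generating that structure (example.com and the URL counts are made up):

```python
from xml.sax.saxutils import escape

# Hypothetical listing URLs; in practice these come from the rental database.
urls = [f"https://www.example.com/listings/{i}" for i in range(120_000)]

MAX_PER_SITEMAP = 50_000  # the sitemap protocol caps each file at 50,000 URLs

def chunks(seq, size):
    """Yield consecutive slices of seq, each at most `size` long."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

files = {}  # filename -> XML body (write these to disk in a real build)
for n, batch in enumerate(chunks(urls, MAX_PER_SITEMAP), start=1):
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in batch)
    files[f"sitemap-{n}.xml"] = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>\n"
    )

# One sitemap index ties the pieces together; submit only this file.
index_entries = "\n".join(
    f"  <sitemap><loc>https://www.example.com/{name}</loc></sitemap>"
    for name in files
)
files["sitemap_index.xml"] = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{index_entries}\n</sitemapindex>\n"
)
print(sorted(files))
```

Splitting per market instead of per 50,000 is the same loop keyed on market instead of count; the index format doesn't change.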
Intermediate & Advanced SEO | Dom441
-
Have thousands of 404s with backlinks. Should I redirect them all at once or over time?
These error pages are being redirected to the most relevant page, not mass redirected to the home page. Thanks for reading!
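When the old-to-new mapping already exists, the redirect rules themselves can be generated in bulk rather than hand-written, which also makes it easy to roll them out in batches if you prefer a gradual approach. A hypothetical sketch that turns a CSV mapping into Apache `Redirect 301` lines (the paths are invented):

```python
import csv
import io

# Hypothetical CSV of old URL paths mapped to their most relevant replacements.
# In practice this would be read from a file exported from your crawl data.
mapping_csv = """old_path,new_path
/products/blue-widget,/products/widgets
/sale-2011,/sale
"""

rules = []
for row in csv.DictReader(io.StringIO(mapping_csv)):
    # Apache syntax: Redirect 301 <old-path> <new-path-or-URL>
    rules.append(f"Redirect 301 {row['old_path']} {row['new_path']}")

htaccess = "\n".join(rules)
print(htaccess)
```

For thousands of rules, a RewriteMap (Apache) or `map` block (nginx) scales better than a long list of individual directives, but the generation step is the same idea.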
Intermediate & Advanced SEO | DA2013
-
How does Google index pagination variables in Ajax snapshots? We're seeing random huge variables.
We're using the Google snapshot method to index dynamic Ajax content. Some of this content is from tables using pagination. The pagination is tracked with a var in the hash, something like: #!home/?view_3_page=1 We're now seeing all sorts of calls from Google with huge numbers for these URL variables that we are not generating with our snapshots, like this: #!home/?view_3_page=10099089 These aren't trivial, since each snapshot represents a server load, so we'd like these vars to only represent what's returned by the snapshots. Is Google generating random numbers to go fishing for content? If so, is this something we can control or minimize?
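Whatever Google's reason for probing those values, the server can refuse to do expensive snapshot work for page numbers that don't exist by validating the variable before rendering. A small sketch (the page count and the fallback-to-page-1 behavior are assumptions, not your app's actual logic):

```python
TOTAL_PAGES = 40  # hypothetical number of real pages in the paginated table

def clamp_page(raw_value, total_pages=TOTAL_PAGES):
    """Coerce an untrusted page parameter to a valid page number.

    Anything non-numeric or out of range falls back to page 1, so a
    request for ?view_3_page=10099089 cannot trigger an expensive
    snapshot render for a page that does not exist.
    """
    try:
        page = int(raw_value)
    except (TypeError, ValueError):
        return 1
    if page < 1 or page > total_pages:
        return 1
    return page
```

Returning a 404 (or a redirect to the canonical first page) for out-of-range values is an alternative that also tells the crawler the URL variant isn't worth revisiting.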
Intermediate & Advanced SEO | sitestrux
-
Huge Google index on E-commerce site
Hi Guys, Referring back to my original post, I would first like to thank you guys for all the advice. We implemented canonical URLs all over the site and noindexed some URLs with robots.txt, and the site already went from 100,000+ URLs indexed to 87,000 URLs indexed in GWT. My question: is there a way to speed this up? I do know about the way to remove URLs from the index (with a noindex or robots.txt condition), but this is a very intensive way to do so. I was hoping you guys maybe have a solution for this. 🙂
Intermediate & Advanced SEO | ssiebn7
-
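Beyond robots.txt and per-page meta noindex, one way to deindex whole sections without touching templates is to send an `X-Robots-Tag: noindex` response header for those paths. A hypothetical WSGI-middleware sketch (the path prefixes are invented, and this assumes a Python/WSGI stack):

```python
# Path prefixes to drop from the index; hypothetical examples.
NOINDEX_PREFIXES = ("/search/", "/filters/")

def noindex_middleware(app):
    """Wrap a WSGI app so matching paths get an X-Robots-Tag: noindex header."""
    def wrapped(environ, start_response):
        def patched_start(status, headers, exc_info=None):
            if environ.get("PATH_INFO", "").startswith(NOINDEX_PREFIXES):
                headers = headers + [("X-Robots-Tag", "noindex")]
            return start_response(status, headers, exc_info)
        return app(environ, patched_start)
    return wrapped
```

Note that crawlers must still be able to fetch the pages to see the header, so the paths should not also be blocked in robots.txt, or the noindex signal never gets read.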
Transfer webshop to other domain. Will there be a huge visit/sales drop?
A client of mine has a specific domain for their webshop, separate from the brand domain. The brand domain has much more authority (according to SEOmoz), so the conclusion in another topic was that it would be better to move the entire webshop to the same domain as the brand domain, i.e. moving www.webshopdomain.com to www.branddomain.com/webshop. Of course, all categories and important pages will have a 301 to pass through the built-up authority. Does anybody have experience with this? I believe in the end this will be much better, because all authority will be built up at the same domain. But I am afraid of a drop in the beginning. If there will be a sales drop, I really must give my client notice of it... I hope somebody has done this before.
Intermediate & Advanced SEO | Seeders
-
Old pages still crawled by search engines returning 404s. Better to 301 or block with robots.txt?
Hello guys, a client of ours has thousands of pages returning 404s, visible in Google Webmaster Tools. These are all old pages which don't exist anymore, but Google keeps on detecting them. These pages belong to sections of the site which don't exist anymore. They are not linked externally and didn't provide much value even when they existed. What do you suggest we do: (a) do nothing, (b) redirect all these URLs/folders to the homepage through a 301, or (c) block these pages through robots.txt? Are we inappropriately using part of the crawl budget set by search engines by not doing anything? Thx
Intermediate & Advanced SEO | H-FARM