Soft 404 errors
-
Google Webmaster Tools is telling me I have 8 "soft 404s".
They all look like this page:
http://www.seadwellers.com/search/page/8/
All 8 pages are the same except for the number at the end. I just can't figure this out. Any insight at all is appreciated, and do I need to correct it somehow?
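For anyone debugging this, the key idea behind a "soft 404" is a page that returns 200 OK but has little or no real content, so it behaves like "page not found". A minimal sketch of that heuristic in Python (the length threshold is an illustrative assumption, not anything Google publishes):

```python
def classify(status_code: int, body_text: str, min_length: int = 500) -> str:
    """Rough heuristic: a 200 response whose body is nearly empty
    behaves like 'page not found' -- a soft 404 candidate."""
    if status_code == 404:
        return "hard 404"
    if status_code == 200 and len(body_text.strip()) < min_length:
        return "soft 404 candidate"
    return "looks fine"

# A search results page with no results often returns 200 with an
# essentially empty body -- exactly the pattern flagged as a soft 404.
print(classify(200, "<html><body>No results found.</body></html>"))  # → soft 404 candidate
print(classify(200, "x" * 2000))  # → looks fine
```

Fetching each of the 8 `/search/page/N/` URLs and running their status and body through a check like this would show whether they are empty result pages being served with a 200.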
-
When you say "exclude your search", do you mean exclude that soft 404 page specifically from the search bots?
Can you take a quick peek at one of the pages in question?
http://www.seadwellers.com/search/page/8/
There are 8 of them; the only difference is the number at the end of the URL. I'm afraid I don't understand how or why they exist.
Any insight you might have time to give is greatly appreciated. I'm trying to learn.
Thank you, Martijn
-
Patrick is right: soft 404 errors are usually a sign that your pages are very low quality and probably don't provide any additional information. What I would recommend is excluding your search pages from bots entirely by adding them to your robots.txt file.
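For reference, the robots.txt exclusion suggested above might look like this, assuming the search pages all live under /search/ as in the URLs from the question:

```text
User-agent: *
Disallow: /search/
```

This tells all crawlers not to fetch anything under /search/, which covers the paginated /search/page/N/ URLs as well.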
-
Hi there
Take a look at Google's resources on URL patterns and excluding URLs. These are great pieces to help you handle your search URLs and patterns.
Hope this helps! Good luck!
Related Questions
-
Broken canonical link errors
Hello, several tools I'm using are returning errors due to "broken canonical links", but I'm not sure why that is. E.g.:
Page URL: domain.com/page.html?xxxx
Canonical link URL: domain.com/page.html
This returns an error. Any idea why? Am I doing it wrong? Thanks, G
Technical SEO | GhillC
-
Getting rid of pagination - redirect all paginated pages or leave them to 404?
Hi all, We're currently in the process of updating our website and we've agreed that one of the things we want to do is get rid of all our pagination (currently used on the blog and product review areas) and instead implement load more on scroll. The question I have is... should we redirect all of the paginated pages and if so, where to? (My initial thoughts were either to the blog homepage or to the archive page) OR do we leave them to just 404? Bear in mind we have thousands of paginated pages 😕 Here's our blog area btw - https://www.ihasco.co.uk/blog Any help would be appreciated, thanks!
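If you do choose to redirect rather than let the pages 404, one common approach on Apache is a single pattern rule in .htaccess. This is only a sketch: the /blog/page/N/ pattern is an assumption about how the paginated URLs are structured on the site.

```apache
# 301-redirect any paginated blog URL (e.g. /blog/page/7/) to the blog homepage
RedirectMatch 301 ^/blog/page/\d+/?$ /blog
```

One rule like this avoids maintaining thousands of individual redirects.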
Technical SEO | iHasco
-
When rogerbot tries to crawl my site it gets a 404. Why?
When rogerbot tries to crawl my site it tries http://website.com. My website then tries to redirect to http://www.website.com and is throwing a 404, so the site ends up not getting crawled. It also throws a 404 when trying to read my robots.txt file for some reason. We allow the rogerbot user agent, so I'm unsure what's happening here. Is there something weird going on when trying to access my site without the 'www' that is causing the 404? Any insight is helpful here. Thanks,
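For comparison, a typical Apache rewrite that canonicalizes the bare domain to the www host with a 301 (rather than a 404) looks like this; website.com stands in for the real domain, as in the question:

```apache
RewriteEngine On
# Send any request for the bare domain to the www host, preserving the path
RewriteCond %{HTTP_HOST} ^website\.com$ [NC]
RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]
```

If the existing redirect instead points at a URL that doesn't exist, the crawler follows it straight into a 404, which matches the symptoms described.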
Technical SEO | BlakeBooth
-
Question about 404 Errors
About two months ago, we deleted some unnecessary pages on our website that were no longer relevant. However, Moz is still saying that these deleted pages are returning 404 errors when a crawl test is done. The pages are no longer there, at least as far as I can see. What is the best solution for this? I have a page that is similar to the old page, so is it a good choice to just redirect the bad page to my good page? If so, what's the best way to do this? I found some useful information while searching, but none of it truly pertained to me. I went around my site to make sure there were no old links directing traffic to the nonexistent page, and there are none.
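For what it's worth, a one-line 301 in .htaccess (on Apache) is the usual way to point a deleted page at its closest equivalent. The paths here are hypothetical placeholders, not the actual URLs from the question:

```apache
# Map the deleted page to the similar page that replaced it
Redirect 301 /old-deleted-page/ /similar-current-page/
```

A 301 passes visitors and link equity to the new page, and the 404 report entry will drop off once the crawler sees the redirect.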
Technical SEO | Meier
-
Google couldn't access your site because of a DNS error
Hello, we've been doing SEO work for a company for about 8 months and it's been working really well; we have lots of top threes and first pages. Or rather we did. Unfortunately, the web host the client uses (which we recommended against) has had severe DNS problems. For the last three weeks Google has been unable to access and index the website. I was hoping this would be quickly resolved and everything would return to normal. However, this week their listings have totally dropped: 25 page-one rankings have become none, and Google Webmaster Tools says 'Google couldn't access your site because of a DNS error'. Even searching for their own domain no longer works! Does anyone know how this will affect the site in the long term? Once the hosts sort it out, will the rankings bounce back? Is there any sort of strategy for handling this problem? Ideally we'd move host, but I'm not sure that is possible, so any other options, or advice on how it will affect long-term rankings so I can report to my client, would be appreciated. Many thanks, Ric
Technical SEO | BWIRic
-
Nginx 403 and 503 errors
I have a client with a website hosted on a shared webserver running Nginx. When I started working on the website a few months ago, I found the server was throwing 100s of 403s and 503s, and at one point googlebot couldn't access robots.txt. Needless to say this didn't help rankings! The web hosting company has now partially resolved the errors by switching to a new server, and I'm just seeing intermittent spikes in Webmaster Tools of 30 to 70 403 and 503 errors. My questions: Am I right in saying there should (pretty much) be no such errors for pages that we make public and crawlable? Having already asked the web hosting company to look into this, any advice on specifically what I should be asking them to examine on the server? If this doesn't work out, does anyone have a recommendation for a reliable web hosting company in the U.S. for a lead generation website with over 20,000 pages and currently 500 to 1000 visits per day? Thanks for the help Mozzers 🙂
Technical SEO | MatShepSEO
-
Having a massive amount of duplicate crawl errors
I'm having over 400 crawl errors for duplicate content, looking like this: http://www.mydomain.com/index.php?task=login&prevpage=http%3A%2F%2Fwww.mydomain.com%2Ftag%2Fmahjon http://www.mydomain.com/index.php?task=login&prevpage=http%3A%2F%2Fwww.mydomain.com%2Findex.php%3F etc., etc. So there seems to be something with my login script that is not working. Does anyone know how to fix this? Thanks
Technical SEO | stanken
-
Duplicate content error - same URL
Hi, One of my sites is reporting a duplicate content and duplicate page title error. But it is the same page, and the home page at that. The only difference in the error report is a trailing slash. www.{mysite}.co.uk www.{mysite}.co.uk/ Is this an easy htaccess fix? Many thanks, TT
Technical SEO | TheTub
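For reference on the trailing-slash question above: for the bare homepage URL, the slashed and unslashed forms resolve to the same resource, so the report is mostly cosmetic. Where trailing-slash duplicates do matter (inner pages), a common Apache sketch enforces one form site-wide; this is a generic pattern, not tested against the site in question:

```apache
RewriteEngine On
# Add a trailing slash to any URL that isn't a real file and lacks one
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```

A rel=canonical tag on the page is the other common way to tell crawlers which form to index.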