What if a 404 error is not possible?
-
Hi Everyone,
I get a 404 error on my page if the URL is simply wrong, but for some parameters, for example when a page has been deleted or has expired, I get an error page indicating that the ID is wrong, but no 404 error.
It is very difficult for me to program a function in PHP that solves the problem and to modify the .htaccess with mod_rewrite. I have asked the developer of the system to take a look, but I am not sure I will get an answer soon.
I can control the content of the deleted/expired page, but the URL will be very similar to the ones that are fine (in fact, the URL may have been valid before and has since expired).
Thinking of possible solutions: I can set the expired/deleted pages to noindex. Would that help avoid duplicate title/description/content problems? If a user goes to, e.g., mywebsite.com/1-article/details.html, I can set the head section to noindex if the listing has expired. Would that be good enough?
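Something like this in the template's head section is what I have in mind (just a sketch; isExpired() and isDeleted() stand in for however the CMS actually exposes that state):

```php
<?php // in the <head> of the listing template (hypothetical accessors) ?>
<?php if ($listing->isExpired() || $listing->isDeleted()): ?>
  <meta name="robots" content="noindex">
<?php endif; ?>
```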
Another question: is it possible to set the pages to 404 without doing it directly in the .htaccess, so avoiding the mod_rewrite problems that I am having? Some magical tag in the head section of the page?
Many thanks in advance for your help,
Best Regards,
Daniel
-
The pages should not show up at all once they are de-indexed.
-
Hi Takeshi, thanks for the answer again.
Would it prevent the deleted/expired pages from being shown as soft 404s in Webmaster Tools?
-
OK, sounds like a noindex,follow in the header is the best solution then. That will keep the no-longer-existent pages from being indexed while still preserving any link juice the pages may have acquired.
-
Hi Again,
@Takeshi Young: Thanks for your answer.
I will try to explain what is happening a little better.
We are using a CMS for classified ads. The script is able to generate "SEO friendly" URLs, which are based on mod_rewrite. If a listing has an ID number, let's say "5", that listing's URL will look like this:
http://mydomain.com/5-listingname/details.html
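Behind the scenes, such "SEO friendly" URLs usually come from a rewrite rule of roughly this shape (illustrative only; `listing.php` is a hypothetical handler name and the CMS's actual rule may differ):

```apache
# Map /5-listingname/details.html to the listing script, passing the ID
RewriteEngine On
RewriteRule ^([0-9]+)-[^/]+/details\.html$ listing.php?id=$1 [L,QSA]
```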
After the listing expires, the URL is no longer valid, and if a user tries to visit the listing, the script delivers a page with a message indicating that the listing is no longer active. The HTTP code is 200 "OK". If the listing is deleted, a user trying to visit the URL gets a similar message, also with an HTTP code of 200. This is a problem, because that page should return a 404 code, telling search engines that the page is gone.
If a user tries to visit an invalid page, for example:
http://mydomain.com/invalidpage.html
then the system will deliver the 404 page that is set in the .htaccess file. But since the script recognises the numeric parameter in the deleted/inactive listing URL, it does not deliver the 404 error but a message page instead, and that message page is a soft 404, which is bad for SEO.
It is beyond my knowledge to repair the script so that it delivers the proper 404 header, but I can customize the page indicating the error as much as I want.
Then I have two questions:
-
If I set the soft 404 error page to noindex, will that be good enough to avoid being affected by the problem?
-
Is there any way of telling search engines that a page is a 404, other than using the Apache .htaccess? A tag in the head section, or any other trick that would help me with this problem?
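For example, could something like this at the top of the error template work (just a sketch, assuming the template is a PHP file I can edit and that nothing has been output yet)?

```php
<?php
// Sketch only: at the very top of the error template,
// before any HTML is sent, set the real HTTP status.
if (!headers_sent()) {
    http_response_code(404);             // PHP 5.4+
    // header('HTTP/1.1 404 Not Found'); // equivalent on older PHP
    // http_response_code(410);          // or "Gone" for deleted listings
}
```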
Thanks in advance for your help,
Daniel
-
Why are these parameters an issue for you? Where are they getting linked from? If it's from a high-authority external site, it may make sense to 301 redirect them. If they're just low-quality sites, it's probably safe to ignore them.