404-like content
-
A site that I look after is showing lots of soft 404 responses for pages that are not 404s at all but unique content pages.
The following page is an example:
http://www.professionalindemnitynow.com/medical-malpractice-insurance-clinics
This page returns a 200 response code and has unique content, but is not getting indexed. Any ideas?
To add further information that may well impact your answer, let me explain how this "classic ASP" website performs the SEO-friendly URL mapping:
All pages within the custom CMS have a unique ID, which is referenced with an ?intID=xx parameter.
The custom 404.asp file receives a request, looks up the ID to find matching content in the CMS, and then Server.Transfers the visitor to the correct page.
Like I said, the response codes are set up correctly, as far as Firebug can tell me.
Any thoughts would be most appreciated.
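For anyone unfamiliar with this rewriting pattern, here is a minimal sketch of the 404-handler flow described above. It is written in Python purely for illustration (the real site is classic ASP), and the slug-to-ID table and function name are hypothetical stand-ins for the CMS lookup:

```python
# Hypothetical slug -> intID mapping, standing in for the CMS database lookup.
CMS_PAGES = {
    "medical-malpractice-insurance-clinics": 42,
}

def handle_request(slug):
    """Mimic the custom 404.asp: look up the slug and serve the matching
    CMS content with a 200, or fall through to a genuine 404."""
    int_id = CMS_PAGES.get(slug)
    if int_id is not None:
        # Equivalent of Server.Transfer to the page addressed by ?intID=xx
        return 200, f"content for intID={int_id}"
    # The crucial part for avoiding soft 404s: unknown slugs must return
    # a real 404 status, not a 200 with an error message in the body.
    return 404, "Error - The page you have tried to access cannot be found"
```

The key design point is the last branch: when the lookup fails, the handler must send a real 404 status rather than rendering an error message under a 200.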
-
Scott, if you fix the problem by using the global.asax file, remember to make sure that the 404 page does then return a 404.
-
I think Google detects a soft 404 like this:
http://www.professionalindemnitynow.com/gobblygook should return a 404 but returns a 200, so they now know that your site is prone to soft 404s.
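That probe can be sketched in a few lines. This is an illustrative Python version (not Google's actual implementation); the fetch function is injected so the check itself stays self-contained:

```python
import random
import string

def probe_for_soft_404(fetch, base_url):
    """Request a path that cannot possibly exist; if the site answers
    with a 200, it is prone to soft 404s. `fetch` takes a URL and
    returns an HTTP status code (injected so this sketch runs offline)."""
    gibberish = "".join(random.choices(string.ascii_lowercase, k=16))
    return fetch(f"{base_url}/{gibberish}") == 200
```

In practice you would pass a real HTTP client as `fetch`; a site behaving like the one described above would return True here.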
How they then decide which pages on the site are soft 404s and which are not is less clear. From reading, my best understanding is that they look for similarities to the known soft 404, such as response timings and other criteria.
-
This is the point: it should return a 404 but instead returns a 200, and this is what is called a soft 404.
See my other comment on how to fix this.
-
The page we are discussing is not listed in the image you shared.
I checked one link which is listed: http://www.professionalindemnitynow.com/business-consultants-quote
The top of the page says "Error - The page you have tried to access cannot be found"
While the page returns a 200 header code, Google is likely seeing that page-header text and recognizing it as a "404-like" page, as they describe.
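A rough sketch of that kind of check, in Python, with a hypothetical phrase list (Google's actual classifier is not public and certainly uses more signals):

```python
# Phrases that make a page read like an error page; this list is illustrative.
ERROR_PHRASES = (
    "the page you have tried to access cannot be found",
    "page not found",
)

def looks_like_soft_404(status_code, body):
    """Flag pages that return a 200 status but read like an error page."""
    if status_code != 200:
        return False  # a real 404 is not a *soft* 404
    text = body.lower()
    return any(phrase in text for phrase in ERROR_PHRASES)
```

The point is that the body text can override the status code: a 200 response whose content says "cannot be found" gets treated as an error page.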
-
You could try using either the global.asax file or an HTTP module to do the rewriting; global.asax would be the easiest.
From memory, the BeginRequest event would be the one to use.
The thing is, you need to do the rewriting earlier in the request pipeline.
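The same "rewrite early, before routing" idea can be illustrated with a small piece of Python WSGI middleware (the ASP.NET version would live in Application_BeginRequest in global.asax; the slug mapping and names here are hypothetical):

```python
# Hypothetical slug -> intID mapping, standing in for the CMS lookup.
SLUG_TO_ID = {"medical-malpractice-insurance-clinics": 42}

class RewriteMiddleware:
    """Rewrite friendly URLs to their ?intID=xx form before the app
    routes the request, rather than catching them in the 404 handler."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        slug = environ.get("PATH_INFO", "").strip("/")
        int_id = SLUG_TO_ID.get(slug)
        if int_id is not None:
            # Known friendly URL: rewrite to the internal page address.
            environ["PATH_INFO"] = "/page.asp"
            environ["QUERY_STRING"] = f"intID={int_id}"
        # Unknown slugs fall through untouched, so the app's normal
        # 404 handling applies and returns a genuine 404 status.
        return self.app(environ, start_response)
```

Because the rewrite happens before the application sees the request, unmatched URLs never touch the 404 page at all, so that page can safely return a real 404 status.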
-
Thanks Yannick. Completely agree that the content of the page uses the keywords too frequently. This is the site owner claiming to "understand" SEO! I will advise him that he needs to calm down the keyword stuffing.
I'm going to add the page, and other similar landing pages that are used for AdWords, to the public sitemap.
-
The reason I refer to it as a soft 404, is the listing within webmaster tools. See attached image for more examples.
You're right - it is not on the sitemap, which I need to address, but I still don't see why Google detects this as a 404 when it clearly returns a 200.
Thanks for your response.
-
Hi Scott.
I am confused why you refer to the link you shared as a soft 404: http://www.professionalindemnitynow.com/medical-malpractice-insurance-clinics. The page title is "Medical Malpractice Insurance for Clinics", which is a perfect match for the URL. The page returns a 200 response header code. By all counts this appears to be the proper page that should be returned, and not a 404 in any way.
If you have a 404 error log file which shows this page as a 404 error, that issue is completely internal to your site. From the perspective of Google and the rest of the world your site is working perfectly. If the only place the page shows as a 404 is your log file, you want to check with a developer to determine exactly what is triggering the file entry.
With respect to indexing, I support Yannick's findings.
-
I'd say: the URL isn't accessible via the menu; I can't find it anywhere. I tried looking under http://www.professionalindemnitynow.com/Medical-malpractice-insurance but couldn't find a link to the page. Is the page only located in your sitemap? That might be why it isn't indexed. Link to it (more!)
The other thing is, of course, the high keyword density/spammy usage of the keywords you are targeting.