404-like content
-
A site that I look after is showing lots of soft 404 errors for pages that are not 404s at all, but unique content pages.
The following page is an example:
http://www.professionalindemnitynow.com/medical-malpractice-insurance-clinics
This page returns a 200 response code, has unique content, but is not getting indexed. Any ideas?
To add further information that may well impact your answer, let me explain how this "classic ASP" website performs the SEO-friendly URL mapping:
All pages within the custom CMS have a unique ID, which is referenced with an ?intID=xx parameter.
The custom 404.asp file receives a request, looks up the ID to find matching content in the CMS, and then Server.Transfers the visitor to the correct page.
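The mapping described above might look roughly like this. This is only a sketch; the helper function LookupPageIdByPath and the page.asp filename are assumptions, not the site's actual code:

```vbscript
<%
' 404.asp - hypothetical sketch of the CMS's friendly-URL mapping.
' IIS passes the originally requested URL to the custom 404 handler in
' the querystring, e.g. "404;http://example.com/medical-malpractice-insurance-clinics"
Dim rawQuery, friendlyPath, pageId
rawQuery = Request.ServerVariables("QUERY_STRING")
friendlyPath = Mid(rawQuery, InStr(rawQuery, ";") + 1)

' Look up the friendly path in the CMS to find the matching page ID
pageId = LookupPageIdByPath(friendlyPath)   ' assumed CMS helper function

If pageId > 0 Then
    ' Found a match: serve the real content page. Server.Transfer keeps
    ' the original URL in the browser and sends the page with a 200.
    Server.Transfer("page.asp?intID=" & pageId)
Else
    ' No match: this really is a missing page, so say so explicitly.
    Response.Status = "404 Not Found"
End If
%>
```

The risk with this pattern is the Else branch: if it is missing, every unknown URL on the site comes back as a 200, which is exactly the soft-404 signature.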
Like I said, the response codes are set up correctly, as far as Firebug can tell me.
Any thoughts would be most appreciated.
-
Scott, if you fix the problem by using the Global.asax file, remember to make sure that the 404 page does then return a 404.
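In classic ASP, the crucial detail is setting the status explicitly before the error page renders; otherwise a custom "not found" template goes out as a 200. A minimal sketch:

```vbscript
<%
' At the top of the "page not found" template, before any content is
' written, force the correct status code so crawlers see a real 404.
Response.Status = "404 Not Found"
%>
```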
-
I think Google detects a soft 404 like this:
http://www.professionalindemnitynow.com/gobblygook should return a 404, but returns a 200, so they now know that your site is prone to soft 404s.
How they then decide which pages on the site are and are not soft 404s is less clear. From reading, my best understanding is that they look for similarities to the known soft 404, such as response timings and other criteria.
-
This is the point: it should return a 404, but instead returns a 200. This is what is called a soft 404.
See my other comment on how to fix this.
-
The page we are discussing is not listed in the image you shared.
I checked one link which is listed: http://www.professionalindemnitynow.com/business-consultants-quote
The top of the page says "Error - The page you have tried to access cannot be found"
While the page returns a 200 header code, Google is likely seeing the page header text and recognizing it as a "404-like" page, as they describe.
-
You could try using either the Global.asax file or an HTTP module to do the rewriting; Global.asax would be the easiest.
From memory, the Application_BeginRequest event would be the one to use.
The thing is, you need to do the rewriting earlier in the request pipeline.
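For an ASP.NET site, the Global.asax approach described here might look something like the sketch below. This is an illustration only: the LookupPageIdByPath helper and page.aspx name are assumptions, and it would only apply once the site's pipeline runs through ASP.NET rather than pure classic ASP.

```vb
' Global.asax - rewrite friendly URLs early in the request pipeline,
' instead of routing every friendly URL through the 404 handler.
Sub Application_BeginRequest(ByVal sender As Object, ByVal e As EventArgs)
    Dim path As String = Request.Url.AbsolutePath
    Dim pageId As Integer = LookupPageIdByPath(path) ' assumed CMS helper

    If pageId > 0 Then
        ' Serve the real content for this friendly URL with a 200.
        Context.RewritePath("/page.aspx?intID=" & pageId)
    End If
    ' Unmatched URLs fall through untouched and get a genuine 404.
End Sub
```

The advantage over the 404.asp trick is that content pages never pass through the error handler at all, so the 404 handler can be left to return an honest 404 status.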
-
Thanks Yannick. Completely agree that the content of the page uses the keywords too frequently. This is the site owner claiming to "understand" SEO! I will advise him that he needs to calm down the keyword stuffing.
I'm going to add the page, and other similar landing pages that are used for AdWords, to the public sitemap.
-
The reason I refer to it as a soft 404, is the listing within webmaster tools. See attached image for more examples.
You're right - it is not on the sitemap, which I need to address, but I still don't see why Google detects this as a 404 when it clearly returns a 200.
Thanks for your response.
-
Hi Scott.
I am confused why you refer to the link you shared as a soft 404: http://www.professionalindemnitynow.com/medical-malpractice-insurance-clinics. The page title is "Medical Malpractice Insurance for Clinics", which is a perfect match for the URL. The page returns a 200 response header code. By all counts this appears to be the proper page which should be returned, and not a 404 in any way.
If you have a 404 error log file which shows this page as a 404 error, that issue is completely internal to your site. From the perspective of Google and the rest of the world your site is working perfectly. If the only place the page shows as a 404 is your log file, you want to check with a developer to determine exactly what is triggering the file entry.
With respect to indexing, I support Yannick's findings.
-
I'd say: the URL isn't accessible via the menu? I can't find it anywhere. I tried looking under http://www.professionalindemnitynow.com/Medical-malpractice-insurance but couldn't find a link to the page. Is the page only located in your sitemap? That might be why it isn't indexed. Link to it (more!)
The other thing, of course, is high keyword density/spammy usage of the keywords you are targeting.