What if a 404 error is not possible?
-
Hi Everyone,
I get a 404 error on my site if the URL is simply wrong, but for certain parameters, such as when a page has been deleted or has expired, I get an error page saying that the ID is wrong, yet no 404 status.
It is very difficult for me to write a PHP function that solves the problem and to modify the .htaccess with mod_rewrite. I have asked the developer of the system to take a look, but I am not sure I will get an answer soon.
I can control the content of the deleted/expired page, but its URL will be very similar to the URLs that are fine (in fact, the URL may have been fine before and has simply expired since).
Thinking about solutions: I could set the expired/deleted pages to noindex. Would that help avoid duplicate title/description/content problems? If a user goes to e.g. mywebsite.com/1-article/details.html, I can set the head section to noindex if the listing has expired. Would that be good enough?
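Something along these lines in the page template is what I have in mind (just a sketch; $isExpired is a placeholder for however the CMS reports that a listing is gone):

<?php
// Placeholder flag: replace with however the CMS exposes the listing status.
$isExpired = true;
?>
<head>
  <title>This listing is no longer available</title>
  <?php if ($isExpired): ?>
  <!-- ask search engines not to index the expired page -->
  <meta name="robots" content="noindex, follow">
  <?php endif; ?>
</head>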
Another question: is it possible to return these pages as 404s without doing it directly in the .htaccess, so that I avoid the mod_rewrite problems I am having? Some magical tag in the head section of the page?
Many thanks in advance for your help,
Best Regards,
Daniel
-
The pages should not show up at all once they are de-indexed.
-
Hi Takeshi, thanks for the answer again.
Would it prevent the deleted/expired pages from being shown as soft 404s in the Webmaster Tools?
-
Ok, sounds like a noindex,follow in the header is the best solution then. That will keep the no-longer-existent pages from being indexed while still preserving any link juice the pages may have acquired.
-
Hi Again,
@Takeshi Young: Thanks for your answer.
I will try to explain what is happening a little better.
We are using a CMS for classified ads. The script can generate "SEO friendly" URLs, which are based on mod_rewrite. If a listing has an ID number, let's say 5, that listing's URL will look like this:
http://mydomain.com/5-listingname/details.html
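(I don't know the exact rule the CMS uses, but I assume the rewrite in the .htaccess has roughly this shape, with index.php and listing_id as placeholder names:)

RewriteEngine On
# Map /5-listingname/details.html to the script, passing 5 as the listing ID
RewriteRule ^([0-9]+)-[^/]+/details\.html$ index.php?listing_id=$1 [L,QSA]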
After the listing expires, the URL is no longer valid, and if a user tries to visit the listing, the script delivers a page with a message saying that the listing is no longer active. The HTTP code is 200 "OK". If the listing is deleted, a user trying to visit the URL gets a similar message, also with HTTP code 200. This is a problem, because those pages should return a 404 code to tell search engines that the page is gone.
If a user tries to visit an invalid page, for example:
http://mydomain.com/invalidpage.html
then the system delivers the 404 page that is set in the .htaccess file. But because the script recognises the numeric parameter in the deleted/inactive listing URL, it does not deliver the 404 error; instead it delivers a page with a message, and that page is a soft 404 error, which is bad for SEO.
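(A quick way to confirm what status a URL actually returns, sketched in PHP; any HTTP client would do:)

<?php
// Print the status line the server sends back for a given URL.
$headers = get_headers('http://mydomain.com/5-listingname/details.html');
echo $headers[0]; // prints "HTTP/1.1 200 OK" here, even though the page says the listing is gone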
Repairing the script so that it delivers the proper 404 header is beyond my knowledge, but I can customize the error page as much as I want.
Then I have two questions:
-
If I set the soft 404 error page to noindex, will that be enough to keep the problem from affecting us?
-
Is there any way to tell a search engine that a page is a 404, other than via the Apache .htaccess? A tag in the head section, or any trick that would help me with this problem?
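(From what I have read there is no such tag, because the status code is an HTTP header and not part of the HTML, but if the error page I can customize is a PHP template, the script itself can apparently send the 404 before any output. Something like this, though I have not tested it in our CMS:)

<?php
// Must run before any HTML is echoed, otherwise the headers are already sent.
http_response_code(404);             // PHP 5.4 and newer
// header('HTTP/1.1 404 Not Found'); // equivalent on older PHP versions
?>
<html>
<head><title>Listing not found</title></head>
<body>This listing has been deleted or has expired.</body>
</html>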
Thanks in advance for your help,
Daniel
-
Why are these parameters an issue for you? Where are they getting linked from? If they're linked from a high-authority external site, it may make sense to 301 redirect them. If they're just low-quality sites, it's probably safe to ignore them.
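For example, a one-off 301 in .htaccess can be as simple as this (both URLs are placeholders):

# Permanently redirect one expired listing URL to a relevant live page
Redirect 301 /5-listingname/details.html http://mydomain.com/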