.htaccess file
-
I need to redirect web pages that do not exist to a 404 error page, and the task needs to be done in the .htaccess file. I am using a Linux server.
The pages I want to redirect are my domain name followed by a question mark (i.e., any URL that carries a query string).
I am using the following snippet in my .htaccess file; so far it redirects to bing.com.
Please tell me how to change the snippet so that it redirects to a 404 error page instead.
==========================
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.bing.com? [L,R]
-
Use:
ErrorDocument 404 /404.html
where /404.html is your error document page. When a request results in a 404 status code, Apache automatically serves this page (with a local path like this, the page is served internally rather than via a redirect).
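One detail worth noting about ErrorDocument, per Apache's documented behavior (the paths and domain below are placeholders):

```apache
# A local path is served internally, and the browser still
# receives the 404 status code:
ErrorDocument 404 /404.html

# A full URL would instead make Apache issue a 302 redirect to that URL,
# so the client would see a 302 rather than a 404:
# ErrorDocument 404 http://www.example.com/404.html
```

For SEO purposes the local-path form is usually what you want, since search engines then see a genuine 404 response.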
-
The reason your URL is redirecting to bing.com is that you haven't changed the URL in the snippet you found to the URL you actually want to use as your 404 error page.
You can create a 404 error page and place it on your server. If you have already done this, you need to change the URL (http://www.bing.com) in the snippet below to the URL you want to use (e.g. http://www.[yourdomainname].com/404.html).
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.bing.com? [L,R]
That should look more like this:
RewriteCond %{QUERY_STRING} .
RewriteRule .* http://www.[yourdomainname].com/404.html [L,R=301]
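Keep in mind that a 301 redirect to /404.html serves that page with a 200 status, so search engines never see a true 404. If the goal is a real 404 response for any query-string URL, mod_rewrite can return the status code directly. A minimal sketch, assuming Apache 2.2 or later with mod_rewrite enabled and a /404.html file at the document root:

```apache
ErrorDocument 404 /404.html

RewriteEngine On
# Match any non-empty query string...
RewriteCond %{QUERY_STRING} .
# ...and answer with a real 404 status. The "-" means no substitution
# is made, and a non-3xx code in R= stops rewriting as [L] would.
RewriteRule .* - [R=404]
```

With this, the visitor sees the contents of /404.html, but the response status is 404 rather than a redirect chain ending in 200.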
-
I am not completely sure I get the question, so here are the two ways I interpreted it. The easiest would be to simply redirect /? to the 404 page, which would be:
redirect 301 /? http://www.404pagehere.com
If you are trying to do a wildcard redirect, that is a little harder:
RedirectMatch 301 /?/(.*)$ http://www.404pagehere.com
I think the easiest might be:
RewriteEngine On
RewriteRule ^foldername/* http://www.pagehere.com/ [R=301,L]
Make sure to back up .htaccess before making any changes, as I am not sure what I gave you will fix your specific issue.
Shane
-
This should help you:
http://webdesignandsuch.com/create-a-htaccess-file-to-redirect-to-a-404-html-page/