Adding Something to htaccess File
-
When I did a Google search for site:kisswedding.com (my website), I noticed that Google is indexing all of the https versions of my site.
First of all, I don't get it, because I don't have an SSL certificate. Then, last night, I did what my host (Bluehost) told me to do and added the below to my .htaccess file.
Below rule because Google is indexing the https version of the site - https://my.bluehost.com/cgi/help/758

RewriteEngine On
RewriteCond %{HTTP_HOST} ^kisswedding.com$ [OR]
RewriteCond %{HTTP_HOST} ^kisswedding.com$
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^(.*)$ http://www.kisswedding.com [R=301,L]
Tonight, when I did a Google search for site:kisswedding.com, all of those https pages were being redirected to my home page - not the actual page they're supposed to redirect to.
I went back to Bluehost and they said a 301 redirect shouldn't work because I don't have an SSL certificate. But I figure that since it's sort of working, I just need to add something to that .htaccess rule to make sure each URL is redirected to the right page.
Someone in the Google Webmaster Tools forums told me to do the below, but I don't really get it:
"to 301 redirect from /~kisswedd/ to the proper root folder you can put this in the root folder .htaccess file as well:

Redirect 301 /~kisswedd/ http://www.kisswedding.com/"
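(For context: mod_alias's Redirect directive matches by URL prefix and automatically appends whatever follows the matched prefix to the target, so a line like the one quoted carries each page across rather than collapsing everything to the home page. A minimal sketch of how it behaves, using the paths from the quote:)

```apache
# Root .htaccess sketch using the suggested mod_alias directive.
# Redirect matches the /~kisswedd/ prefix and appends the remainder, e.g.
#   /~kisswedd/blog/post.html  ->  http://www.kisswedding.com/blog/post.html
Redirect 301 /~kisswedd/ http://www.kisswedding.com/
```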
Any help/advice would be HUGELY appreciated. I'm a bit at a loss.
-
Hi Susanna,
Just wanted to post a quick update on this question, since I understand the problem was resolved with the suggestions we made after looking at the content of your .htaccess file.
The major issue with the original redirect was that the $1 back-reference had been omitted from the RewriteRule, so every matching URL was sent to the bare domain (the home page). So the problem was actually not related to the secure protocol at all - just a coding error.
Example code for the secure-to-non-secure redirect you needed to implement:

# redirect away from https to http
RewriteEngine On
RewriteCond %{SERVER_PORT} ^443$ [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [R=301,L]

(Note the /$1 at the end: in a per-directory .htaccess the matched path has no leading slash, so the slash must be written into the target.)

Hope that's helpful for anyone who may be looking for help with the same issue in the future.
Sha
-
Hi Susanna,
If you have trouble getting someone to fix the .htaccess for you, feel free to private message the file and we'll be able to sort it out for you.
It is never advisable to treat rules in the .htaccess file in isolation, because the order in which they appear determines how things work, as Ryan explained. Other things can also influence whether the redirects behave the way you want, such as rules added to the file (or overwritten) by standard installations like WordPress, Joomla, etc.
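As an illustration of why order matters (a hypothetical layout, assuming a standard WordPress install - the domain is a placeholder): custom redirects generally need to sit above the WordPress block, because its catch-all rule routes requests to index.php and stops further processing with [L].

```apache
# --- custom redirects first (illustrative domain) ---
RewriteEngine On
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# --- standard WordPress block, left untouched (abridged) ---
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
```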
Hope your provider can get it sorted out for you. If not, just let us know and we'll be happy to help.
Sha
-
Thanks, Ryan. I appreciate that. I had no idea that copying and pasting could cause so many problems. I'll see if I can speak with a supervisor.
Have a great night/day!
-
Hi Susanna,
I will share two items regarding your situation. First, the modifications to your .htaccess file rely on a pattern-matching language called regular expressions (regex), which Apache's rewrite rules use. The code you are copying and pasting contains regex patterns. Basically, the rules say, "when a URL meets this condition, rewrite the URL as follows...". In your case, the original rule was not written correctly (a common occurrence), so you did not get the desired effect.
The changes need to be made in your .htaccess file, which controls access to your entire website. A single character out of place can make your whole site unavailable, redirect URLs improperly, or open security holes. You can copy and paste code into the file and it may work; on the other hand, damage can just as easily be done. It also matters where you place the code within the file: varying the location alters the logic and can lead to different results.
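To make the "single character" point concrete, here is the anatomy of a typical rule (example.com is a placeholder, not taken from any real file):

```apache
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]    # condition: bare domain was requested
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
#            ^^^^ captures the requested path; $1 re-inserts it into the target.
# Omitting the single token $1 sends every matching URL to the home page --
# exactly the symptom described earlier in this thread.
```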
Based on the above, my recommendation is that your host make the changes. If your host is unwilling to help, ask to be assigned to another tech or a supervisor; most hosts are very helpful in this area. Failing that, perhaps your web developer can make the change.
If you decide to make the change yourself, I recommend doing some online research into how the .htaccess file works first. You do not want to fly blind in this file.
Suggested reading: http://net.tutsplus.com/tutorials/other/the-ultimate-guide-to-htaccess-files/
Related Questions
-
Losing referrer data on http link that redirects to an https site when on an https site. Is this typical or is something else going on here?
I am trying to resolve a referral data issue. Our client noticed that referrals from one of their sites to another had dropped to almost nothing after being their top referrer. The referring site, SiteA, is an HTTPS site and held a link to SiteB, which is also an HTTPS site, so there should be no loss; however, the link to SiteB on SiteA used the HTTP protocol. When we changed the link to the HTTPS protocol, the referrals started flowing in. Is this typical? If the 301 redirect is properly in place for SiteB, why would we lose the referral data?
Reporting & Analytics | | Velir0 -
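(For anyone hitting the same thing: browsers deliberately drop the Referer header when navigating from an HTTPS page to a plain HTTP URL, so the HTTP hop loses the referrer before the 301 to HTTPS ever fires; linking directly with https avoids the downgrade. If SiteB's force-HTTPS redirect were missing, a hedged sketch of a standard rule - domain is a placeholder - would be:)

```apache
# Force HTTPS with a single 301 (illustrative domain).
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.siteb-example.com/$1 [R=301,L]
```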
Transitioning to HTTPS, Do i have to submit another disavow file?
Hi Mozzers! So we're finally ready to move from http to https. Penguin is finally showing us some love thanks to the recent algorithm updates. I just added the https version to Google Search Console, and before 301 redirecting the whole site to a secure environment... do I upload the same disavow file to the https version? Moreover, is it best to have both the http and https versions in Google Search Console? Or should I add the disavow file to the https version and delete the http version in a month? And what about Bing? Help.
Reporting & Analytics | | Shawn1240 -
How to FILTER in Google Analytics an ad campaign from linkedin?
Hi mozzers, We are setting up a LinkedIn ad campaign for our agency and want to track its traffic and conversions. The LinkedIn ad will carry UTMs for each link. To track this campaign accurately, I thought about creating a new GA view with a specific filter. So my question is about the filtering: should I use INCLUDE, REFERRAL with pattern LINKEDIN.COM (see image)? If not, what would be the best way to track this campaign? My other concern is that we are also running another job ad on LinkedIn, and I feel those hits will be tracked as well. Is there a way to separate the two campaigns? Thanks guys!
Reporting & Analytics | | Ideas-Money-Art0 -
Sudden Increase In Number of Pages Indexed By Google Webmaster When No New Pages Added
Greetings Moz Community: On June 14th Google Webmaster Tools indicated an increase in the number of indexed pages, going from 676 to 851 pages, yet no new pages had been added to the domain in the previous month. The number of pages blocked by robots increased at that time from 332 (June 1st) to 551 (June 22nd), yet the number of indexed pages still increased to 851. The following changes occurred between June 5th and June 15th: - A new redesigned version of the site was launched on June 4th, with some links to social media and the blog removed on some pages, but with no new URLs added. The design platform was and is WordPress. - Google GTM code was added to the site. - An exception was made by our hosting company to ModSecurity on our server (for iframes) to allow GTM to function. In the last ten days my web traffic has declined about 15%; however, the quality of traffic has declined enormously and the number of new inquiries we get is off by around 65%. Pages per visit have declined from about 2.55 to about 2. Obviously this is not a good situation. My SEO provider, a reputable firm endorsed by Moz, believes the extra 175 pages indexed by Google - pages that do not offer much content - may be causing the ranking decline. My developer is examining the issue. They think there may be some tie-in with the installation of GTM. They are noticing an additional issue: the site's Contact Us form will not work if the GTM script is enabled. They find it curious that both issues occurred around the same time. Our domain is www.nyc-officespace-leader. Does anyone have any idea why these extra pages are appearing and how they can be removed? Anyone have experience with GTM causing issues like this? Thanks everyone!!!
Reporting & Analytics | | Kingalan1 -
Organic Traffic Drop and Rank Increase After Video Thumbnail added
So my company has created a large amount of videos and I took a couple of them and created a test video sitemap to see what effect adding a video thumbnail/rich snippet to the SERPs would be. It worked on one page and gained 2 spots (position 6 to 4) for the highest keyword, but traffic didn't increase too much. Then a week later I tested it with the page that we get the most organic traffic for, which is ranking for a very big keyword. It worked and gained a bit of ranking, but traffic decreased 50% ever since according to Google Analytics. It seems to me that the traffic from users clicking on the video thumbnail is not be tracked as google / organic even though it lands them on the intended page/doesn't redirect anywhere else. I've looked to see if traffic to this page increased overall to see if it was being tracked via a referral or as something else, but couldn't find any traffic discrepancy. The only thing I did find is that impressions under SEO > Landing Pages > Video Property increased, but this could be from the page ranking in the Video SERPs now. Has anyone experienced a similar situation? Do you think having a video snippet could be that big of a turn off for customers that people just aren't clicking like they used to? I don't think so, that's why I'm leaning towards a tracking discrepancy in Google Analytics.
Reporting & Analytics | | OfficeFurn0 -
Robots.txt file issue.
Hi, this is my third thread here and I have created many like it on many webmaster communities, so I badly need help. Robots.txt blocked 2k important URLs of my blogging site http://Muslim-academy.com/, especially my blog area, which was bringing a good number of visitors daily. My organic traffic declined from 1k daily to 350. I have removed the robots.txt file, resubmitted the existing sitemap, and used all Fetch-to-index options and the 50-URL submission option in Bing Webmaster Tools. What can I do now to get these blocked URLs back in the Google index? 1. Create a NEW sitemap and submit it again in Google Webmaster and Bing Webmaster Tools? 2. Bookmark, link build, or share the URLs? I did a lot of bookmarking for the blocked URLs. I fetched the list of blocked URLs using Bing Webmaster Tools.
Reporting & Analytics | | csfarnsworth0 -
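(For reference, the difference between a robots.txt that blocks a blog area and one that allows everything is small - the paths below are illustrative, not taken from the site:)

```text
# Blocking the blog area (what the broken file may have done):
User-agent: *
Disallow: /blog/

# Allowing everything (an empty Disallow blocks nothing):
User-agent: *
Disallow:
```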
How to get a list of robots.txt file
This is my site: http://muslim-academy.com/. It's in WordPress. I just want to know: is there any way I can get the list of URLs blocked by robots.txt? In Google Webmaster it's not showing up; it just gives the number of blocked URLs. Is there any plugin or software to extract the list of blocked URLs?
Reporting & Analytics | | csfarnsworth0