Pros & cons: HTTP vs HTTPS
-
Hi there,
We are planning to take the step from HTTP to HTTPS. The main reason is to appear more trustworthy to our clients, and of course there are the rumours that it will be better for ranking (in the future).
We have a large e-commerce site. Part of it is already HTTPS.
I've read a lot about the pros and cons, including this Moz article: http://moz.com/blog/seo-tips-https-ssl
But I'd like to hear from others who have already done this. What did you encounter when changing to HTTPS? Did you have ranking drops, loss of links, etc.? I want to make a list of pros and cons and things we have to do in advance.
Thanks, Leonie
-
We don't use Comscore. Analytics transparently kept tracking everything without any change. We don't use Tag Manager URL-matching tracking, but unless you have defined rules that include the URL protocol, it shouldn't need any attention either.
-
Hi, did you encounter problems with other tools, like Google Analytics, Tag Manager, or Comscore?
Thanks, Leonie
-
We have expensive certificates now for the paid section; I think we'll use the same.
I'll ask whether the server supports SNI, not sure about that. Thanks!
-
In case you choose the most expensive EV certificates, as we did: for whatever is not directly visible, like the CDN serving JS, CSS, and images, you can just use cheap $8/€8 certificates.
One thing I forgot: if your server supports SNI, don't use it.
We did initially, but soon found out that some price engines could not read our feeds, the Moz crawler could not crawl, and everyone on XP + IE was left out. So we disabled it.
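If you're unsure whether SNI is in play, you can compare the certificate a server presents with and without SNI; if the two differ, SNI-less clients (old crawlers, XP + IE) are getting the wrong one. A rough sketch using Python's standard library (the hostname in the usage comment is a placeholder):

```python
import socket
import ssl

def peer_cert_der(host, port=443, use_sni=True):
    """Return the raw (DER) certificate a server presents.

    With use_sni=False no server_hostname is sent, mimicking old
    clients such as IE on Windows XP that cannot do SNI.
    """
    ctx = ssl.create_default_context()
    # Verification is disabled on purpose: we want to inspect whatever
    # certificate comes back, even if it's for the wrong hostname.
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, port), timeout=10) as sock:
        name = host if use_sni else None
        with ctx.wrap_socket(sock, server_hostname=name) as tls:
            return tls.getpeercert(binary_form=True)

# Hypothetical usage: if the two DER blobs differ, the server relies on
# SNI to pick the certificate and SNI-less visitors get a mismatch.
# same = peer_cert_der("www.example.com") == peer_cert_der("www.example.com", use_sni=False)
```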
-
Hi Max, thanks, and good to read that you didn't lose rankings. That's my concern, and also the backlinks, although you could say that with a redirect all the external links I can't control will redirect to HTTPS.
We have two different SSL certificates now; we're looking into what we need and whether we have the right ones.
When I've finished the plan and the list, I think I'll publish it here.
Grtz, Leonie
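Since all the external links you can't control depend on that redirect working, it's worth scripting a check that every old http:// URL answers with a single 301 pointing at its https:// twin. A minimal sketch with Python's standard library (the URL in the usage comment is a placeholder):

```python
import urllib.request
import urllib.error

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the raw status is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_https_redirect(http_url, timeout=10):
    """Return (status, location) for a URL without following redirects.

    A correct migration answers 301 with a Location header pointing at
    the https:// twin of the same path, so link equity is passed on.
    """
    opener = urllib.request.build_opener(_NoRedirect)
    try:
        resp = opener.open(http_url, timeout=timeout)
        return resp.status, None  # no redirect happened at all
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")

# Hypothetical usage:
# status, location = check_https_redirect("http://www.example.com/some-page")
# expect status == 301 and location == "https://www.example.com/some-page"
```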
-
I did it a month and a half ago for a couple of websites.
The transition was smooth. I had to buy more SSL certificates than I expected for the many domains serving JS, CSS, and so on, but it was not a big hassle.
Just after moving from HTTP to HTTPS I didn't notice any ranking change, and for a good level of accuracy I monitor the same keywords with Moz Rank Tracker, ProRankTracker, and SEMrush.
Google recognizes the move slowly, a few URLs at a time; each day you'll notice some Google SERPs start serving HTTPS URLs in place of the HTTP ones.
After a month we had a big jump in rankings: around 30% more keywords in the top 100, and a general increase in ranking for all the keywords already in the top 10, top 30, and top 50.
But I have no idea if that's connected with the shift to HTTPS, since we also constantly do many other things: get backlinks, improve on-page, etc.
At the least, it didn't seem to penalize the websites.
-
Hi Pixelbypixel, thanks for your reply.
Right now I'm making a plan for the switch. I'm not in a rush, so I really want to have everything clear before we go, or maybe decide not to.
I don't think most of our clients know what's secure and what isn't, but we want the opportunity to communicate about this with our clients, something we don't have right now (only when they order something).
The ranking factor, from what I've read, is not a big thing at the moment, but indeed it may become a bigger one in the future, so that's also a good reason to go.
Thanks for the linked articles!
Grtz, Leonie
-
I'm going to give my opinion more than a list of pros and cons. Most people who switch over tend to see a drop in traffic, and if you don't ensure you get it all right it can be a nightmare, so make sure you've got your plan ready.
Are you sure most "clients" know what HTTPS is? Most people outside our world have no idea what it is. Combine that with the fact that the so-called ranking boost has yet to be well documented, and you can be fairly certain it's tiny.
Now, it's possible that your clients know what it is, will see it, and go to your site, but in most cases I suspect that, like the ranking boost, other factors play a bigger role. My advice is to really make sure you have all the bases covered for your transfer. I also wanted to point out that in the future it may become a bigger factor.
As for advice from people who have already done it, there's oodles of info here on Moz; here are a few threads:
http://moz.com/community/q/http-to-https-transition-large-drop-in-search-traffic
http://moz.com/community/q/https-sitewide-move-has-resulted-in-huge-rankings-drop
http://moz.com/community/q/authority-site-drastic-ranking-drop-after-google-https-switch-please-help
Obviously people tend to come here with problems more than a shout-out about how great it is, so don't take that as a massive negative. All of the above is my opinion, and I'm sure others will offer different ones. I don't want you to be put off, just aware that there is a lot to cover in a switch-over.
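One base worth covering in that plan is mixed content: hard-coded http:// references to images, scripts, and stylesheets will break the padlock once pages are served over HTTPS. A crude first pass, assuming you just want to scan saved HTML (a real audit would render pages in a browser; the sample markup is made up):

```python
import re

# Catch http:// URLs in src attributes (img, script, iframe, ...).
# <link rel="stylesheet" href="http://..."> needs the same treatment;
# plain <a href="http://..."> links are fine and are deliberately skipped.
MIXED_RE = re.compile(r'\bsrc\s*=\s*["\'](http://[^"\']+)["\']', re.IGNORECASE)

def find_mixed_content(html):
    """Return hard-coded http:// subresource URLs found in an HTML string."""
    return MIXED_RE.findall(html)

sample = (
    '<img src="http://cdn.example.com/logo.png">'
    '<script src="https://cdn.example.com/app.js"></script>'
)
print(find_mixed_content(sample))  # ['http://cdn.example.com/logo.png']
```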