Confused About Problems Regarding Adding an SSL
-
After reading Cyrus' article (http://moz.com/blog/seo-tips-https-ssl), I am now completely confused about what adding SSL could do to our site. Bluehost, our hosting provider, says that if we get their SSL, they just add it to our site and it's up in a few hours: no problem whatsoever. If that's true, that'd be fantastic... however, if it really were that simple, there wouldn't be a list of ten or so things you're supposed to do (according to Cyrus' article) to protect your rankings after the switch.
Can someone clarify this for me?
Thanks,
Ruben
-
Thanks Cyrus!
-
Hi Ruben,
Thanks for writing in. I'm unfamiliar with Bluehost's HTTPS service, but I assume they are taking care of the top-level issues. You'll still want to go through the checklist to make sure everything is valid and you follow SEO best practices. In short:
- Check your links
- Check your assets (images, CSS, JavaScript)
- Update your canonical tags
- Register the HTTPS version with Google Webmaster Tools
- Update your sitemaps and robots.txt files
This covers the important stuff. As you noted, a few more tips here: http://moz.com/blog/seo-tips-https-ssl
-
Maybe it was obvious to everybody, but a 301 redirect for every single page is also a fundamental step; otherwise you are going to have broken external links, not to mention WMT, which I don't think would be satisfied by just the canonical update.
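For reference, a page-for-page 301 from HTTP to HTTPS can be sketched in an Apache .htaccess file like this (a minimal sketch assuming mod_rewrite is enabled; your host's setup may differ):

```apacheconf
# Redirect every plain-HTTP request to the same URL on HTTPS with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Because it reuses %{HTTP_HOST} and %{REQUEST_URI}, every old URL maps to its exact HTTPS counterpart, so external links keep working.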
Sitemap must be updated as well.
We recently switched a website from HTTP to HTTPS and, in terms of performance, there was no difference after the update, at least according to WMT and analytics.
I was kind of scared before the update, but in the end everything was smoother than expected; WMT took around 10 days to completely re-index the HTTPS version.
Of course, we kept finding non-HTTPS links embedded here and there in some pages for days, and we had to manually edit some content to avoid SSL warnings from browsers.
-
I have no idea which CMS you are using, but check the server-side code generating the links, not just the code sent to the browser.
We recently switched to SSL, and it turned out our CMS was already building internal links using the protocol of the incoming HTTP request.
-
Thanks Highland!
-
Great, thanks!
-
Ruben, I had a look at your website and your URLs all have HTTP in them so these would need to be updated all across your site before you make the switch to HTTPS. Because you are using WordPress this should be as simple as updating the site URL to https://www.kempruge.com.
The tip by @Highland about using Firebug is excellent. This will allow you to quickly debug if there are non-HTTPS links remaining - in the WordPress theme or template, for example.
Have a look at the WordPress HTTPS documentation also.
-
Hi Alex,
I'm not really sure if we use a protocol-less linking pattern or not. I don't see http:// in any of our URLs, so if that's the criterion, I'm guessing we don't? I included a screenshot of one of our URLs. Would you mind telling me if it's clear from the image whether we do or do not?
Thanks for your response. I really appreciate your time and input.
Best,
Ruben
-
One major tip I always point people to is that using protocol-less links for anything external is a great way to make sure your site always supports SSL without issue.
Firebug is a great way to make sure everything is loading HTTPS. Turn it on, switch to the Net tab, and load your page. It will show you every request sent as part of your page. It makes spotting non-SSL requests easy.
You can turn HSTS on yourself if your provider uses Apache and supports .htaccess. (Sorry I can't link an article; Moz won't let me.) If they don't, you will have to ask your host to enable it on their end.
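If you do have .htaccess access, a minimal HSTS header might look like the sketch below (assuming Apache with mod_headers enabled; the header should only be sent on HTTPS responses, and since browsers remember it, start with a short max-age while testing):

```apacheconf
# Tell browsers to use HTTPS only for this host for the next year
<IfModule mod_headers.c>
    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
</IfModule>
```

Drop includeSubDomains if any subdomain still needs to work over plain HTTP.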
-
Implementing SSL should be straightforward for the most part
You need to ensure that links around your site (including canonical links) are updated to use HTTPS (so https://example.com/link as opposed to http://example.com/link where example.com is your domain name). If you are already using a protocol-less linking pattern (//example.com/link) you don't need to update the links.
You can also configure your web server to only serve HTTPS. If your web server is Apache you can do this with the SSLRequireSSL directive, for example:
<Location "/">
    SSLRequireSSL
</Location>
Note that SSLRequireSSL returns a 403 Forbidden for plain-HTTP requests, so for visitors (and link equity) a 301 redirect to HTTPS is usually the friendlier option.
HTTPS also adds some overhead while the browser and the server negotiate a secure connection. If your site has already been optimized for speed it should not cause a problem, but if in doubt revisit that process and ensure that you are getting the best possible speed for your visitors.
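On the performance point, the handshake cost can be reduced on the server side. A sketch for Apache 2.4 server config, assuming mod_ssl is in use and mod_http2 is available (paths and cache sizes are illustrative; directive support varies by version and host):

```apacheconf
# Reuse TLS sessions so returning visitors can skip the full handshake
SSLSessionCache shmcb:/var/run/apache2/ssl_scache(512000)
SSLSessionCacheTimeout 300

# Staple OCSP responses so browsers don't make a separate revocation lookup
SSLUseStapling On
SSLStaplingCache shmcb:/var/run/apache2/ocsp(128000)

# Prefer HTTP/2 over TLS where clients support it
Protocols h2 http/1.1
```

These belong in the main server or virtual-host configuration rather than .htaccess, so on shared hosting you may need to ask your provider.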
The article by Cyrus has a great checklist to double check everything.