Confused About Problems Regarding Adding an SSL
-
After reading Cyrus' article (http://moz.com/blog/seo-tips-https-ssl), I am now completely confused about what adding SSL could do to our site. Bluehost, our hosting provider, says that if we get their SSL, they just add it to our site and it's up in a few hours: no problem whatsoever. If that's true, that'd be fantastic. However, if it were that simple, there wouldn't be roughly ten things you're supposed to do (according to Cyrus' article) to protect your rankings after the switch.
Can someone clarify this for me?
Thanks,
Ruben
-
Thanks Cyrus!
-
Hi Ruben,
Thanks for writing in. I'm unfamiliar with Bluehost's HTTPS service, but I assume they are taking care of the top-level issues. You'll still want to go through the checklist to make sure everything is valid and you follow SEO best practices. In short:
- Check your links
- Check your assets (images, CSS, JavaScript)
- Update your canonical tags
- Register with Google Webmaster Tools
- Update your sitemaps and robots.txt files
This covers the important stuff. As you noted, a few more tips here: http://moz.com/blog/seo-tips-https-ssl
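If a few stray http:// asset references slip through the audit, one stopgap is the upgrade-insecure-requests policy. This is just a minimal sketch, assuming an Apache server with mod_headers enabled and a host that honors .htaccess overrides; treat it as a safety net, not a substitute for fixing the links themselves:

<IfModule mod_headers.c>
    # Ask supporting browsers to fetch any remaining http:// assets
    # over https:// instead of raising mixed-content warnings.
    Header always set Content-Security-Policy "upgrade-insecure-requests"
</IfModule>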
-
Maybe this was obvious to everybody, but a 301 redirect for every single page is also a fundamental step; otherwise you are going to have broken external links, and I don't think WMT would be satisfied by just the canonical update (a sitewide redirect rule is sketched at the end of this post).
The sitemap must be updated as well.
We recently switched a website from HTTP to HTTPS, and in terms of performance there was no difference after the update, at least according to WMT and analytics.
I was kind of scared before the update, but in the end everything went more smoothly than expected; WMT took around 10 days to completely re-index the HTTPS version.
Of course, we kept finding non-HTTPS links embedded here and there in some pages for days afterward, and we had to manually edit some content to avoid SSL warnings from browsers.
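For reference, a sitewide HTTP-to-HTTPS 301 on Apache usually comes down to a short mod_rewrite block. A minimal sketch, assuming mod_rewrite is enabled and your host allows .htaccess overrides; test it on a staging copy before relying on it:

RewriteEngine On
# Send any request that did not arrive over SSL to the same
# path on the HTTPS version, with a permanent (301) redirect.
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

This keeps old external links working and tells search engines the move is permanent.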
-
I have no idea what CMS you are using, but check the server-side code that generates the links, not just the code sent to the browser.
We recently switched to SSL, and our CMS was already building internal links on pages using the protocol of the incoming HTTP request.
-
Thanks Highland!
-
Great, thanks!
-
Ruben, I had a look at your website and your URLs all have HTTP in them, so these would need to be updated across your site before you make the switch to HTTPS. Because you are using WordPress, this should be as simple as updating the site URL to https://www.kempruge.com.
The tip by @Highland about using Firebug is excellent. It will let you quickly check whether any non-HTTPS links remain - in the WordPress theme or template, for example.
Have a look at the WordPress HTTPS documentation also.
-
Hi Alex,
I'm not really sure if we use a protocol-less linking pattern or not. I don't see http:// in any of our URLs, so if that's the criterion, I'm guessing we don't? I included a screenshot of one of our URLs. Would you mind telling me if it's clear from the image whether we do or do not?
Thanks for your response. I really appreciate your time and input.
Best,
Ruben
-
One major tip I always give people: using protocol-less links for anything external is a great way to make sure your site always supports SSL without issue.
Firebug is a great way to make sure everything is loading over HTTPS. Turn it on, switch to the Net tab, and load your page. It will show every request made as part of the page, which makes spotting non-SSL requests easy.
You can turn HSTS on yourself if your provider uses Apache and supports .htaccess (sorry I can't link an article; Moz won't let me) - a rough sketch is below. If they don't, you'll have to ask your host to enable it on their end.
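For what it's worth, enabling HSTS in .htaccess looks roughly like this. A hedged sketch, assuming Apache with mod_headers enabled; the one-year max-age is just a common starting value:

<IfModule mod_headers.c>
    # Tell browsers to use HTTPS for every future request to this host.
    Header always set Strict-Transport-Security "max-age=31536000"
</IfModule>

Be careful: once a browser has seen this header, it will refuse plain HTTP for the whole max-age window, so make sure everything really works over HTTPS first.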
-
Implementing SSL should be straightforward for the most part
You need to ensure that links around your site (including canonical links) are updated to use HTTPS (so https://example.com/link as opposed to http://example.com/link, where example.com is your domain name). If you are already using a protocol-less linking pattern (//example.com/link), you don't need to update the links.
You can also configure your web server to serve HTTPS only. If your web server is Apache, you can do this with the SSLRequireSSL directive, which denies any non-SSL request with a 403 (pair it with a 301 rule if you'd rather redirect):

<Location "/">
    SSLRequireSSL
</Location>
HTTPS also adds some overhead while the browser and the server negotiate the secure connection. If your site has already been optimized for speed, it should not cause a problem, but if in doubt, revisit that process and ensure that you are getting the best possible speed for your visitors.
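One server-side mitigation for that handshake cost is SSL session caching, so returning visitors can resume a session instead of negotiating from scratch. A hedged sketch for the main Apache config (not .htaccess), assuming mod_ssl and mod_socache_shmcb are loaded; the path and cache size vary by distribution:

# Cache negotiated SSL sessions in shared memory so repeat
# connections can skip the full handshake.
SSLSessionCache        shmcb:/var/run/apache2/ssl_scache(512000)
SSLSessionCacheTimeout 300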
The article by Cyrus has a great checklist to double check everything.