Confused About Problems Regarding Adding an SSL
-
After reading Cyrus' article (http://moz.com/blog/seo-tips-https-ssl), I am now completely confused about what adding SSL could do to our site. Bluehost, our hosting provider, says that if we get their SSL, they just add it to our site and it's up in a few hours: no problem whatsoever. If that's true, that'd be fantastic... but if it were that simple, there wouldn't need to be the 10 or so things you're supposed to do (according to Cyrus' article) to protect your rankings after the switch.
Can someone clarify this for me?
Thanks,
Ruben
-
Thanks Cyrus!
-
Hi Ruben,
Thanks for writing in. I'm unfamiliar with Bluehost's HTTPS service, but I assume they are taking care of the top-level issues. You'll still want to go through the checklist to make sure everything is valid and you follow SEO best practices. In short:
- Check your links
- Check your assets (images, CSS, JavaScript)
- Update canonical tags to point to the HTTPS URLs (see the example below)
- Register the HTTPS version with Google Webmaster Tools
- Update your sitemaps and robots.txt files
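To illustrate the canonical point: once the switch is live, each page's canonical tag should reference its HTTPS URL rather than the old HTTP one. A minimal sketch, using example.com and /your-page/ as placeholders for your own domain and page:

    <link rel="canonical" href="https://example.com/your-page/" />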
This covers the important stuff. As you noted, a few more tips here: http://moz.com/blog/seo-tips-https-ssl
-
Maybe it was obvious to everybody, but a 301 redirect for every single page is also a fundamental step; otherwise you are going to have broken external links, not to mention Webmaster Tools, which I don't think would be satisfied by the canonical update alone.
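As a rough sketch, on an Apache server with mod_rewrite enabled, a site-wide HTTP-to-HTTPS 301 redirect in .htaccess could look something like this (the exact rules depend on your setup):

    # Redirect every plain-HTTP request to the HTTPS version of the same URL
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]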
The sitemap must be updated as well.
We recently switched a website from HTTP to HTTPS, and in terms of performance there was no difference after the update, at least according to Webmaster Tools and Analytics.
I was kind of scared before the update, but in the end everything went more smoothly than expected; Webmaster Tools took around 10 days to completely re-index the HTTPS version.
Of course, we kept finding non-HTTPS links embedded here and there in some pages for days afterwards, and we had to manually edit some content to avoid SSL (mixed-content) warnings in browsers.
-
I have no idea what CMS you are using, but check the server-side code that generates the links, not just the code sent to the browser.
We recently switched to SSL, and our CMS was already building internal links on pages using the protocol of the HTTP request.
-
Thanks Highland!
-
Great, thanks!
-
Ruben, I had a look at your website and your URLs all have HTTP in them so these would need to be updated all across your site before you make the switch to HTTPS. Because you are using WordPress this should be as simple as updating the site URL to https://www.kempruge.com.
The tip by @Highland about using Firebug is excellent. This will allow you to quickly debug if there are non-HTTPS links remaining - in the WordPress theme or template, for example.
Have a look at the WordPress HTTPS documentation also.
-
Hi Alex,
I'm not really sure if we use a protocol-less linking pattern or not. I don't see http:// in any of our URLs, so if that's the criterion, I'm guessing we don't? I included a screenshot of one of our URLs. Would you mind telling me if it's clear from the image whether we do or do not?
Thanks for your response. I really appreciate your time and input.
Best,
Ruben
-
One major tip I always point people to is that using protocol-less links for anything external is a great way to make sure your site always supports SSL without issue.
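For example, a protocol-relative reference (example.com below is just a placeholder) loads over whichever protocol the page itself was requested with, so the same markup works on both the HTTP and HTTPS versions of a page:

    <!-- Loads via HTTP on http:// pages and via HTTPS on https:// pages -->
    <script src="//example.com/js/script.js"></script>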
Firebug is a great way to make sure everything is loading HTTPS. Turn it on, switch to the Net tab, and load your page. It will show you every request sent as part of your page. It makes spotting non-SSL requests easy.
You can turn on HSTS yourself if your provider uses Apache and supports .htaccess (sorry I can't link to an article; Moz won't let me). If they don't, you will have to ask your host to enable it on their end.
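As a rough sketch, enabling HSTS from .htaccess on Apache (assuming mod_headers is available) could look like the following; the max-age below is only an example value and should be chosen carefully, since it tells browsers to insist on HTTPS for that long:

    <IfModule mod_headers.c>
        # Instruct browsers to use HTTPS only for the next year
        Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
    </IfModule>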
-
Implementing SSL should be straightforward for the most part.
You need to ensure that links around your site (including canonical links) are updated to use HTTPS (so https://example.com/link as opposed to http://example.com/link, where example.com is your domain name). If you are already using a protocol-less linking pattern (//example.com/link), you don't need to update the links.
You can also configure your web server to only serve HTTPS. If your web server is Apache, you can do this with the SSLRequireSSL directive, which denies any request that arrives over plain HTTP rather than redirecting it:
    <Location "/">
        SSLRequireSSL
    </Location>
HTTPS also adds some overhead while the browser and the server negotiate a secure connection. If your site has already been optimized for speed it should not cause a problem, but if in doubt, revisit that process and ensure that you are getting the best possible speed for your visitors.
The article by Cyrus has a great checklist to double check everything.