Redirecting from HTTP to HTTPS - Pros and Cons
-
Hi,
I know it's best practice to redirect a website from HTTP to HTTPS rather than have many entry points to your website. When a website has been running for a long time on both HTTP and HTTPS, what are the SEO pros and cons of implementing a redirect from HTTP to HTTPS?
-
Do you know how long it takes Google to drop pages from its index/cache?
-
(1) No. If you link to an insecure page, it counts against you. Since a user or search engine would have to load the insecure page to find the canonical tag (that's where it would be), the tag does not mitigate this; you'll just have to hope it doesn't happen too often. Canonical tags only stop content duplication. They have no impact on merging SEO authority or on insecure links.
(2) If the HTTPS URLs are essentially identical to their HTTP counterparts and you 301 HTTP to HTTPS, the SEO authority should flow across to HTTPS. Canonical tags are not proven to do what 301s do, so relying on them may leave you in a mess. Most sites see a slight dip moving from HTTP to HTTPS via proper 301s, but it isn't large and doesn't last long if the 301s were done well. Staying on HTTP in the long term, you will gradually lose a lot of rankings over time. Since you would be constantly losing ground, staying put effectively places your site's progress on hold, so the small dip from moving to HTTPS is the lesser of two evils (IMO).
(3) Both. It will reduce the number of times Google crawls HTTP, but only after the HTTP pages are dropped from Google's index and recent cache.
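As a quick check on point (2), a short script can confirm that each HTTP URL returns a single 301 hop to its HTTPS counterpart. This is a minimal sketch using Python's requests library, not something from the original thread; the example URL is a placeholder:

    import requests

    def check_redirect(http_url):
        """Follow redirects from an HTTP URL and report each hop."""
        response = requests.get(http_url, allow_redirects=True, timeout=10)
        # response.history holds every intermediate redirect response
        for hop in response.history:
            print(f"{hop.status_code}  {hop.url} -> {hop.headers.get('Location')}")
        print(f"{response.status_code}  final: {response.url}")
        # Ideally: exactly one hop, status 301, final URL on HTTPS
        if len(response.history) == 1 and response.history[0].status_code == 301:
            print("OK: single 301 to HTTPS")
        else:
            print("Check: chained or non-301 redirects may leak equity")

    # Placeholder URL for illustration
    check_redirect("http://www.example.com/")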
-
Thanks for the answer. However, I have a few more questions: (1) Will implementing canonical tags limit the temporary disruption? (2) If backlinks are pointing to HTTP, will they be lost or transferred? That is, will the HTTPS pages have less equity, or will they inherit the equity of the HTTP pages? (3) Will redirecting to HTTPS reduce the number of times Google crawls your site, or will Google still crawl HTTP until all HTTP pages are removed from its index/cache?
-
Or in Nginx format, which is usually faster.
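The post doesn't include the rule itself; a typical Nginx redirect looks something like the sketch below (server names are placeholders, and it assumes HTTPS is served from a separate server block):

    server {
        listen 80;
        server_name example.com www.example.com;
        # Permanently (301) redirect all plain-HTTP traffic to HTTPS
        return 301 https://$host$request_uri;
    }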
-
There are no cons that I can think of. A simple rule in a site's .htaccess file is the best way to implement the redirection.
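The original post doesn't show the .htaccess rule; for an Apache server, a sketch along these lines would do it (assuming mod_rewrite is enabled):

    RewriteEngine On
    # Send any request that arrived over plain HTTP to HTTPS, preserving host and path
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]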
-
The idea of HTTPS has always been a good one, and most leading businesses implemented it a long time ago.
However, somewhat recently, Google announced that HTTPS is a ranking factor.
Obviously, that got SEOs talking about and debating the subject.
At the time, it was a very small ranking factor, affecting less than 1% of global searches. Even now, it’s not a big factor.
However, security is something that Google takes very seriously, and it’s likely to become more important in the future.
Some SEOs jumped right on it and made the switch.
-
This is a very solid answer. One additional point: without a forced structure, Google can 'catch out' your secure site linking to your insecure site. Say you have a blog and a post in it links to one of your pages; that link was probably created as an absolute URL in your CMS. So suddenly, when you load that blog post over HTTPS, it contains a link pointing to HTTP. Google doesn't like links pointing to insecure content, so over time the situation snowballs and you lose a lot of trust.
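To hunt down those leftover absolute links, a small script can fetch a page and list every http:// reference it still carries. This is an illustrative Python sketch, not part of the original answer; the URL is a placeholder:

    import re
    import requests

    def find_insecure_links(page_url):
        """List plain-HTTP URLs referenced by a page served over HTTPS."""
        html = requests.get(page_url, timeout=10).text
        # Capture href/src attributes that still point at http://
        insecure = re.findall(r'(?:href|src)=["\'](http://[^"\']+)', html)
        for url in sorted(set(insecure)):
            print(url)
        return insecure

    # Placeholder URL for illustration
    find_insecure_links("https://www.example.com/blog/some-post/")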
-
If your current pages can be accessed over both HTTP and HTTPS, and you don't have canonicals or redirects pointing everything to one version or the other, then one very significant "con" of that approach is that you are splitting your link equity. If the HTTP page has 50 inbound links and the HTTPS page has another 50, you would do better to have one page with 100 inbound links.
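For reference, consolidating on one version with a canonical tag looks like this (a minimal illustration; the URL is a placeholder):

    <!-- On the HTTP version of the page, point search engines at the HTTPS version -->
    <link rel="canonical" href="https://www.example.com/page/" />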
Another difference is how browsers display or warn about non-secure pages, along with any ranking factor search engines may associate with secure pages. Again, both favor redirecting HTTP to HTTPS. The visual handling can also impact conversion rates and bounce rates, which can in turn impact rankings.
As far as cons to redirecting, one is that you should expect a temporary disruption to rankings; there will likely be a bit of a dip in the short term. Another is that you will need to remove, and then be careful about accidentally adding, any non-secure resources (like images) on the HTTPS pages, which would otherwise trigger a warning for visitors as well as possibly impact rankings. There is also some consensus that redirects (and canonical links) leak a very small amount of link equity for each hop they take, so that's another "con". But my recent experience doing this with two sites has been that after the temporary dip of a couple of months, if done properly, the pros outweigh the cons.