Transferring Site from HTTP to HTTPS
-
We are migrating all of our pages from HTTP to HTTPS. I am listing a few of my concerns below:
- Currently, all HTTPS traffic to our homepage and SEO pages is 301 redirected to the HTTP equivalent. So, when we enable HTTPS on all our pages, 301 all HTTP traffic to HTTPS, and stop the current 301 redirection to HTTP, will it cause a loop during Google's crawl due to the old indexing?
- Should we move the whole SEO-facing site to HTTPS at once, or in phases? Which of the two approaches is better with SEO in mind?
- What SEO changes will be required on all pages (e.g., canonical URLs on our website as well as on affiliate websites, sitemaps, etc.)?
-
Where the issue becomes more complex is if you add CloudFlare to your site for DDoS protection, to serve content from servers closer to your visitors, and so on. Many sites that want to load faster or avoid attacks are doing this. We incurred a redirect issue this way, and working through it takes some technical expertise. Because CloudFlare acts as a third party for SSL certificates in many hosting packages, this seems to catch many people off guard, wondering what went wrong.
If it's helpful, we have had greater success buying our own CloudFlare service than letting a hosting provider act as an affiliate for our SSL.
-
Hi Robin,
I just went through this and here is my $0.02...
1. You will need to basically reverse your 301 redirect, so that http redirects to https instead of how you have it set now. This should be no problem: just update your .htaccess and reverse the rule (see the .htaccess sketch below this list). If you are using WordPress, you'll also need to set your site URL to https in the General settings tab and verify that all the permalinks update. You won't get a redirect chain unless you have http pointing to https and then https pointing back to http. Just use the one 301 and you should be fine.
2. I would move the whole thing at once. It'll be a lot simpler than trying to segregate which pages are http and which are https.
3. You won't need canonicals because of the 301. Google will see that you are permanently redirecting all your pages to https. As for another likely canonical situation: if another website republishes an article that you published first and points its canonical at the same article on your website, that canonical will simply hit the redirect, so it won't split the link juice.
4. The tracking URLs won't change, and you can verify this by logging in to Analytics and Search Console. The one thing you'll need to do in Search Console is add a property for both the https:// and the https://www versions of your website. You'll then need to make sure you have the preferred domain selected. Then you can resubmit your sitemap.xml files (a sample https sitemap is sketched below) and you should be good to go.
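On point 1, here's a minimal sketch of what the reversed rule can look like -- assuming Apache with mod_rewrite, and that this replaces (not joins) the old https-to-http rule:

    # Send every HTTP request to its HTTPS equivalent with a single 301
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]

Since this is the only redirect in play, any http request gets exactly one 301 to https and stops there, which is why Googlebot won't hit a loop.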
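And on point 4, the resubmitted sitemap.xml should list only the https URLs. A minimal sketch, using a hypothetical example.com domain:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
      </url>
      <url>
        <loc>https://www.example.com/sample-page/</loc>
      </url>
    </urlset>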
I hope this helps
-
Hey.
No, it's not a new property, so there's no need in my opinion. Let's see if others think otherwise, but I never do this.
For the first point: it won't cause a loop. Just make sure to set all links to https. If you run WordPress and don't have the coding skills or a dev team to handle the migration properly, install a plugin (one called something like "simply http to https"). It's a clean solution.
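If you do have shell access, WP-CLI's search-replace command is another clean route (a sketch, assuming WP-CLI is installed and using a hypothetical example.com domain -- preview first with --dry-run):

    # Preview which database rows would change, without writing anything
    wp search-replace 'http://example.com' 'https://example.com' --skip-columns=guid --dry-run
    # When the preview looks right, run the same command without --dry-run

Skipping the guid column is deliberate: WordPress treats GUIDs as permanent identifiers, so they shouldn't be rewritten.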
Cheers
-
Can you answer in a little more depth, and more technically, about the first point I asked? I don't have a clear idea about it yet.
Also, in Google Analytics, should I add the new https URL under Admin > Property Settings?
-
Hey there,
1. Google won't be affected, no worries.
2. I'd recommend moving the whole thing.
3. All you need to do is make sure that all internal links (from / to old posts, categories, etc.) are now set to https so they don't redirect from http to https (see the grep sketch after this list).
4. The GA code remains the same.
5. No. Just tell Google to reindex when you're ready.
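On point 3, a quick way to find hard-coded http links in your templates is a recursive grep (a sketch, assuming shell access to a WordPress install and a hypothetical example.com domain; links stored in the database need a search-replace instead):

    # List every file and line still referencing the http version of the site
    grep -rn "http://example.com" wp-content/themes/ wp-content/plugins/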
Good luck!
-
Also, please explain what needs to be done with Google Analytics and Google Webmaster Tools:
1. Will the Google Analytics tracking code remain the same?
2. Do we need to submit the new https version of the URL in Google Webmaster Tools?
Related Questions
-
Forced Redirects/HTTP<>HTTPS 301 Question
Hi All, Sorry for what's about to be a long-ish question, but tl;dr: has anyone else had experience with a 301 redirect at the server level between the HTTP and HTTPS versions of a site in order to maintain accurate social media share counts? This is new to me and I'm wondering how common it is.
I'm having issues with this forced redirect between HTTP/HTTPS as outlined below and am struggling to find any information that will help me troubleshoot it or better understand the situation. If anyone has any recommendations for things to try or sources to read up on, I'd appreciate it. I'm especially concerned about any issues this may be causing at the SEO level, and the known-unknowns.
A magazine I work for recently relaunched after switching platforms from Atavist to Newspack (which is run via WordPress). Since then, we've been having some issues with 301s, but they relate to new stories that are native to our new platform/CMS and have had zero URL changes. We've always used HTTPS. Basically, the preview for any post we make linking to the new site, including these new (non-migrated) pages, previews on Facebook with a 301 in the title and with no image. This also overrides the social media metadata we set through Yoast Premium.
I ran some of the links through the Facebook debugger, and it appears that Facebook is reading these links to our site (using https) as redirects to http that then redirect back to https. I was told by our tech support person on Newspack's team that this is intentional, so that Facebook will maintain accurate share counts rather than separate share counts for http/https. However, this forced redirect seems to be failing if we can't post our links with any metadata. (The only way to reliably fix it is by adding a query parameter to each URL, which, obviously, still gives us inaccurate share counts.)
This is the first time I've encountered this intentional redirect approach, and I've asked a few times for more information about how it's set up, just for my own edification, but all I can get is that it's something managed at the server level and is designed to prevent separate share counts for HTTP and HTTPS. Has anyone encountered this method before, and can anyone either explain it to me or point me in the direction of a resource where I can learn more about how it's configured, as well as the pros and cons? I'm especially concerned about our SEO and how this may impact the way search engines read our site. So far, nothing's come up on scans, but I'd like to stay one step ahead of this. Thanks in advance!
Technical SEO | | ogiovetti0 -
Can anyone tell me why some of the top referrers to my site are porn sites?
We noticed today that 4 of the top referring sites are actually porn sites. Does anyone know what that is all about? Thanks!
Technical SEO | | thinkcreativegroup1 -
Multilingual blogs and site structure
Hi everyone, I have a question about multilingual blogs and site structure. Right now, we have the typical subfolder localization structure, e.g.:
- domain.com/page (English site)
- domain.com/ja/page (Japanese site)
However, the blog is slightly more complicated. We'd like to have English posts available in other languages (as many of our users are bilingual). The current structure suggests we use a typical domain.com/blog or domain.com/ja/blog format, but we have issues if a Japanese (logged-in) user wants to view an English page: domain.com/blog/article would redirect them to domain.com/ja/blog/article, thus 404-ing the user if the post doesn't exist in the alternate language.
One suggestion (that I have seen on sites such as Etsy/Spotify) is to add an /en/ subfolder to the blog area, e.g.:
- domain.com/en/blog
- domain.com/ja/blog
Would this be the correct way to avoid this issue? I know we could technically work around the 404 issue, but I don't want to create duplicate posts in /ja/ that are in English or vice versa. Would it affect the rest of the site if we use an /en/ subfolder just for the blog? Another option is to use:
- domain.com/blog/en
- domain.com/blog/ja
but I'm not sure if this alternative is better. Any help would be appreciated!
Technical SEO | | Seiyav0 -
What can we do to improve our site
Hi. I am hoping that some of you can help me with the in2town site, www.in2town.co.uk. The site is a news/lifestyle magazine site, a cross between the Huffington Post, Digital Spy, Female First, and the Sun newspaper. Basically, the site covers news as well as showbiz news, travel news, health news and advice, etc.
What I would like is for people to look at the site and let me know what they feel I should do to improve it, to make it better for our readers and to gain more readership. I would also like to hear from people on how they find moving around the site, as well as the speed of the site.
At the moment the site is with an American hosting company, and I am in the process of talking to UK hosting companies about moving it. The site is currently on a dedicated server.
It would mean a lot if people could give me their advice on how to improve the site and make it a better experience for our readers, while at the same time being able to generate income with it. Just a quick note: all content is original, and we have a number of people who write for the site. Many thanks.
Technical SEO | | ClaireH-1848860 -
How to find all the links to my site
Hi, I have been trying to find all the links to my site, http://www.clairehegarty.co.uk, but I am not having any luck. I have used Open Site Explorer, but it is not showing all the links, and when I go to my Google Webmaster Tools page it shows me more pages than the SEOmoz tool does. Can anyone help me sort this out and find out exactly which links are going into my site? Many thanks.
Technical SEO | | ClaireH-1848860 -
Site maintenance and crawling
Hey all. Rarely, but sometimes, we need to take our site down for server maintenance, upgrades, or various other system/network reasons. More often than not this downtime is avoidable and we can redirect around or eliminate the client-side downtime. We have a "down for maintenance - be back soon" page that is client facing, and outages are often no more than an hour, tops.
My question is: if the site is crawled by Bing/Google while it is down, what is the best way to ensure the indexed links are not refreshed with this maintenance content (i.e., this is what the pages look like now, so this is what the search engine will index)? I was thinking of adding a no-crawl rule to the robots.txt for the period of downtime and removing it once we're back up, but could this potentially affect results as well?
Technical SEO | | Daylan1 -
Robots.txt blocking site or not?
Here is the robots.txt from a client site. Am I reading this right -- that the robots.txt is saying to ignore the entire site, but the #'s are saying to ignore the robots.txt commands?

    # See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
    # To ban all spiders from the entire site uncomment the next two lines:
    # User-Agent: *
    # Disallow: /

Technical SEO | | 540SEO0 -
Site revision
Our site has had a complete redesign, including the site architecture, page URLs, and page content (everything except the domain). It looks like a new site. Google has indexed about thirty thousand results from the old site. What should I do first?
Technical SEO | | jallenyang0