Best practices for handling https content?
-
Hi Mozzers - I'm having an issue with https content on my site that I need help with.
Basically, we have some pages that are meant to be secure (cart pages, auth pages, etc.), and the rest of the site isn't secured. I need those pages to load correctly and independently of one another so that we're using both protocols correctly.
The problem is that when a secure page is rendered, the resources behind it (scripts, etc.) won't load over the unsecured paths that are currently in our master page files.
One solution would be to render the entire site in https only; however, that really scares me from an SEO standpoint. I don't know if I want to put my eggs in that basket.
Another solution is to structure the site so that secure pages are built differently from unsecured pages, but that requires a bit of re-structuring and new SOPs to be put in place.
I guess my question is really about best practices when using https.
- How can I avoid duplication issues?
- When do I need to use rel=canonical?
- What is the best way to do things here to avoid heavy maintenance moving forward?
-
Thanks for the RE, Cyrus. One of my architects and I came to a similar conclusion, but it's definitely good to hear it from another source in the SEO community on the development side of things.
We decided to implement a site-wide rel=canonical to the http URLs to avoid duplication issues, as well as ensuring resources use relative links.
I'm hoping this solves each issue with minimal impact!
-
Hi Cody,
First of all, Google generally doesn't have much trouble with HTTPS content these days, and treats and ranks it just like anything else.
In fact, I'd say in a couple more years this may be the norm.
As for using rel=canonical, you generally want to use it any time there is a risk of duplicate content. In this case, the important thing is to use the full URL (e.g. https://example.com/page) rather than a relative URL like /page. This should take care of 100% of your duplication issues.
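For illustration, here's a minimal sketch of what that can look like (a Flask app with hypothetical URLs; following the approach settled on above, the canonical points at the http version):

```python
# A minimal sketch, assuming a Flask app and hypothetical URLs: the same
# page may be reachable over both http and https, but the canonical tag
# always names one full, absolute URL as the authoritative version.
from flask import Flask, request

app = Flask(__name__)

PAGE = """<!doctype html>
<html>
<head>
  <!-- Full absolute URL, not a relative path like "/products/widget" -->
  <link rel="canonical" href="http://www.example.com{path}" />
</head>
<body>Product page</body>
</html>"""

@app.route("/products/<slug>")
def product(slug):
    # request.path is the same regardless of protocol, so the http and
    # https versions of this page emit an identical canonical tag.
    return PAGE.format(path=request.path)
```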
I'm not an expert in https development (but I have a little experience). Without diving too deep into how you serve your content, it's usually fine to serve files like JavaScript and images from both secure and non-secure paths. In this instance, you want to make sure your pages call those files with relative paths (as opposed to absolute http:// URLs) and then confirm the content loads. Nine times out of ten this works fine.
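To make the relative-versus-absolute distinction concrete, here's a sketch of the three ways a master page might reference the same script (hypothetical filenames and hosts, shown as template strings for comparison):

```python
# A sketch of how a master page template can reference an asset
# (hypothetical filenames). Only the hard-coded http:// form breaks
# on secure pages.

# Absolute http:// URL: blocked or flagged as mixed content on https pages.
ABSOLUTE_HTTP = '<script src="http://www.example.com/js/app.js"></script>'

# Relative path: inherits whichever protocol the page itself loaded over.
RELATIVE = '<script src="/js/app.js"></script>'

# Protocol-relative URL: same idea, for assets served from another host.
PROTOCOL_RELATIVE = '<script src="//cdn.example.com/js/app.js"></script>'
```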
Hope this helps. Best of luck with your SEO!
-
Any more input here? Are there any issues with using a sitewide rel=canonical to avoid the duplication of our https URLs?
-
Thanks for the RE, but I'm not sure that answers my question. I'm looking for best practice information about how to build https content. The noindex tip is good. I'll do that. Just wondering how the back end should work to make sure I don't get "insecure content" warnings.
-
Don't go the whole site https route. You are just creating duplicate site nightmares.
Since you are working with cart and auth pages, you need to add a noindex, nofollow meta tag on those pages to start with. This way they never get into the index, and any pages that are in the index now will be dropped. Do not use robots.txt for this; use the noindex, nofollow meta tag.
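As a minimal sketch of that setup (assuming a Flask app and hypothetical routes), the meta tag only goes on the secure pages:

```python
# A minimal sketch, assuming a Flask app and hypothetical routes: cart and
# auth pages carry a robots meta tag, the rest of the site stays indexable.
# A meta tag rather than robots.txt, so the pages can still be crawled and
# the noindex directive actually seen by the crawler.
from flask import Flask

app = Flask(__name__)

ROBOTS_META = '<meta name="robots" content="noindex, nofollow" />'

@app.route("/cart")
def cart():
    return f"<!doctype html><html><head>{ROBOTS_META}</head><body>Cart</body></html>"

@app.route("/about")
def about():
    # Ordinary page: no robots meta tag, indexable as usual.
    return "<!doctype html><html><head></head><body>About us</body></html>"
```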
You need to set up 301 redirects from the https to the http version of every page except the cart and auth pages (i.e. those pages that are supposed to be https). If Google has found any of those pages that are supposed to be http, the 301 will correct that, plus it gets the user back to the right version of the page for bookmarking and other purposes.
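And a minimal sketch of the redirect logic (again Flask, with hypothetical paths; behind a proxy you would check the forwarded protocol header rather than request.scheme):

```python
# A minimal sketch, assuming a Flask app and hypothetical paths: 301 any
# https request back to http unless the path is one that should stay
# secure, and push the secure paths onto https.
from flask import Flask, redirect, request

app = Flask(__name__)

SECURE_PREFIXES = ("/cart", "/auth")  # pages meant to stay on https

@app.before_request
def enforce_protocol():
    wants_https = request.path.startswith(SECURE_PREFIXES)
    if request.scheme == "https" and not wants_https:
        # Permanent redirect: corrects any https URLs Google has already
        # found and lands users on the right version for bookmarking.
        return redirect(request.url.replace("https://", "http://", 1), code=301)
    if request.scheme == "http" and wants_https:
        return redirect(request.url.replace("http://", "https://", 1), code=301)
    # Returning None lets the request proceed normally.
```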