Best practices for handling HTTPS content?
-
Hi Mozzers - I'm having an issue with https content on my site that I need help with.
Basically, we have some pages that are meant to be secured (cart pages, auth pages, etc.), while the rest of the site is not secured. I need those pages to load correctly and independently of one another so that we are using both protocols correctly.
Problem is, when a secure page is rendered, the resources behind it (scripts, etc.) won't load with the insecure paths that are currently in our master page files.
One solution would be to serve the entire site over HTTPS only, but that really scares me from an SEO standpoint. I'm not sure I want to put all my eggs in that basket.
Another solution is to structure the site so that secure pages are built differently from unsecured pages, but that requires a bit of re-structuring and new SOPs to be put in place.
I guess my question is really about best practices when using https.
- How can I avoid duplication issues?
- When do I need to use rel=canonical?
- What is the best way to do things here to avoid heavy maintenance moving forward?
-
Thanks for the RE Cyrus. One of my architects and I came to a similar conclusion, but it's definitely good to hear it from another source in the SEO community on the development side of things.
We decided to implement a site-wide rel=canonical pointing to the http URLs to avoid duplication issues, as well as ensure resources are using relative links.
I'm hoping this solves each issue with minimal impact!
-
Hi Cody,
First of all, Google generally doesn't have much trouble with HTTPS content these days, and treats and ranks it just like anything else.
In fact, I'd say in a couple more years this may be the norm.
As for rel=canonical, you generally want to use it any time there is a risk of duplicate content. In this case, the important thing is to use the full absolute URL, not a relative one, e.g. https://example.com/page. This should take care of your duplication issues.
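For reference, a canonical tag along those lines might look like this (the domain and path here are placeholders, not Cody's actual URLs):

```html
<!-- Placed in the <head> of both the http and https versions of a page -->
<!-- Always a full absolute URL pointing at the preferred version -->
<link rel="canonical" href="http://www.example.com/some-page" />
```

Because the same absolute canonical URL appears on both protocol versions, search engines consolidate them instead of indexing duplicates.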
I'm not an expert in HTTPS development (though I have a little experience). Without diving too deep into how you serve your content, it's usually fine to serve files like JavaScript and images from both secure and non-secure paths. In this instance, you want to make sure your pages are calling relative file paths (as opposed to absolute) and confirm that the content loads. Nine times out of ten this works fine.
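To illustrate the difference (file paths here are made up for the example): a relative or protocol-relative reference inherits the scheme of the page it's on, while a hard-coded http:// reference will trigger "insecure content" warnings on a secure page.

```html
<!-- Breaks on https pages: browsers warn about or block mixed content -->
<script src="http://www.example.com/js/app.js"></script>

<!-- Relative path: resolves against the current page's scheme and host -->
<script src="/js/app.js"></script>

<!-- Protocol-relative: uses whichever scheme the page was loaded over -->
<script src="//www.example.com/js/app.js"></script>
```

Either of the last two forms avoids mixed-content warnings without needing separate markup for secure and non-secure pages.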
Hope this helps. Best of luck with your SEO!
-
Any more input here? Are there any issues with using a sitewide rel=canonical to avoid the duplication of our https URLs?
-
Thanks for the RE, but I'm not sure that answers my question. I'm looking for best practice information about how to build https content. The noindex tip is good. I'll do that. Just wondering how the back end should work to make sure I don't get "insecure content" warnings.
-
Don't go the whole site https route. You are just creating duplicate site nightmares.
Since you are working with cart and auth pages, you need to add a noindex, nofollow meta tag on those pages to start with. This way they don't get into the index, and any of those pages that are in the index now will be dropped. Do not use robots.txt for this; use the noindex, nofollow meta tag.
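That meta tag goes in the head of each secure-only page:

```html
<!-- On cart, auth, and other pages that should never be indexed -->
<meta name="robots" content="noindex, nofollow" />
```

Unlike a robots.txt block, this lets Google crawl the page and see the directive, so already-indexed copies get dropped.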
You need to set up 301 redirects from the https to the http version for all pages except the cart and auth pages (i.e. those pages that are supposed to be https). If Google has found any https versions of pages that are supposed to be http, the 301 will correct that, plus it gets users back to the right version of the page for bookmarking and other purposes.
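If the site runs on Apache, that redirect logic might be sketched in .htaccess along these lines (assuming Apache mod_rewrite; the /cart and /account paths and domain are placeholders for your actual secure sections):

```apache
RewriteEngine On

# Force https on the secure sections
RewriteCond %{HTTPS} off
RewriteRule ^(cart|account)(/.*)?$ https://www.example.com/$1$2 [R=301,L]

# Everything else: 301 any https request back to the http version
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/(cart|account)
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

On IIS or nginx the equivalent rules would look different, but the idea is the same: each URL has exactly one correct protocol, enforced server-side.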