Does having a page (or site) available on HTTP and HTTPS cause duplication issues?
-
Say I've got a site that can be accessed using either protocol (i.e. HTTP or HTTPS), but most (if not all) of the links are pointing to the HTTP versions. Will it cause a problem if I start link building to the HTTPS versions?
In other words, does Google see http://mysite.com as the same page as https://mysite.com?
Thanks
-
Got it - saw your other question. This is definitely a bit tricky.
-
Thanks for answering, Dr. Pete.
The reason I want to build links to the HTTPS version is that I plan to redirect the non-secure homepage to the secure one.
The home page captures some customer details, so I don't think it makes sense to have the non-secure version available as well.
If that makes sense.
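For what it's worth, a homepage redirect like that is usually done server-side with a permanent (301) redirect. A minimal sketch for Apache with mod_rewrite enabled, using mysite.com as a stand-in domain (your server and rules may differ):

```apache
# .htaccess: send the HTTP homepage to its HTTPS equivalent
RewriteEngine On
# Only act when the request is not already over HTTPS
RewriteCond %{HTTPS} off
# ^$ matches the homepage only (the leading slash is stripped in .htaccess context)
RewriteRule ^$ https://mysite.com/ [L,R=301]
```

The 301 status tells Google the move is permanent, so link equity pointing at the HTTP URL should consolidate onto the HTTPS one.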
-
These days, Google has no issue with crawling/indexing secure pages, but Naghirniac is right - they will be seen as duplicates. The canonical tag is probably your easiest solution.
The best canonical solution, though, is to use the same URL consistently. Out of curiosity, why are you building links to the secure versions?
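If you do go the canonical route, it's a single link element in the head of each page, pointing at whichever URL you want Google to treat as the preferred version. A sketch, with mysite.com standing in for your domain and HTTPS as the preferred version:

```html
<!-- Placed in the <head> of both the http:// and https:// versions of the page -->
<link rel="canonical" href="https://mysite.com/" />
```

Both protocol versions then point at the same canonical URL, so Google consolidates them rather than treating them as duplicates.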
-
Thanks Vizergy
-
Makes sense - thanks
-
I have heard many times over the years that Google does not crawl/index secure pages... however, if you perform an inurl:https search in Google you will see that this is certainly NOT true; they not only crawl them, they index them. So, if a page is being crawled and indexed, I would say there is a very real chance that it could cause a duplicate content issue.
I would set up a canonical tag to be safe.
-
Yes, you will have a duplication issue. You can work with both, but you will need to use rel="canonical".