Should I add the canonical URL/homepage (http://www.domain.com) in the secured version of my home URL (https://www.domain.com)?
Thanks in advance!

Thanks for the help George!
Hi,
I understand the question as: on the SSL (HTTPS) version of my homepage, should I add a rel="canonical" link to the markup that points to the non-SSL (HTTP) version?

If your SSL pages are only accessible to authenticated users (i.e. not to crawlers), then I can't see it making much difference, as you won't suffer from duplicate content. However, if your SSL pages are accessible to crawlers (which is becoming more common), then pointing the canonical at the non-SSL version is a good idea. Besides preventing duplicate-content issues, there's a good chance your SSL pages will attract links, and blocking crawlers from them (with noindex, robots.txt, etc.) means you won't get the benefit of those links.

One thing to decide first is whether the single canonical version of your site is the HTTP or the HTTPS pages. Then canonicalise accordingly.
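To make the idea concrete, here is a minimal sketch (in Python, using only the standard library) of how you might map any scheme/host variant of a page URL onto the single canonical version you've chosen, and build the corresponding link element for the page head. The scheme and host constants are assumptions for illustration, taken from the placeholder domain in the question; swap in whichever version (HTTP or HTTPS) you decided to canonicalise to.

```python
from urllib.parse import urlsplit, urlunsplit

# Assumed choices for illustration: pick ONE canonical scheme and host
# site-wide, then canonicalise every variant toward it.
CANONICAL_SCHEME = "https"          # or "http", per your decision above
CANONICAL_HOST = "www.domain.com"   # placeholder host from the question

def canonical_url(url: str) -> str:
    """Map any scheme/host variant of a page URL to its canonical form."""
    parts = urlsplit(url)
    # Keep path and query; force the canonical scheme/host; drop fragments.
    return urlunsplit((CANONICAL_SCHEME, CANONICAL_HOST, parts.path, parts.query, ""))

def canonical_link_tag(url: str) -> str:
    """Build the <link rel="canonical"> element to place in the page <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}" />'
```

For example, `canonical_link_tag("http://domain.com/page?a=1")` would emit a canonical link pointing at `https://www.domain.com/page?a=1`, so the HTTP and HTTPS copies consolidate to one indexed URL.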
George