Google Indexed the HTTPS version of an e-commerce site
-
Hi, I am working with a new e-commerce site. The way it's set up, once you add an item to the cart, you're put onto secure HTTPS versions of the pages as you continue to browse.
Somehow this has translated into Google indexing the whole site as HTTPS, even the home page. A couple of questions:
1. I assume that is bad or could hurt rankings, or at a minimum is not the best practice for SEO, right?
2. Assuming it is something we don't want, how would we go about getting the HTTP versions of pages indexed instead of HTTPS? Do we need a rel=canonical on each page pointing to the HTTP version? Anything else that would help?
Thanks!
-
Let people be redirected to the non-HTTPS versions; what is the problem here? They won't lose items from their basket when redirected from the HTTPS to the HTTP version. And when they are checking out, the connection will remain secure via SSL, since pages that need to be HTTPS won't redirect to non-HTTPS.
-
Hi Irving, thanks for your reply. That all makes sense to me except "noindex meta tag them". They are the same product pages whether they are https or http, so I can't put 'noindex' on exclusively the https page...
Or are you suggesting that I figure out some conditional code so that if the HTTPS version is requested, it inserts a 'noindex'?
Is there a reason nobody is suggesting a rel=canonical to the HTTP version?
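For what it's worth, the conditional approach being described might look something like this sketch, where the host and path are made-up examples. Every page declares the HTTP URL as canonical, and only the duplicate HTTPS rendering adds a noindex:

```python
def head_tags(request_scheme: str, canonical_path: str,
              host: str = "www.example.com") -> str:
    """Build the SEO <head> tags for a page (illustrative sketch).

    Both the HTTP and HTTPS renderings point their canonical at the
    HTTP URL; the HTTPS copy additionally tells robots not to index it.
    """
    tags = [f'<link rel="canonical" href="http://{host}{canonical_path}">']
    if request_scheme == "https":
        # Only the duplicate HTTPS rendering gets the noindex.
        tags.append('<meta name="robots" content="noindex, follow">')
    return "\n".join(tags)
```

The same conditional works in any server-side templating language; the only input it needs is the scheme of the current request.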
-
Block the https pages in robots.txt and noindex meta tag them.
then make sure that all of your links coming off of the https pages are absolute http links.
Your problem is probably relative links on the HTTPS pages getting spidered: once a crawler comes off a secure page, relative links keep the https scheme as it moves onto pages that should be HTTP, if that makes sense.
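The scheme-inheritance behaviour described here can be demonstrated with Python's standard `urljoin`, which resolves links the same way a crawler or browser does (URLs are illustrative):

```python
from urllib.parse import urljoin

def crawl_target(current_page: str, link_href: str) -> str:
    """Resolve a link relative to the page it appears on, as a crawler would."""
    return urljoin(current_page, link_href)

# Entering through one secure page, a relative link silently keeps https:
print(crawl_target("https://www.example.com/cart", "/products/widget"))
# -> https://www.example.com/products/widget

# An absolute http:// link pins the scheme no matter where it's followed from:
print(crawl_target("https://www.example.com/cart",
                   "http://www.example.com/products/widget"))
# -> http://www.example.com/products/widget
```

This is why the advice above is to make every link leaving the secure pages an absolute `http://` link.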
-
301'ing the HTTPS versions across the board would not work, because people who belong on the HTTPS versions (anyone with something in their cart) would be force-redirected to the non-secure version.
I'm thinking that rel-canonical to the http version along with Robots.txt rules as you've suggested may be the way to go.
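One wrinkle worth noting: robots.txt is fetched separately for the http:// and https:// versions of a host, so blocking only the secure pages means serving a different robots.txt over HTTPS. A minimal sketch, with example.com as a placeholder:

```
# Served only at https://www.example.com/robots.txt
User-agent: *
Disallow: /

# The file at http://www.example.com/robots.txt stays open:
# User-agent: *
# Disallow:
```

Also bear in mind that a page blocked by robots.txt can't be crawled, so Google will never see a noindex or rel=canonical placed on it; it's generally better to pick one mechanism per page rather than stacking them.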
-
1. It can create duplicate content issues and is not good SEO practice.
2. You can 301 redirect all the HTTPS versions to the HTTP versions, and apply meta robots noindex, follow to the handful of pages that need to stay HTTPS.
Related Questions
-
Should m-dot sites be indexed at all
I have a client with a site that has an m-dot mobile version. They will move it to a responsive site sometime next year, but in the meantime I have a big doubt. The m-dot site has some 30k pages indexed in Google. Each of these pages is bidirectionally linked to the www. version (rel="alternate" on the www page, rel="canonical" on the m-dot page). There is no noindex on the m-dot site, so I understand that Google might decide to index the m-dot pages regardless of the canonical to the www site. But my doubt stays: is it a bad thing that both versions are indexed? Is this having a negative impact on the crawl budget? Or risking some other bad consequence? And how is mobile-first indexing going to impact this? Thanks
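For reference, the bidirectional annotation described above typically looks like this on each pair of pages (URLs are placeholders):

```
<!-- On the desktop page, e.g. http://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the corresponding m-dot page -->
<link rel="canonical" href="http://www.example.com/page">
```

With both halves in place, Google treats the pair as one document and normally shows the www URL to desktop searchers and the m-dot URL to mobile searchers.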
Intermediate & Advanced SEO | newbiebird
-
Google slow to index pages
Hi, We've recently had a product launch for one of our clients. Historically Google has been quick to respond: when the page for a product goes live, it's indexed and performing for branded terms within 10 minutes (without 'Fetch and Render'). This time, however, we found that it took Google over an hour to index the pages. We found initially that press coverage ranked until we were indexed. Nothing major had changed in terms of page structure, content, internal linking, etc.; these were brand-new pages with new product content. Has anyone ever experienced Google having an 'off' day or being uncharacteristically slow with indexing? We do have a few ideas about what could have caused this, but we were interested to see if anyone else has experienced this sort of change in Google's behaviour, either recently or previously? Thanks.
Intermediate & Advanced SEO | punchseo
-
Google suddenly indexing 1,000 fewer pages. Why?
We have a site, blog.example.org, and another site, www.example.org. The most visited pages on www.example.org were redesigned; the redesign landed May 8. I would expect this change to have some effect on organic rank and conversions. But what I see is surprising; I can't believe it's related, but I mention this just in case. Between April 30 and May 7, Google stopped indexing roughly 1,000 pages on www.example.org, and roughly 3,000 pages on blog.example.org. In both cases the number of pages that fell out of the index represents appx. 15% of the overall number of pages. What would cause Google to suddenly stop indexing thousands of pages on two different subdomains? I'm just looking for ideas to dig into; no suggestion would be too basic. FWIW, the site is localized into dozens of languages.
Intermediate & Advanced SEO | hoosteeno
-
Is the robots meta tag more reliable than robots.txt at preventing indexing by Google?
What's your experience of using the robots meta tag vs. robots.txt as a standalone solution to prevent Google indexing? I am pretty sure the robots meta tag is more reliable. Going on my own experience, I have never had any problems with robots meta tags, but plenty with robots.txt as a standalone solution. Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart
-
Why would one of our section pages NOT be indexed by Google?
One of our higher-traffic section pages is not being indexed by Google. The products that reside on this section page ARE indexed by Google and are on page 1. So why wouldn't the section page even be listed and indexed? The meta title is accurate and the meta description is good. I haven't received any notices in Webmaster Tools. Is there a way to check whether OTHER pages might also not be indexed? What should a small ecom site do to get it listed? SOS in Modesto. Ron
Intermediate & Advanced SEO | yatesandcojewelers
-
200 for Site Visitors, 404 for Google (but possibly 200?)
A second question we have about another site we're working with: currently, if a visitor accesses a page that has no content in a section, it shows a message saying that no information is currently available. The page returns a 200 for the user but a 404 for Google. They are asking us whether it would be better to return 200s to Google as well, and what impact that might have, considering there would be many different pages displaying the same 'no information here' message.
Intermediate & Advanced SEO | Prospector-Plastics
-
Blogs and E-Commerce websites
I have recently launched an e-commerce website which has a whopping domain authority of 1! I was thinking about adding a blog to it (it's built in OpenCart), but that would mean creating the blog in WordPress while using the same domain name. Would this be beneficial from an SEO standpoint (i.e., sending traffic to a blog that isn't actually on the e-commerce website itself), or am I better off creating content as blogs/articles on other people's sites?
Intermediate & Advanced SEO | lindsayjhopkins
-
How to remove an entire subdomain from the Google index with URL removal tool?
Does anyone have clear instructions for how to do this? Do we need to set up a separate GWT account for each subdomain? I've tried using the URL removal tool, but it will only allow me to remove URLs indexed under my domain (i.e. domain.com not subdomain.domain.com) Any help would be much appreciated!!!
Intermediate & Advanced SEO | nicole.healthline