Google Indexed the HTTPS version of an e-commerce site
-
Hi, I am working with a new e-commerce site. The way it's set up, once you add an item to the cart you're put onto secure HTTPS versions of the pages as you continue to browse.
Well, somehow this has translated into Google indexing the whole site as HTTPS, even the home page. A couple of questions:
1. I assume that is bad, could hurt rankings, or at a minimum isn't best practice for SEO, right?
2. Assuming it is something we don't want, how would we go about getting the HTTP versions of pages indexed instead of the HTTPS ones? Do we need rel=canonical on each page pointing to the HTTP version? Anything else that would help?
Thanks!
-
Let people redirect to the non-HTTPS versions. What is the problem here? They won't lose items from their basket when redirected from the HTTPS to the HTTP version. And when they are checking out, the connection will remain secure via SSL, since the pages that need to be HTTPS won't redirect to non-HTTPS.
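A minimal sketch of that logic as an Apache .htaccess rule, assuming the secure pages live under hypothetical /cart/ and /checkout/ paths (swap in the site's real secure URLs):

```apache
# Send HTTPS requests back to the HTTP version of the same URL,
# except for the pages that genuinely need to stay secure.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/(cart|checkout)/ [NC]
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```

The exclusion condition is the important part; without it, shoppers with items in their basket would get bounced off the secure pages (the concern raised further down the thread).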
-
Hi Irving, thanks for your reply. That all makes sense to me except "noindex meta tag them." They are the same product pages whether they are HTTPS or HTTP, so I can't put 'noindex' exclusively on the HTTPS page...
Or are you suggesting I figure out some conditional code so that if the HTTPS version is called, it inserts a 'noindex'? (One possible sketch of that is below.)
Is there a reason nobody is suggesting rel=canonical to the HTTP version?
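One way to do that conditional noindex without touching the page templates at all is an X-Robots-Tag HTTP header, which Google treats the same as the meta tag. A rough sketch, assuming mod_rewrite and mod_headers are available:

```apache
# Flag requests that arrived over SSL...
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule .* - [E=IS_SECURE:1]

# ...then add a noindex header to those responses only;
# the HTTP versions are unaffected.
# (In per-directory .htaccess context the variable can get re-prefixed,
# so you may need env=REDIRECT_IS_SECURE instead.)
Header set X-Robots-Tag "noindex, follow" env=IS_SECURE
```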
-
Block the HTTPS pages in robots.txt and noindex meta tag them.
Then make sure that all of the links coming off of the HTTPS pages are absolute HTTP links.
Your problem is probably relative links on the HTTPS pages getting spidered and staying HTTPS when the crawler comes off the secure pages onto what should be the HTTP pages, if that makes sense.
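Worth knowing: robots.txt is fetched per protocol, so Google treats https://www.example.com/robots.txt as a separate file from the HTTP one. A hedged sketch of serving a blocking file only over SSL (robots_ssl.txt is a made-up name):

```apache
# Hand secure requests a different robots file.
# robots_ssl.txt (hypothetical) would contain:
#   User-agent: *
#   Disallow: /
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ robots_ssl.txt [L]
```

One caveat: a page blocked in robots.txt can never show Googlebot its noindex tag, so for URLs you want dropped from the index, pick one signal or the other rather than stacking both.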
-
301'ing the HTTPS versions outright would not work, because people who belong on the HTTPS versions (because they have something in their cart) would be force-redirected to the non-HTTPS version.
I'm thinking that rel=canonical to the HTTP version, along with the robots.txt rules as you've suggested, may be the way to go.
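If you do go the rel=canonical route, it's just an absolute HTTP URL in the head of both protocol versions of each page; something like this, with example.com standing in for the real domain:

```html
<!-- Emitted on both http://www.example.com/widgets and
     https://www.example.com/widgets -->
<link rel="canonical" href="http://www.example.com/widgets" />
```

Same caveat as above, though: Google has to be able to crawl the HTTPS pages to see the canonical, so a blanket robots.txt block and rel=canonical work against each other.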
-
1. It can create duplicate content issues, and it's not good SEO practice.
2. You can 301 redirect all the HTTPS versions to the HTTP versions and apply meta robots noindex, follow to the handful of pages that need to be HTTPS.
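For completeness, the meta robots tag from point 2 would look like this on the secure templates (a sketch; which pages must stay HTTPS varies by site):

```html
<!-- On the handful of pages that must remain HTTPS (cart, checkout, account) -->
<meta name="robots" content="noindex, follow" />
```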