Google Indexed the HTTPS version of an e-commerce site
-
Hi, I'm working with a new e-commerce site. The way it's set up, once you add an item to the cart you're kept on secure HTTPS versions of the pages as you continue to browse.
Well, somehow this translated into Google indexing the whole site as HTTPS, even the home page. A couple of questions:
1. I assume that's bad, could hurt rankings, or at a minimum isn't best practice for SEO, right?
2. Assuming it's something we don't want, how would we go about getting the HTTP versions of pages indexed instead of the HTTPS ones? Do we need rel=canonical on each page pointing to the HTTP version? Anything else that would help?
Thanks!
-
Let people be redirected to the non-HTTPS versions; what's the problem here? They won't lose items from their basket when redirected from the HTTPS to the HTTP version. And when they're checking out, the connection will remain secure via SSL, since the pages that need to be HTTPS won't redirect to non-HTTPS.
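A rough sketch of that selective redirect, assuming an Apache server with mod_rewrite, and assuming the SSL-only pages live under /cart and /checkout (those paths and example.com are placeholders, not your actual setup):

```apache
# In the site's .htaccess -- /cart, /checkout and example.com are
# hypothetical; adjust to the pages that genuinely need SSL.
RewriteEngine On

# If the request came in over HTTPS and is NOT a cart/checkout page,
# 301 it back to the plain-HTTP equivalent.
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/(cart|checkout) [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Note the second RewriteCond is what keeps shoppers on the secure pages that actually need it.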
-
Hi Irving, thanks for your reply. That all makes sense to me except "noindex meta tag them". They are the same product pages whether they are HTTPS or HTTP, so I can't put 'noindex' on the HTTPS page exclusively...
Or are you suggesting that I write some conditional code so that if the HTTPS version is called, it inserts a 'noindex'?
Is there a reason nobody is suggesting rel=canonical to the HTTP version?
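For what it's worth, the conditional noindex doesn't have to live in your page templates at all. On Apache 2.4+ with mod_headers, you can send it as an X-Robots-Tag response header only for HTTPS requests; a minimal sketch, assuming that server setup:

```apache
# Send "noindex, follow" only when the page is served over HTTPS,
# leaving the HTTP version indexable.
# Requires Apache 2.4+ (<If> expressions) and mod_headers.
<If "%{HTTPS} == 'on'">
    Header set X-Robots-Tag "noindex, follow"
</If>
```

Search engines treat X-Robots-Tag the same as a meta robots tag, so the shared templates never need to change.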
-
Block the HTTPS pages in robots.txt and noindex meta tag them.
Then make sure that all of the links coming off of the HTTPS pages are absolute HTTP links.
Your problem is probably relative links on the HTTPS pages getting spidered and staying HTTPS as the crawler moves off the secure pages onto the HTTP pages, if that makes sense.
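One caveat on the robots.txt part: a single robots.txt file can't tell HTTP and HTTPS apart, since each protocol fetches its own copy from the same path. A common workaround, assuming Apache with mod_rewrite (the robots_ssl.txt filename is just an example), is to serve a separate file on the secure side:

```apache
# Serve a different robots file for HTTPS requests only.
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ robots_ssl.txt [L]
```

Then robots_ssl.txt blocks the secure host entirely, while the normal robots.txt stays open:

```
# robots_ssl.txt -- returned only for https://.../robots.txt
User-agent: *
Disallow: /
```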
-
301'ing the HTTPS versions would not work, because people who belong on the HTTPS versions (because they have something in their cart) would be force-redirected to the non-HTTPS version.
I'm thinking that rel=canonical to the HTTP version, along with the robots.txt rules you've suggested, may be the way to go.
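For the record, the canonical tag would simply point both copies of a page at the HTTP one, something like this (example.com and the path are placeholders):

```html
<!-- In the <head> of BOTH the HTTP and HTTPS copies of the page,
     pointing at the HTTP version as the canonical one. -->
<link rel="canonical" href="http://www.example.com/products/blue-widget" />
```

Since it ships on both versions, it works without any conditional logic; the HTTP page just canonicals to itself, which is harmless.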
-
1. It can create duplicate content issues, and it's not good SEO practice.
2. You can 301 redirect all the HTTPS versions to the HTTP versions and apply a meta robots 'noindex, follow' to the handful of pages that need to stay HTTPS.
Related Questions
-
URL Structure For E-commerce Sites
Hi Guys, I was wondering what would be the optimal URL structure for sub-categories on an e-commerce site for SEO purposes. For example, if my category was dresses and I had multiple sub-categories within dresses, would 1 or 2 below be the better URL structure?
1) Domain + Category + Sub-Category: Sleeveless Dresses URL: clothingstore.com/dresses/sleeveless-dresses; Midi Dresses URL: clothingstore.com/dresses/midi-dresses
2) OR would excluding the category be better, Domain + Sub-Category: Sleeveless Dresses URL: clothingstore.com/sleeveless-dresses; Midi Dresses URL: clothingstore.com/midi-dresses
Do you think it makes much of a difference? Is shorter better and more effective in this case? E.g. Rand discusses in this article: https://moz.com/blog/15-seo-best-practices-for-structuring-urls that having the keyword in the URL serves as anchor text, so wouldn't having additional keywords dilute value in this case? Plus he mentions the shorter the URL the better. Cheers, Chris
Intermediate & Advanced SEO | jayoliverwright
-
Disavow both www. and non www. version of site?
I just submitted my disavow file to Google after several months of work. A few months ago I saw a partial-match unnatural-link penalty in the www version of my site's Search Console account. The penalty has since expired. Should I also upload the file to the non-www side of the Search Console account? No penalty ever appeared there.
Intermediate & Advanced SEO | pajamalady
-
Blocking Certain Site Parameters from Google's Index - Please Help
Hello, so we recently used Google Webmaster Tools in an attempt to block certain parameters on our site from showing up in Google's index. One of our site parameters is essentially for user location and accounts for over 500,000 URLs. This parameter does not change page content in any way, and there is no need for Google to index it. We edited the parameter in GWT (Google Webmaster Tools > Crawl > URL Parameters) to tell Google that it does not change site content and to not index it. However, after two weeks, all of these URLs are still definitely getting indexed. Why? Maybe there's something we're missing here. Perhaps there is another way to do this more effectively. Has anyone else run into this problem? Thank you in advance for your help!
Intermediate & Advanced SEO | Jbake
-
Why are bit.ly links being indexed and ranked by Google?
I did a quick search for "site:bit.ly" and it returns more than 10 million results. Given that bit.ly links are 301 redirects, why are they being indexed in Google and ranked according to their destination? I'm working on a similar project to bit.ly and I want to make sure I don't run into the same problem.
Intermediate & Advanced SEO | JDatSB
-
302 Redirect of www. version of a site - Pros/Cons
Hi, I am considering changing the 301 redirect on the domain to a 302 temporary redirect. Currently, if a user lands on the "www" version of the site, they are redirected to the non-"www" version. But after the change, they will land on an external webpage (so if a user lands on the "www" version, they are redirected to a different website, not related to my domain). The reason I'm considering this is that I have received a large number of spammy backlinks on the "www" version of the site (negative SEO). So I'm hoping that the temporary redirect will help me recover. Your thoughts on this: 1. Is this the best way to do a 302 redirect (redirecting the www version to an external domain)? 2. Will the redirect help the main domain recover, considering all the spammy backlinks are pointing to the www version? 3. What are the pros/cons, if any? Thanks in advance for your thoughts! Howard
Intermediate & Advanced SEO | howardd
-
Google penalized site--307/302 redirect to new site-- Via intermediate link—New Site Ranking Gone..?
Hi, I have a site that Google had placed a manual link penalty on; let's call this our company site. We tried and tried to get the penalty removed, and finally gave up and purchased another domain name. It was our understanding that we could safely use either a 302 or 307 temporary redirect to send people from our old domain to our new one. We put this in place several months ago and everything seemed to be going along well. Several days ago I noticed that our root domain had dropped for our selected keyword from position 9 to position 65. Looking into GWT under "Links to Your Site", I found many, many links from our old, Google-penalized domain name pointing to our new root domain, each with the sub-heading "Via this intermediate link -> [our old, penalized domain name]". In light of all of this, I have removed the 307/302 redirect and brought the old penalized site back, which now consists of a basic "we've moved" page linked to our new site using rel='nofollow'. I am hoping that our new domain has probably not received a manual penalty and has most likely received some sort of algorithmic penalty, and that these "intermediate links" will soon disappear because I'm no longer doing the 302/307 from the old site to the new. Do you think this is the case, or do I now have a new manual penalty placed on the new domain name? I would very much appreciate any comments and/or suggestions as to what I should or can do to get this fixed. I still need to keep the old domain name, as this address was printed on business cards many years ago. Also, on a side note, some of the sub-pages of the new root domain are still ranking very well; it's only the root domain that is now ranking awfully. Thanks,
Intermediate & Advanced SEO | Robdob2013
-
Getting Google to index MORE per day than it does, not with greater frequency nec.
Hi, the Googlebot seems to come around healthily; every day we see new pages that we wrote the week before get ranked. However, if we are adding 12-15 new products/blog entries/content bits each day, only about 2-3 ever get indexed per day, and so, after a few weeks, this builds up to quite a time lag. Is there any way to help step up the number of new pages that get indexed every day? It really will take only 2 or 3 each day, no more than that, which seems strange. We're fairly new, around 6 months creating content, but the domain name is 18 months old. Will this simply improve over time, or can something be done to help Google index those pages? We don't mind if the 15 we do on Monday all get indexed the following Monday, for example.
Intermediate & Advanced SEO | xoffie
-
My sitelinks have gone from mega sitelinks to several small links under my SERP results in Google. Any ideas why?
A site I have currently had the mega sitelinks in the SERP results. Recently Google has updated them to the smaller four inline links under my SERP result. Any idea what happened, or how I can correct this?
Intermediate & Advanced SEO | POSSIBLE