Google Indexing Of Pages As HTTPS vs HTTP
-
We recently updated our site to be mobile-optimized. As part of the update, we had also planned to add SSL security to the site. However, a lot of our pages embed an iframe from a third-party vendor for real estate listings, and that iframe is not SSL-friendly; the vendor doesn't have a solution for that yet. So those iframes weren't displaying their content.
As a result, we had to shift gears and revert to plain HTTP rather than the HTTPS we were hoping for.
However, Google seems to have indexed a lot of our pages as HTTPS, and those URLs give a security error to any visitor. The new site launched about a week ago, and there was code in the .htaccess file that was redirecting everything to www and HTTPS. I have fixed the .htaccess file so it no longer forces HTTPS.
My question is: will Google "reindex" the site once it recognizes the new .htaccess rules in the next couple of weeks?
-
That's not going to solve your problem, vikasnwu. Your immediate issue is that you have HTTPS URLs in the index, and searchers who click on them won't reach your site because of the security error warnings. The only way to fix that quickly is to get the SSL certificate and the redirect to HTTP in place.
You've sent the search engines a number of very conflicting signals. Waiting while they try to work out which URLs they're supposed to use, and then waiting while they reindex them, is likely to cause significant traffic issues and ongoing ranking harm before the SEs figure it out for themselves. The whole point of what I recommended is that it doesn't depend on the SEs figuring anything out - you will have provided directives that force them to do what you need.
Paul
-
Remember, you can request indexing through Google Search Console
-
Nice answer!
But you forgot to mention:
- Updating the sitemap files with the correct URLs (a minimal example below)
- Uploading them to Google Search Console
- You can even request indexing through Google Search Console
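For reference, here's a minimal sitemap sketch listing the HTTP URLs; the domain and paths are placeholders, not the actual site:
```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List only the canonical HTTP URLs you want indexed -->
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/listings/</loc>
  </url>
</urlset>
```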
Thanks,
Roberto
-
Paul,
I just provided a solution to de-index the HTTPS version. I understood that to be what was wanted, as they need their vendor to fix their end. And of course there is no way to noindex by protocol. I do agree with what you're saying.
Thanks a lot for explaining further and providing other ways to help solve the issue. I'm inspired by users like you to help others and make this a great community.
GR.
-
I'm first going to see what happens if I just upload a sitemap with the HTTP URLs, since there wasn't a sitemap in Webmaster Tools before. I'll give you an update then.
-
Great! I'd really like to hear how it goes when you get the switch back in.
P.
-
Paul, that does make sense - I'll add the SSL certificate back, and then redirect from HTTPS to HTTP via the .htaccess file.
-
You can't noindex a URL by protocol, Gaston - adding noindex would eliminate the page from being returned as a search result regardless of whether it's served over HTTP or HTTPS, essentially making those important pages invisible and wasting whatever link equity they may have. (You can't block by protocol in robots.txt either, in my experience.)
-
There's a very simple solution to this issue - and no, you absolutely do NOT want to artificially force removal of those HTTPS pages from the index.
You need to make sure the SSL certificate is still in place, then re-add the 301-redirect in the site's .htaccess file, but this time redirecting all HTTPS URLs back to their HTTP equivalents.
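A minimal .htaccess sketch of that redirect, assuming Apache with mod_rewrite enabled (you'd merge this with your existing www rules rather than copy it verbatim):
```apache
RewriteEngine On
# If the request arrived over HTTPS, 301-redirect to the HTTP equivalent
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```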
You don't want to forcibly "remove" those URLs from the SERPs, because they are what Google now understands to be the correct pages. If you remove them, you'll have to wait however long it takes for Google and other search engines to completely re-understand the conflicting signals you've sent them about your site. And traffic will inevitably suffer in that process. Instead, you need to provide standard directives that the search engines don't have to interpret and can't ignore. Once the search engines have seen the new redirects for long enough, they'll start reverting the SERP listings back to the HTTP URLs naturally.
The key here is the SSL cert must stay in place. As it stands now, a visitor clicking a page in the search engine is trying to make an HTTPS connection to your site. If there is no certificate in place, they will get the harmful security warning. BUT! You can't just put in a 301-redirect in that case. The reason for this is that the initial connection from the SERP is coming in over the "secure channel". That connection must be negotiated securely first, before the redirect can even be read. If that first connection isn't secure, the browser will return the security warning without ever trying to read the redirect.
Having the SSL cert in place even though you're not running all pages under HTTPS means that first connection can still be made securely; the redirect can then be read and followed to the HTTP URL, and the visitor will get to the page they expect seamlessly. Search engines, in turn, will be able to understand and apply authority without confusion.
Hope that all makes sense?
Paul
-
Nope, robots.txt works at the website level. This means there has to be one file for the HTTP site and another for the HTTPS site.
And there is no need to wait until the whole site is indexed. Just to clarify: robots.txt itself does not remove pages that are already indexed; it just blocks bots from crawling a website and/or specific pages within it.
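Since both protocols are usually served from the same document root, one common workaround is to rewrite requests for robots.txt to a protocol-specific file. A sketch, again assuming Apache with mod_rewrite (the robots_https.txt filename is hypothetical):
```apache
RewriteEngine On
# Serve a separate robots file to requests arriving over HTTPS
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ robots_https.txt [L]
```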
-
GR - thanks for the response.
Given our site is just 65 pages, would it make sense to just put all of the site's "https" URLs in the robots.txt file as "noindex" now rather than waiting for all the pages to get indexed as "https" and then remove them?
And then upload a sitemap to Webmaster Tools with the URLs as "http://"?
VW
-
Hello vikasnwu,
As what you're looking for is to remove those pages from the index, follow these steps:
- Allow the whole website to be crawlable in robots.txt
- Add the robots meta tag with the "noindex,follow" parameters (see the snippet below)
- Wait several weeks; 6 to 8 weeks is a fairly good window. Or just keep following up on those pages
- Once you get the results you want (all the desired pages de-indexed), re-block those pages with robots.txt
- DO NOT erase the meta robots tag.
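For reference, the tag goes in the head of each HTTPS page you want de-indexed:
```html
<!-- Lets engines follow the page's links but asks them to drop the page from the index -->
<meta name="robots" content="noindex,follow">
```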
Remember that http://site.com and https://site.com are different websites to Google.
When your vendor's HTTPS problem is fixed and the site is ready to move to HTTPS, follow these steps:
- Allow the whole website (or the parts you want indexed) to be crawlable in robots.txt
- Remove the robots meta tag
- 301-redirect HTTP to HTTPS (sketch below)
- Sit and wait.
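That redirect is just the mirror of the HTTPS-to-HTTP rule shown earlier in the thread; a minimal .htaccess sketch, again assuming Apache with mod_rewrite:
```apache
RewriteEngine On
# Once everything serves correctly over HTTPS, send HTTP traffic to the HTTPS equivalent
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```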
Information about the redirection to HTTPS and a cool checklist:
The Big List of SEO Tips and Tricks for Using HTTPS on Your Website - Moz Blog
The HTTP to HTTPs Migration Checklist in Google Docs to Share, Copy & Download - AleydaSolis
Google SEO HTTPS Migration Checklist - SERoundtable
Hope I'm helpful.
Best of luck.
GR.