Google cached https rather than http
-
Google is using a secure version of a page (https) that is meant to be displayed using only http. I don't know of any links to the page using https, but want to verify that. I only have 1 secure page on the site and it does not link to the page in question.
What is the easiest way to nail down why Google is using the https version?
-
Great answer! I'm a bit behind on using canonical tags, but that makes perfect sense. Thanks so much!
-
Google adds pages based on following links. If your site offers a link to the page, or even if any other site offers a link, then it can be crawled and potentially indexed.
If you do not wish the page to be listed, you have a few options. You can canonicalize the page to point to the non-secure version. For example, let's assume you have the page http://www.mysite.com/info.html, and that you also have a secure (https) version of the same page. On both pages I would recommend adding the following code:
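Assuming the example URL above, the canonical link element would look like this, placed in the head of both the http and https versions of the page:

```html
<!-- Place inside the <head> of BOTH the http and https versions -->
<link rel="canonical" href="http://www.mysite.com/info.html" />
```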
That code tells Google the http:// version of the page is the preferred version, and the other page is a duplicate.
If you do not have a duplicate-page issue and you simply don't wish a page to be listed in the SERPs, then add a "noindex" tag to the page.
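For example, a robots meta tag in the page's head (a sketch; the same directive can also be sent via an X-Robots-Tag HTTP header):

```html
<!-- Tells search engines not to include this page in their index -->
<meta name="robots" content="noindex">
```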
What is the easiest way to nail down why Google is using the https version?
Google will list any pages it finds. That's their role. Unless you specifically give them a reason not to index a page (such as a canonical tag, a noindex tag, or a robots.txt block), they will list it. They have no way of knowing you wanted the http:// version listed unless you tell them.
To fix the issue, you can add the canonical and the next time they check the page they will update their data. It may take a bit of time depending on the size of your site.
Related Questions
-
Why does my Google Web Cache Redirects to My Homepage?
Why does my Google web cache appear for a short period of time and then automatically redirect to my homepage? Is there something wrong with my robots.txt? The only files that I have blocked are below:
User-agent: *
Disallow: /bin/
Disallow: /common/
Disallow: /css/
Disallow: /download/
Disallow: /images/
Disallow: /medias/
Disallow: /ClientInfo.aspx
Disallow: /*affiliateId*
Disallow: /*referral*
Technical SEO | Francis.Magos
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that for one client the Site Map section is reporting 95 pages submitted yet only 2 indexed (last time I looked, last week, it was reporting the expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, the 'Index Status' is showing a graph of growing indexed pages up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section is showing 186 pages crawled on the 26th.

It then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site is HTTPS and has been for a few months now, and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml

The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml

There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below:

"When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."

Also, for the below sitemap URLs: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for:
http://domain.com/en/post-sitemap.xml
AND https://www.domain.com/page-sitemap.xml
AND https://www.domain.com/post-sitemap.xml

I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems.

What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master indexed URL is an HTTPS URL I can't see it making any difference until the HTTP aspects are deleted/removed. But how do you do that, or even check that that's what's needed? Or should Google just sort this out eventually?

I see the graph in 'Crawl > Sitemaps > Web Pages' is showing a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so. So fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!?

Many thanks,
Dan
Technical SEO | Dan-Lawrence
-
How can I get Google to forget an https version of one page on my site?
Google mysteriously decided to index the broken https version of one page on my company's site (we have a cert for the site, but this page is not designed to be served over https and the CSS doesn't load). The page already has many incoming links to the http version, and it has a canonical URL specifying http. I resubmitted it on http with Webmaster Tools. Is there anything else I could do?
Technical SEO | BostonWright
-
Google showing a Cached option but then giving a 404
Two weeks ago my home page, plus some others, had a 301 redirect to another domain for about a week (due to a hack). The original pages were then de-indexed, and the new bad domain was indexed and in effect stole my rankings. Then the 301 was removed/cleaned from my domain, and the bad domain was fully de-indexed via a request I made (this was a week ago). Then my pages came back into the index, but without any ranking power. Now when I perform a search for my domain, my home page is listed with an option to view the cache. Clicking on the cache brings up a 404 error. So why is Google showing the cached option when it doesn't have the cached file? How do I get Google to properly update its cache or show a cached copy?
Technical SEO | Dantek
-
Http & https canonicalization issues
Howdyho! I'm SEOing a daily deals site that mostly runs on https (only the home page is on http). I'm wondering what to do for canonicalization. IMO it would be easiest to run all pages on https, but the scarce resources I can find are not so clear. For instance, this YouMoz blog post claims that https is only for humans, not for bots! That doesn't really apply anymore, right?
Technical SEO | zeepartner
-
Another http vs https Question?
Is it better to keep only the transaction/payment pages on a commercial website secure (https) and the remainder of the website on http? Or is it better to have the entire commercial website secure (https)?
Technical SEO | sherohass
-
Having trouble removing homepage from google
For various reasons my client wants their homepage removed from Google: not just the content taken off the page, but the page itself not indexed (yep, strange request, but we are mere service providers). Today I requested in Webmaster Tools that default.asp be removed. WMT says done, but the site's homepage is still listed. The page also has a noindex tag on it, but 24 hours and 18k Googlebot hits later it still remains. Anyone got any other suggestions to de-index just the homepage ASAP, please?
Technical SEO | Grumpy_Carl
-
What's the best way to deal with an entire existing site moving from http to https?
I have a client that just switched their entire site from the standard unsecure (http) to secure (https) because of over-zealous compliance requirements for protecting personal information in the health care realm. They currently have the server set up to 302 redirect from the http version of a URL to the https version. My first inclination was to have them simply update that to a 301 and be done with it, but I'd prefer not to have to 301 every URL on the site. I know that putting a rel="canonical" tag on every page that refers to the http version of the URL is a best practice (http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=139394), but should I leave the 302 redirects or update them to 301s? Something seems off to me about the search engines visiting an http page, getting 301 redirected to an https page, and then being told by the canonical tag that it's actually the URL they were just 301 redirected from.
Technical SEO | JasonCooper
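If the 301 route is chosen, a minimal sketch of a permanent http-to-https redirect for Apache (assuming mod_rewrite is enabled and an .htaccess-based setup; this is illustrative, not the poster's actual configuration):

```apache
# Permanently (301) redirect every http request to its https equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```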