HTTPS Certificate Expired. Website's https URLs Are Still in the Index
-
Hi Guys
This week the security certificate of our website expired, and we basically have to wait until next Tuesday for it to be reinstated.
Our site is indexed with its https URLs, but we had to drop https so that visitors wouldn't be faced with the security warning screen most browsers show when they consider a certificate untrusted.
So now we are basically sitting with http (www) URLs only.
My question is: what should we do to prevent Google from penalizing us? If Googlebot comes to crawl the https URLs, there will be nothing there.
I did resubmit the site to Google for crawling, but I guess it's going to take time before Google picks up that we now only want the www URLs in the index.
Can somebody please give me some advice on this?
Thanks
Dave
-
My guess would be the person in charge of procuring and/or installing the cert took a day of vacation today and Monday things are closed for the Labor Day holiday... just guessing : )
That or just working it into their schedule. Sometimes at larger companies the people managing the website for SEO/Content/etc. are not the same people managing things on the back-end.
-
Why is it going to take 4 days for them to fix your SSL? That's the question I would want answered in your position. SSL certificates are easy to replace so what's the holdup here?
-
Hi Dave,
If I were in your shoes, I'd set up a rule to 302 redirect all of your https pages to their http equivalents until Tuesday, when you get your cert.
This will mean that anyone who clicks one of your https pages in Google's index is brought to the appropriate page rather than a security warning.
The 302 will also tell Google, "Hey, this is just a temporary redirect... don't worry about indexing things differently." Because if Google indexes all of the http versions and you don't 301 them back once the cert is restored, you'll end up with a ton of 404 errors. Which is fine... but it gets messy fast.
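If the site runs on Apache, that temporary redirect can be sketched in `.htaccess` roughly like this (assuming mod_rewrite is enabled; `www.example.com` is a placeholder for your hostname):

```apache
# Temporarily (302) send every https request to its http equivalent
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://www.example.com/$1 [R=302,L]
```

One caveat: the redirect is only served after the TLS handshake, so a visitor who clicks an https result may still see the certificate warning once before landing on the http page. Once the new cert is installed, remove the rule (or reverse it as a 301 back to https).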
Hope that helps.
Mike
Related Questions
-
Is something wrong with my index after https switch?
I have switched sites to https before, but this one is behaving a little differently. On September 19th I switched to https. I did 301 redirects in .htaccess, added the https property to Search Console, and, since we are using Magento, changed the base URL. In the past when I have done this, the http site's index count gradually drops while the https site's gradually rises. In early October the http count started to drop slightly, but since 10/23 there have been no changes. The https count increased until 10/23, then stayed flat. Why have they been stuck like that for a month?
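For reference, the site-wide 301 described above is usually a single rule pair in `.htaccess` (a sketch, assuming Apache with mod_rewrite; `www.example.com` is a placeholder):

```apache
# Permanently (301) send every http request to its https equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

With a rule like this in place, the http index count typically keeps shrinking only as Google recrawls each old URL on its own schedule, which can take months for deeper pages.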
Intermediate & Advanced SEO | Tylerj0 -
Is there a way to noindex/nofollow sections of a page to avoid duplicate content issues?
I'm working on an event-related site where every blog post starts with an introductory header about the event and then ends with a call to action giving info about the registration deadline. I'm wondering if there is anything we can and should do to avoid duplicate content penalties. Should these sections go in a widget, or is there some way to noindex/nofollow a section of text? Thanks!
Intermediate & Advanced SEO | Spiral_Marketing0 -
Removing Parameterized URLs from Google Index
We have duplicate eCommerce websites, and we are in the process of implementing cross-domain canonicals. (We can't 301; both sites are major brands.) So far this is working well, and rankings are improving dramatically in most cases. However, in some cases Google has indexed a parameterized page on the site being canonicaled (the site that carries the canonical tag, the "from" site). When this happens, both sites are ranked, and the parameterized page appears to be blocking the canonical. The question is: how do I remove canonicaled pages from Google's index? If Google doesn't crawl the page in question, it never sees the canonical tag, and we still have duplicate content. Example: A. www.domain2.com/productname.cfm?clickSource=XSELL_PR is ranked at #35, and B. www.domain1.com/productname.cfm is ranked at #12. (Yes, I know the upper case is bad. We fixed that too.) Page A has the canonical tag, but page B's rank didn't improve. I know there are no guarantees that it will improve, but I am seeing a pattern: page A appears to be preventing Google from passing link juice via the canonical, because if Google doesn't crawl page A, it can't see the rel=canonical tag. We likely have thousands of pages like this. Any ideas? Does it make sense to block the "clickSource" parameter in GWT? That kind of scares me.
Intermediate & Advanced SEO | AMHC0 -
Redesigned Website
Hi, I have redesigned my website in HTML, whereas it was in .asp earlier. I have resubmitted my Google sitemap, but search still shows the old site's pages, except for the home page. My question is how I can update my web presence quickly, and how I can keep the benefit of my .asp pages' rankings. In addition, the .asp site is still live. What should the strategy be: should I remove the old site, or redirect all of its pages to the new one? If I set up 301 redirects, will they cause any SEO or ranking issues? Thx
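For the .asp-to-HTML move described above, one common approach is an extension-based 301 in `.htaccess` (a sketch, assuming Apache with mod_rewrite and that each old .asp page has an .html equivalent at the same path):

```apache
# 301 every old .asp URL to its .html counterpart at the same path
RewriteEngine On
RewriteRule ^(.+)\.asp$ /$1.html [R=301,L]
```

A 301 passes most of the old pages' ranking signals to the new URLs, which is why redirecting is generally preferred over simply removing the old site.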
Intermediate & Advanced SEO | 1akal0 -
How to Index Faster?
Hello, I have a new website and I update it with fresh content regularly, but my indexing is very slow. When I searched for ways to improve my indexing rate on Google, I found that most members of the Moz community replied that there is no certain technique: apart from submitting a sitemap and sharing posts on Twitter, Facebook and Google Plus, you should just keep posting fresh content and wait for Google to index it. However, those comments are from 2012. I'm curious to know whether any new techniques or methods are now used to improve indexing rate. I need your suggestions! Thanks.
Intermediate & Advanced SEO | TopLeagueTechnologies0 -
Google is Really Slow to Index my New Website
(Sorry for my English!) A quick background: I had a website at thewebhostinghero.com which had been slapped left and right by Google (both Panda & Penguin). It also had a manual penalty for unnatural links, which was lifted in late April / early May this year. I also had another domain, webhostinghero.com, which was redirecting to thewebhostinghero.com. When I realized I would be better off starting a new website than trying to salvage thewebhostinghero.com, I removed the redirection from webhostinghero.com and started building a new website. I waited about 5 or 6 weeks before putting any content on webhostinghero.com, so Google had time to notice that the domain wasn't redirecting anymore. So about a month ago, I launched http://www.webhostinghero.com with 100% new content, but I left thewebhostinghero.com online because it still brings a little (necessary) income. There are no links between the websites except on one page (www.thewebhostinghero.com/speed/), which is set to "noindex,nofollow" and is disallowed to search engines in robots.txt. I made sure that page was deindexed before adding a "nofollow" link from thewebhostinghero.com/speed => webhostinghero.com/speed. Since the new website launched, I've been publishing new content (2 to 5 posts) daily. It's getting some traction from social networks, but it gets barely any clicks from Google search. It seems to take at least a week before Google indexes new posts, and not all posts are indexed. The cached copy of the homepage is 12 days old. In Google Webmaster Tools, it looks like Google isn't getting the latest sitemap version unless I resubmit it manually; it's always 4 or 5 days old. So is my website just too young, or could it have some kind of penalty related to the old website? The domain has 4 or 5 really old spammy links from the previous domain owner which I couldn't get rid of, but otherwise I don't think there's anything tragic.
Intermediate & Advanced SEO | sbrault740 -
Why are the archive sub-pages still indexed by Google?
Why are the archive sub-pages still indexed by Google? I am using WordPress SEO by Yoast and selected the option needed to set these pages to noindex in order to avoid duplicate content.
Intermediate & Advanced SEO | MichaelNewman1 -
Regional websites
Hi, I run 4 websites: London, New York, Singapore and Dubai. Same company, but some of our products are different in each region. Each domain is registered in the relevant region, and I have Google Webmaster Tools set so they know the location of each website. The problem is that our Dubai and US websites are appearing higher than the UK website in google.co.uk organic results. Does anyone have any ideas why? Thanks
Intermediate & Advanced SEO | markc-1971830