HTTPS Loss of Search Traffic
-
Hey guys,
We moved our site from HTTP to HTTPS and subsequently lost 25% of our search traffic in one month. We also changed a few other things at the same time, such as images, and added new content.
Has anyone got any suggestions on how we start to understand what happened?
Thanks in advance.
-
I would also set up your HTTPS site as a new property in Google Webmaster Tools and Bing Webmaster Tools (Bing handles this automatically, but you'll need to add it in GWT yourself). Once you have done that, make sure your sitemap is added and crawled again.
Get GWT to crawl the sitemap again from your non-HTTPS property too, so the redirects are recorded.
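For what it's worth, the resubmitted sitemap should list only the HTTPS URLs; a minimal sketch (example.com is a placeholder for your domain) would look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- every <loc> should use the https:// scheme after the move -->
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/products/</loc>
  </url>
</urlset>
```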
-
Hi
I can see that Tom and Ryan have done a great job of handling the most obvious issue that may have caused your site to drop after changing to HTTPS. You have also mentioned that you changed some significant on-page factors, which could definitely be the culprit.
However, I thought I would just add this: I saw someone move to HTTPS who implemented the correct redirects, registered both versions of the site in webmaster tools, and so on. They still saw a noticeable drop in their organic traffic, which turned out to be caused by a significant drop in site speed after moving to HTTPS.
You may have already checked, and your site may seem fast, but I would double-check the speed of your site with Google PageSpeed Insights and GTmetrix (another speed tool I like). If your site seems slow, look at what you can do to improve it with the recommendations from those tools (this will help your site regardless). Also speak to your host: explain that you are seeing a decrease in speed and ask whether they can help.
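If you want a quick spot check from the command line before reaching for those tools, curl can break the response time down; a rough sketch (www.example.com is a placeholder for your own domain):

```shell
# Rough timing of an HTTPS request; time_appconnect covers the TLS handshake,
# which is the extra cost HTTPS adds on top of plain HTTP.
# www.example.com is a placeholder -- substitute your own domain.
curl -s -o /dev/null \
  -w 'TLS handshake: %{time_appconnect}s  total: %{time_total}s\n' \
  https://www.example.com/ \
  || echo 'request failed (check your network)'
```

Running it a few times against both the old HTTP URL and the new HTTPS URL should show whether the handshake is where the extra latency is coming from.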
This tactic worked in this case and I just thought I would mention it as another angle to approach this.
Hope this helps!
-
Great! It sounds like you're on the right track, John, especially considering that some title and H1 changes took place. Cheers!
-
Hey Ryan,
Thanks for the answer and the link. We will submit our URL list to PingFarm. Thanks, guys, but we need to make sure all of this is in order.
-
Hi Tom,
Thanks a million for the great answer. I missed that in my description: we have added 301 redirects for all links but still saw a drop. My apologies for leaving that out initially.
We have also identified that a number of pages are missing H1s, and we have changed our title tags.
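A quick way to spot pages that are missing an H1 is to grep the rendered HTML. A minimal sketch (the sample markup here is made up for illustration; in practice you would feed it the output of something like `curl -s https://www.example.com/some-page`):

```shell
# Check a page's HTML for an <h1> tag.
# The sample markup is made up for illustration.
html='<html><body><p>No heading on this page</p></body></html>'
if printf '%s' "$html" | grep -qi '<h1'; then
  echo "H1 found"
else
  echo "H1 missing"   # this sample prints "H1 missing"
fi
```

Looping that over your sitemap URLs gives a fast inventory of which pages still need headings.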
-
To follow up on Tom's great response: even with the 301 redirects in place, you could still be experiencing the delay that is typical of redirection. Pinging the new URLs may help you speed up that process. To do so, create a list of the site's pages and submit them to PingFarm and 247pinger. Those services will help get the new site verified and the old URLs de-indexed.
Also, as a reference, here's Google's overview of transitioning to HTTPS: https://support.google.com/webmasters/answer/6073543. Cheers!
-
Hi John
The first thing I would ask in this situation is: did you 301-redirect the non-SSL (HTTP) versions of your pages to the SSL (HTTPS) versions?
If you didn't, that could be a big problem. First of all, you could run into duplicate content and canonicalization issues, as you would have two versions of each page battling each other for prominence. What I think would be an even bigger issue, though, is that if you didn't redirect the pages and didn't update any of your inbound links, the links pointing to your site might now point to inactive pages, and the link 'equity' your site had may now be lost.
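As a quick sanity check, each old HTTP URL should answer with a 301 status and a Location header pointing at the HTTPS version. A sketch of what to look for (the headers below are simulated for illustration; in practice you would capture them with `curl -sI http://www.example.com/some-page`):

```shell
# Simulated response headers from an old HTTP URL (made up for illustration);
# in practice: headers=$(curl -sI http://www.example.com/some-page)
headers='HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/some-page'

# The status line should say 301 (not a temporary 302),
# and Location should carry the https:// version of the page.
printf '%s\n' "$headers" | head -n 1 | grep -q '301' && echo "301 redirect in place"
printf '%s\n' "$headers" | grep -qi '^Location: https://' && echo "points at HTTPS"
```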
This might be a redundant point as you may have implemented the redirects - but it's always the first thing that I look out for.
If you're looking to redirect all of the HTTP pages, this bit of code is something you can add to your .htaccess file to 301-redirect every URL on your site (note the R=301 flag; without it, Apache issues a temporary 302 for an external redirect):
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule (.*) https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
Again, apologies if you have already done this, but if not, I hope it helps.
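Just to cover the other common setup: if your site happens to run nginx rather than Apache, a sketch of the equivalent redirect (www.example.com is a placeholder) looks like this:

```nginx
# Redirect all HTTP traffic to HTTPS with a permanent (301) redirect.
server {
    listen 80;
    server_name www.example.com;  # placeholder domain
    return 301 https://$host$request_uri;
}
```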