HTTP to HTTPS Transition, Large Drop in Search Traffic
-
My URL is: https://www.seattlecoffeegear.com

We implemented HTTPS across the site on Friday. Saturday and Sunday, search traffic was normal/slightly higher than normal (in Analytics) and slightly down in GWT. Today, it has dropped significantly in both, to about half of normal search traffic. From everything we can see, we implemented this correctly:
- 301 redirected all http requests to https (and yes, they go to the correct page and not to the homepage)
- Rewrote hardcoded internal links
- Registered/submitted sitemaps from https in Bing and GWT
- Used fetch and render to ensure Google could reach the site and also was redirected appropriately from http to https versions
- Ensured robots.txt does not block https or secure
We also use a CDN (though I don't think that impacts anything) and have had no customer issues with accessing or using the website since the transition.

Is there anything else I might be missing that could correlate with a drop in search impressions, or is this just a waiting game of a few days to let Google sort through the change we've made and reindex everything? (It dropped to 0 indexed pages for a day and is now up to 1,744 of our 2,180 pages indexed.)

Thank you so much for any input!

Kaylie
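For anyone auditing a similar migration, the redirect check in the bullets above can be scripted: every http URL should 301 to its exact https counterpart, not to the homepage. A minimal sketch in Python (the URLs and helper names here are illustrative, not the poster's actual setup):

```python
from urllib.parse import urlsplit, urlunsplit

def expected_https_target(http_url):
    """Return the https URL that an http request should 301 to."""
    parts = urlsplit(http_url)
    return urlunsplit(("https",) + tuple(parts[1:]))

def redirect_is_correct(http_url, status, location):
    """True only for a 301 that lands on the matching https page
    (not, e.g., a blanket redirect to the homepage)."""
    return status == 301 and location == expected_https_target(http_url)

# Example: a per-page redirect passes, a homepage catch-all fails.
print(redirect_is_correct("http://www.example.com/grinders", 301,
                          "https://www.example.com/grinders"))   # True
print(redirect_is_correct("http://www.example.com/grinders", 301,
                          "https://www.example.com/"))           # False
```

Feeding this a crawl export (status code plus `Location` header per URL) makes it easy to spot any stray 302s or catch-all redirects.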
-
Thank you for the reminder! I believe these have all been switched over, but I'll give it another look to be sure!
Kaylie
-
An important thing to remember if you are running AdWords to your site is to make sure that your AdWords ads also point to the https URL.
Also, pay attention when you create new AdWords ads, as the default setting is http.
-
Hmmm, I don't appear to have that icon. Perhaps because sampling is not occurring on my report?
I actually felt that the transition went really well, which is why I was surprised by the data. I have a feeling, however, that I indeed just need to give it a few more days and keep checking the traffic/search information.
I'll keep you posted on how things pan out over the next week or so!
-
For Webmaster Tools, you are only getting partial data from the 27th right now. I'd monitor WT for the first full week post-switch and compare your data then.
Also, for Google Analytics, that 4% difference may be due to having only partial data from today. So you may need to wait a bit longer for a more accurate comparison there. That said, 4% difference in traffic isn't alarming for the amount of time you're digging into. You're probably overly sensitive after the switch. I'm sure I'd be.
One thing you can do in GA is ask for higher precision on the reporting. Click the weird icon next to the graduation cap in the top right and slide the toggle under "Control the number of sessions used to calculate this report" toward Higher Precision.
I'm really interested in your experience because we're mapping out a move to https, too.
-
Thanks for the response!
Sorry about the timeout error. Not sure what happened there either. I have been unable to replicate it.
You were partially right about the dates.
So now I am looking at the same date range for Analytics and GWT (21st-27th), and I see a huge "drop" on the 27th in GWT. However, if I extend GA through today, I see a minor drop on Friday the 26th, and then it bumps back up to normal levels. Is there a chance that GWT is reporting search/clicks from partway through the 27th rather than the full day of data, creating a false sense of alarm?
I wanted to see a larger set of data that might tell a more complete story. So, looking in GA, I only see a change of 5% in traffic when comparing the 23rd-29th with the 16th-22nd. That is a 4% decrease in Google organic, but I don't feel like these numbers are the cause for alarm that GWT graphs initially indicate. Thoughts?
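The week-over-week comparison described here is simple enough to sanity-check in a few lines. A sketch with made-up daily session counts (the real numbers are the poster's, not shown):

```python
def pct_change(current, previous):
    """Week-over-week change, as a signed percentage."""
    return (current - previous) / previous * 100

# Hypothetical daily organic sessions for the two weeks being compared.
week_before = [1200, 1150, 1180, 1220, 1300, 900, 850]   # 16th-22nd
week_after  = [1190, 1100, 1050, 1210, 1280, 880, 840]   # 23rd-29th

change = pct_change(sum(week_after), sum(week_before))
print(f"{change:+.1f}%")  # -3.2% for this sample data
```

Comparing full weeks like this smooths out day-of-week effects and the partial-day reporting that makes single-day GWT graphs look alarming.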
Thanks for your input!
Kaylie
-
I got a 504 error when I first tried to load your site (https). Then I loaded the http version but was redirected to the HTTPS version after some time. Not sure what's going on there; I could not repeat the error.
Also, Webmaster Tools data is about two days delayed. Are you sure you're comparing the same days in both applications? (Not an uncommon mistake for me to make.)
Related Questions
-
SEO traffic to the homepage is down across sites
Week over week, I've noticed that organic traffic (and oftentimes revenue) for the homepage is down across most of our sites compared to last year. Brand search interest is down for a number of the brands, but in a lot of these cases, it's not down so much that it would make sense for how much the homepage is down (for example: brand search interest was down 4% last week compared to last year, but the homepage traffic was down 32% in visits). What I've done is generate entry page reports (this year vs. last year) and then bucket the pages by homepage, category pages, and product pages. In most cases, category pages are up year over year for traffic and revenue. I'm concerned that the homepage being down is more than a brand heat issue, but I haven't come across anything out of the ordinary in Google Search Console and keywords are pretty consistent in performance for the most part. Branded keywords continue to rank at #1, too. Any thoughts as to what else I can look into?
Technical SEO | | WWWSEO0 -
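The entry-page bucketing described in that question is easy to automate once you export the report. A hypothetical sketch (the URL patterns are assumptions; real category/product paths differ per site):

```python
def bucket_entry_page(path):
    """Classify an entry page as homepage, category, or product.
    The path conventions here are hypothetical."""
    if path in ("/", "/index.html"):
        return "homepage"
    if path.startswith("/category/"):
        return "category"
    if path.startswith("/product/"):
        return "product"
    return "other"

# Hypothetical (entry page, visits) rows from an analytics export.
report = {}
for path, visits in [("/", 3400), ("/category/shoes", 1200),
                     ("/product/red-sneaker", 260), ("/about", 90)]:
    bucket = bucket_entry_page(path)
    report[bucket] = report.get(bucket, 0) + visits

print(report)  # {'homepage': 3400, 'category': 1200, 'product': 260, 'other': 90}
```

Running this for both years and diffing the bucket totals shows quickly whether the decline is really concentrated on the homepage.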
Why has my search traffic suddenly tanked?
On 6 June, Google search traffic to my Wordpress travel blog http://www.travelnasia.com tanked completely. There are no warnings or indicators in Webmaster Tools that suggest why this happened. Traffic from search has remained at zero since 6 June and shows no sign of recovering. Two things happened on or around 6 June. (1) I dropped my premium theme, which was proving not to be mobile-friendly, and replaced it with the ColorMag theme, which is responsive. (2) I relocated off my previous hosting service, which was showing long server lag times, to a faster host. Both of these should have improved my search performance, not tanked it. There were some problems with the relocation to the new web host which resulted in a lot of "out of memory" errors on the website for 3-4 days. The allowed memory was simply not enough for the complexity of the site and the volume of traffic. After a few days of trying to resolve these problems, I moved the site to another web host which allows more PHP memory, and the site now appears reliably accessible for both desktop and mobile. But my search traffic has not recovered. I am wondering if, in all of this, I've done something that Google considers to be a cardinal sin and I can't see it. The clues I'm seeing include:
- Moz Pro was unable to crawl my site last Friday. It seems like every URL it tried to crawl was of the form http://www.travelnasia.com/wp-login.php?action=jetpack-sso&redirect_to=http://www.travelnasia.com/blog/bangkok-skytrain-bts-mrt-lines, which resulted in a 500 status error. I don't know why this happened, but I have disabled the Jetpack login function completely, just in case it's the problem.
- GWT tells me that some of my resource files are not accessible by GoogleBot due to my robots.txt file denying access to /wp-content/plugins/. I have removed this restriction after reading the latest advice from Yoast, but I still can't get GWT to fetch and render my posts without some resource errors.
- On 6 June, I see in the Structured Data section of GWT that "items" went from 319 to 1478 and "items with errors" went from 5 to 214. There seems to be a problem with both hatom and hcard microformats, but when I look at the source code they seem to be OK. What I can see in GWT is that each hcard has a node called "n [n]" which is empty, and Google is generating a warning about this. I see that this is because the author vcard URL class now says "url fn n", but I don't see why it says this or how to fix it. I also don't see that this would cause my search traffic to tank completely.
I wonder if anyone can see something I'm missing on the site. Why would Google completely deny search traffic to my site all of a sudden without notifying any kind of penalty? Note that I have NOT changed the content of the site in any significant way. And even if I did, it's unlikely to result in a complete denial of traffic without some kind of warning.
Technical SEO | | Gavin.Atkinson1 -
How to handle pagination for a large website?
I am currently doing a site audit on a large website that just went through a redesign. When looking through their webmaster tools, they have about 3,000 duplicate Title Tags. This is due to the way their pagination is set up on their site. For example: domain.com/books-in-english?page=1 // domain.com/books-in-english?page=4. What is the best way to handle these? According to Google Webmaster Tools, a viable solution is to do nothing because Google is good at distinguishing these. That said, it seems like there could be a better solution to help prevent duplicate content issues. Any advice would be much welcomed. 🙂
Technical SEO | | J-Banz0 -
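One common pattern for paginated listings like the example in that question is a self-referencing canonical on each page plus rel prev/next hints. A sketch of generating those tags (the URL shape mirrors the question's example; whether prev/next is still honored by a given engine is a separate question):

```python
def pagination_tags(base, page, last_page):
    """Build <link> tags for one page of a paginated listing.
    A self-canonical keeps each ?page=N page distinct (avoiding
    duplicate-title collapse); prev/next hint the sequence."""
    url = lambda p: f"{base}?page={p}"
    tags = [f'<link rel="canonical" href="{url(page)}">']
    if page > 1:
        tags.append(f'<link rel="prev" href="{url(page - 1)}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{url(page + 1)}">')
    return tags

for tag in pagination_tags("https://domain.com/books-in-english", 4, 12):
    print(tag)
```

Pairing this with page-numbered titles ("Books in English - Page 4") clears the duplicate Title Tag warnings without hiding the deeper pages.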
Http to https - is a '302 object moved' redirect losing me link juice?
Hi guys, I'm looking at a new site that's completely under https - when I look at the http variant it redirects to the https site with "302 object moved" within the code. I got this by loading the http and https variants into Webmaster Tools as separate sites, and then doing a 'fetch as Google' across both. There is some traffic coming through the http option, and as people start linking to the new site I'm worried they'll link to the http variant, and the 302 redirect to the https site will lose me ranking juice from that link. Is this a correct scenario, and if so, should I prioritise moving the 302 to a 301? Cheers, Jez
Technical SEO | | jez0000 -
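Once a redirect chain has been recorded (via 'fetch as Google', a crawler, or curl with redirects disabled), flagging the temporary hops worth upgrading to 301 is trivial. A hypothetical sketch (the chain data is illustrative):

```python
def temporary_hops(chain):
    """Given a redirect chain as (status, location) hops, return the
    temporary-redirect hops (302/303/307) - the ones that may not
    pass full link equity and are worth upgrading to 301."""
    return [(s, loc) for s, loc in chain if s in (302, 303, 307)]

# Hypothetical chain observed for the http root of the site in question:
chain = [(302, "https://www.example.com/")]
print(temporary_hops(chain))  # non-empty -> change the rule to a 301
```

A permanent (301) chain would come back empty here, which is what you want before people start linking to the http variant.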
Should I index my search result pages?
I have a job site and I am planning to introduce a search feature. The question I have is, is it a good idea to index search results even if the query parameters are not there? Example: A user searches for "marketing jobs in New York that pay more than 50000$". A random page will be generated like example.com/job-result/marketing-jobs-in-new-york-that-pay-more-than-50000/ For any search that gets executed, the same procedure would be followed. This would result in a large number of search result pages automatically set up for long tail keywords. Do you think this is a good idea? Or is it a bad idea based on all the recent Google algorithm updates?
Technical SEO | | jombay0 -
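The slug generation that question describes might look like the sketch below (hypothetical code; the indexing question itself - whether such auto-generated long-tail pages should exist at all - is separate, and thin pages of this kind are exactly what recent algorithm updates target):

```python
import re

def search_slug(query):
    """Turn a search query into a crawlable landing-page path,
    e.g. for example.com/job-result/<slug>/."""
    slug = query.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug).strip("-")
    return f"/job-result/{slug}/"

print(search_slug("marketing jobs in New York that pay more than 50000$"))
# /job-result/marketing-jobs-in-new-york-that-pay-more-than-50000/
```

If pages like this are generated, it's usually safer to index only a curated subset with real content and noindex the unbounded long tail.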
Homepage not showing for searches
Hi, looking for a bit of advice about our client - www.financial-wise.co.uk. We worked with this client on his old website, www.mortgage-wise.co.uk, and had him ranking for most local searches. The client then re-branded the full company and got another company in to do this; they did him a new website, www.financial-wise.co.uk. The company then launched the new domain with the old one still showing, and some of the content was the same, including the homepage. Anyway, the issue I'm having now is that for certain searches I'm struggling to get them ranking again, and for searches such as "financial wise", its inner pages are showing instead of the homepage. I have checked and the homepage is indexed. I have also re-written all the text on the homepage but am still having some issues; it's almost like the homepage has been penalised. Any help would be great!
Technical SEO | | rfksolutionsltd0 -
How to handle large numbers of comments?
First the good news. One site that I've been working on has seen an increase in traffic from 2k/month to 80k! As well as lots of visitors, the site is also getting lots of comments, with one page getting more than 70 comments/day and showing no sign of a slowdown! Approximately 3000 comments in total and growing! What is the best approach for handling this? I'm not talking about the review/approval/response, but just the way these comments are presented on the website, taking both SEO and usability into account. Does anyone have any particular recommendations? Options I've considered are:
- Just show the most recent x comments and ignore the rest. (Nobody is going to read 3000 comments!)
- Paginate comments (risk of duplicate content? Using Ajax could hide long-tail phrases in comments?)
- Show all comments (page load speed is suffering, and this is likely to be causing problems for mobile visitors)
How do active comments on a page contribute to an article's freshness? Any thoughts would be greatly appreciated.
Technical SEO | | DougRoberts2 -
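The first option in that question (show only the most recent x comments) is a one-liner once comments are sorted; a sketch with a hypothetical data shape:

```python
def visible_comments(comments, limit=50):
    """Return the most recent `limit` comments, newest first.
    Each comment is a (timestamp, text) pair; the rest can sit
    behind a 'load more' link so page weight stays manageable."""
    return sorted(comments, key=lambda c: c[0], reverse=True)[:limit]

comments = [(1, "first!"), (3, "great post"), (2, "thanks")]
print(visible_comments(comments, limit=2))  # [(3, 'great post'), (2, 'thanks')]
```

The older comments can still be reachable via crawlable paginated URLs rather than Ajax if the long-tail phrases in them matter for search.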
Should search pages be disallowed in robots.txt?
The SEOmoz crawler picks up "search" pages on a site as having duplicate page titles, which of course they do. Does that mean I should put a "Disallow: /search" tag in my robots.txt? When I put the URL's into Google, they aren't coming up in any SERPS, so I would assume everything's ok. I try to abide by the SEOmoz crawl errors as much as possible, that's why I'm asking. Any thoughts would be helpful. Thanks!
Technical SEO | | MichaelWeisbaum0
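Before shipping a `Disallow: /search` rule, it's worth verifying exactly what it blocks; Python's stdlib robot parser can check that offline (the rule and paths below are illustrative). Note that robots.txt rules are prefix matches, so `/search` also blocks paths like `/searchlight`:

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/search?q=widgets"))  # False
print(rp.can_fetch("*", "https://example.com/searchlight"))       # False: prefix match!
print(rp.can_fetch("*", "https://example.com/blog/"))             # True
```

Running real site URLs through a check like this catches accidental over-blocking before the rule goes live.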