HTTP to HTTPS question (SSL)
-
Hi,
I recently made two big changes to a site - www.aerlawgroup.com (not smart, I know). First, I changed from Weebly to WordPress (WP Engine hosting with CDN + Cloudflare - is that overkill?), and second, I added SSL (HTTP to HTTPS). From a technical perspective, I think I made a better site: (1) blazing fast, (2) mobile responsive, (3) more secure.
I'm seeing the rankings fluctuate quite a bit, especially on the important keywords. I added SSL to my other sites, and saw no rankings change (they actually all went up slightly).
I'm wondering if anyone has had experience going to SSL and can give me feedback on something I might have overlooked. Again, it's strange that all the other sites responded positively, but the one listed above is going in the opposite direction. Maybe there are other problems, and the SSL is just a coincidence. Any feedback would be appreciated.
I followed this guide: http://moz.com/blog/seo-tips-https-ssl - which helped tremendously (FYI).
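For anyone following the same guide: the core step is a site-wide 301 redirect from HTTP to HTTPS. On a self-managed Apache host that's typically a rewrite rule like the sketch below - WP Engine sets this up at the server level for you, so this is just for illustration, not their actual config:

```apache
# Force all HTTP traffic to HTTPS with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

The 301 (rather than a 302) is what tells Google the HTTPS version is the permanent home of the page.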
-
I'm also a big fan of switching the complete domain to HTTPS. Therefore I'm using the HSTS response header to enforce this. The great advantage is that browsers remember the site as HTTPS and skip any redirect you may have to make from HTTP to HTTPS. So it might be worth looking at this as well. We are using KeyCDN with the force-SSL feature enabled.
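On Apache, that header looks something like this (a minimal sketch - KeyCDN's force-SSL feature handles the redirect side at the edge; the max-age here is the common one-year value, and `includeSubDomains` is optional):

```apache
# Tell browsers to use HTTPS directly for the next year, skipping the redirect.
# Only enable this once the whole site works over HTTPS - browsers will
# refuse plain HTTP for the full duration of max-age.
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
```

Just be careful: once a browser has seen this header, you can't easily go back to HTTP until max-age expires, so start with a short max-age while testing.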
-
It's a bit overkill, but if you want to get rid of something, you could get rid of WP Engine. I have a lot of websites running on cheap $5 hosts + Cloudflare, and once everything is cached, they are blazing fast.
Regarding the rankings: as Cyrus said, depending on the niche you'll see fluctuations. I have a website where I see movement in the SERPs every day or every other day.
Website looks nice, clean and professional.
-
Likely a coincidence - or at least it's highly probable that there are other circumstances at play.
If you changed platforms, content, links, or architecture at all; if there have been any changes in your backlinks; if the competition has made changes (something you can't control!); or if Google has made algorithm changes - even ones specific to your vertical - then you are bound to see ranking changes that can be hard to pinpoint or explain.
Attorneys, especially those in certain niches like DUI, are especially tough and prone to fluctuation. Might take some extra investigation on your part.
Regardless, the site looks good and fast. Nice work!
-
Cloudflare is good, particularly with SSL. If it works well (check Fetch and Render in Webmaster Tools), then I would keep it.
You shouldn't need W3 Total Cache on top of WP Engine's own caching, so I wouldn't mess with your site's performance any further if it is all working fine. Your speeds are good as they are.
-
Would you recommend getting rid of Cloudflare? With 27 requests and a 300 KB page size, I just don't think I need it - especially if it's potentially causing fetch errors.
-
Hi,
Thank you for the detailed response. Yeah, I wondered if a new site + new host (WP Engine) + Cloudflare + SSL all at the same time was just too much.
I use WP Engine, which includes MaxCDN. That said, WP Engine doesn't allow W3 Total Cache.
Thanks again for the feedback. I appreciate it.
-
Hi,
No, it is not overkill to use a CDN with Cloudflare. For my own site, I used MaxCDN with Cloudflare Railgun over HTTPS. Railgun (free with certain hosts) speeds up delivery of the content that can't normally be cached, so it's great for SSL.
Unfortunately, what I found was that Cloudflare gave Google fetch errors for certain files, so now I just use MaxCDN. I also like my EV SSL certificate, which doesn't work with Cloudflare (unless you have the $200/month plan).
You may want to check out https://www.besthostnews.com/guide-to-w3-total-cache-settings-with-cloudflare/ as that guide will help optimize your site, although I think WP Engine has its own caching system.
Looking at your site: http://tools.pingdom.com/fpt/#!/dzqaUq/https://www.aerlawgroup.com/ - it looks very lightweight, with only 27 requests. That is about as good as it gets, especially with your very low page size (300 KB). I personally think you will struggle to optimize the site any further; quite frankly, your site speed is excellent. Well done!
Regards
Jonathan