How can I Improve Loading Speed? - Parker Dubrule Lawyers
-
Parker Dubrule Lawyers' website at parkerdubrulelawyers.com seemed to be loading quite slowly this morning (>5 seconds). I added a lazy-load plugin, minified the JS and CSS, and made sure the images were optimized; all of this seemed to help and brought load time down to under 2 seconds. We are looking at more reliable hosting options for our clients, ones that are inherently faster, possibly without these plugins being added to the mix. Does anyone have insight on a safe, secure, and fast hosting/server option to enhance the experience from the get-go? All of the websites that we build are in WordPress.
Your help is much appreciated! Thanks!
-
Thank you for such a detailed response!
-
Thank you for your insight!
-
Hosting & DNS
It looks like the DNS response time is all over the place: sometimes it's acceptable at ~100ms, other times not so much. A better DNS provider would be worth looking into; Amazon Route 53 and Dyn are pretty good options.
For shared hosting, I can second SiteGround; it's a solid host for lower budgets. DigitalOcean is a very solid and inexpensive VPS, but there will be less hand-holding. I plan on migrating to DO in the next week or so. My current host just removed sudo privileges from their VPS accounts. I know, right!?!
Sweet, sweet PHP 7 and Redis - here I come.
Things You Can Fix Immediately
Run the site through PageSpeed Insights. Make a punch list and go from there. There's also a download link for 'optimized' resources; usually I only take the images from that. More on that later.
One of the big ones is 'Remove Render-Blocking JavaScript'. The quick fix is moving the Web Font Loader script and the GA script to the footer. You're halfway there, in a lot of instances.
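Since the site is WordPress, the cleanest way to push scripts to the footer is to enqueue them with the `$in_footer` flag instead of hard-coding tags into the header. A minimal sketch for a theme's functions.php; the handle names and file paths here are placeholders, not the site's actual assets:

```php
<?php
// In your theme's functions.php. Handles and paths are placeholders.
function pd_enqueue_footer_scripts() {
    // The fifth argument (true) tells WordPress to print the <script> tag
    // just before </body> instead of in <head>, so it no longer blocks
    // first render.
    wp_enqueue_script(
        'pd-webfontloader',
        'https://ajax.googleapis.com/ajax/libs/webfont/1.6.26/webfont.js',
        array(),
        null,
        true
    );
    wp_enqueue_script(
        'pd-analytics',
        get_template_directory_uri() . '/js/analytics.js',
        array(),
        null,
        true
    );
}
add_action( 'wp_enqueue_scripts', 'pd_enqueue_footer_scripts' );
```

If a plugin (rather than the theme) is printing the script in the header, you may need to dequeue its handle first and re-register it with the footer flag.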
Images
A couple of slider images are still over 200KB. If there's anything you can do to reduce that, do so. The PageSpeed Insights tool states that the home page has a couple of images that could be compressed further. Even though the savings are minuscule, it adds up over requests.
Fonts
One of the better performance increases can be had with fonts. Again, move your Web Font Loader script to the footer. Consider using fewer character sets. Do you really need greek-ext, cyrillic or vietnamese character sets? If not, remove them.
Another fun one is the preconnect resource hint. Here's a practical guide to web font performance from an author at CSS-Tricks. Just make sure to use fonts.gstatic.com instead of fonts.typonine.com in the code snippet. Here's a fairly detailed explanation of why you want to use preconnect, from Ilya Grigorik. (Seriously, follow that guy if you're not already doing so.)
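For Google Fonts specifically, the hint looks roughly like this; it goes in the `<head>`, and the `crossorigin` attribute matters because font files are fetched in anonymous mode (the font family below is a placeholder):

```html
<head>
  <!-- Warm up the connection to the host that actually serves the font
       files (fonts.gstatic.com), so DNS + TLS are done before the CSS
       discovers the @font-face URLs. -->
  <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
  <!-- Placeholder family; keep only the character subsets you need. -->
  <link rel="stylesheet"
        href="https://fonts.googleapis.com/css?family=Open+Sans&subset=latin">
</head>
```

Without the hint, the browser only learns about fonts.gstatic.com after the Google Fonts CSS has downloaded, which serializes two connection setups.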
CSS, HTML & JavaScript
The site appears to have unminified CSS inlined in the head of the document. That's a lot of CSS, and it probably isn't all critical-path. To render a page as fast as possible, you need to display visible content first. Drop your various style sheets into a critical-path CSS generator.
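The end result, once you have your generated critical-path CSS, roughly looks like the sketch below; the stylesheet path and rules are placeholders, and the `media="print"` / `onload` trick is one common way to load the full stylesheet without blocking render (plugins can generate this markup for you):

```html
<head>
  <style>
    /* Critical-path CSS only: the minified rules needed to paint
       above-the-fold content. Placeholder rules shown here. */
    body{margin:0;font-family:sans-serif}
  </style>
  <!-- Full stylesheet, loaded without blocking first paint: the browser
       fetches "print" stylesheets at low priority, and onload flips it
       to all media. Path is a placeholder. -->
  <link rel="stylesheet" href="/wp-content/themes/example/style.css"
        media="print" onload="this.media='all'">
  <noscript>
    <link rel="stylesheet" href="/wp-content/themes/example/style.css">
  </noscript>
</head>
```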
You can input your newly generated critical-path CSS into Autoptimize. It's a very handy plugin that minifies HTML, CSS and JavaScript, and it will also combine your CSS and JavaScript files to reduce requests. Another handy feature is, you guessed it, inlining critical-path CSS.
You will likely have to remove BWP Minify, as Autoptimize handles most - if not all - of those functions. More is not better, in this instance. In fact, you should disable any caching plugin options which handle minification.
Gzip Compression & Cache Expiration
It looks like cache expiration settings aren't set up for some basic MIME types (CSS, JPEG, etc.). Consider setting up a caching plugin, such as WP Super Cache or W3 Total Cache. Failing that, this is one of the better .htaccess settings repos.
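If you'd rather set these directly, here's a minimal .htaccess sketch for Apache. It assumes mod_deflate and mod_expires are enabled on your host, and the lifetimes are just reasonable starting points, not the repo's exact values:

```apache
# Gzip-compress text-based responses.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html
  AddOutputFilterByType DEFLATE text/css
  AddOutputFilterByType DEFLATE application/javascript
  AddOutputFilterByType DEFLATE application/json
  AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>

# Long cache expiry for static assets, short default for everything else.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
  ExpiresDefault "access plus 1 hour"
</IfModule>
```

On shared hosting, check that the host actually allows these overrides in .htaccess before relying on them.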
***Edit: One of the issues involves query strings on static resources. Here's a good resource with a few options to handle that.
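One of the simpler options, since the site is WordPress, is stripping the `?ver=` query string that WordPress appends to enqueued assets, so proxies and some CDNs will cache them. A sketch for functions.php; note this assumes you have another way to bust caches (e.g. purging on deploy), since the version string exists for exactly that purpose:

```php
<?php
// Strip the ?ver= query string WordPress adds to enqueued CSS/JS URLs.
// Caution: ver= is WordPress's cache-buster, so only remove it if you
// purge caches (or rename files) when assets change.
function pd_remove_asset_version( $src ) {
    if ( strpos( $src, 'ver=' ) !== false ) {
        $src = remove_query_arg( 'ver', $src );
    }
    return $src;
}
add_filter( 'style_loader_src',  'pd_remove_asset_version', 15 );
add_filter( 'script_loader_src', 'pd_remove_asset_version', 15 );
```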
As always, make these changes in a test environment first. And best of luck. These fixes will probably leave you happier with a lot of your projects.
-
I use iClickAndHost and SiteGround for shared hosting, and Amazon EC2, Linode, DigitalOcean, and Vultr as VPS providers. For a CDN: Amazon CloudFront and S3. Everything works perfectly.
But you should diagnose your hosting issues before considering a switch. It could be something temporary: a DDoS attack, hardware failure, network overload, etc.
Related Questions
-
Video & Graph That Lazy Loads
Hi, product pages on our site have a couple of elements that are lazy loaded / loaded after user action. Apart from images, which are a widely discussed topic in lazy loading, in our case videos and price graphs are lazy loaded. For videos we do something that Amit Agarwal recommended here: http://labnol.org/internet/light-youtube-embeds/27941/ - we load a thumbnail and a play button over it. When a user clicks that play button, the video embed from YouTube loads. However, we are not sure if Google gets that, and since the whole thing is under an H3 tag, will we a) lose out on the benefit of putting a relevant video there, or b) send any negative signals for only loading an image thumbnail under an H3 tag? We also have a price graph that lazy loads and is not seen in the cached version of our page on Google. Are we losing credit (in Google's eyes) for that content on our page? Sample page which has both the price history graph and video: http://pricebaba.com/mobile/apple-iphone-6s-16gb Appreciate your help! Thanks
Technical SEO | Maratha
Can I block HTTPS URLs using the Host directive in robots.txt?
Hello Moz Community, Recently I have found that Googlebot has started crawling the HTTPS URLs of my website, which is increasing the number of duplicate pages on our website. Instead of creating a separate robots.txt file for the HTTPS version of my website, can I use the Host directive in robots.txt to suggest to Googlebot which is the original version of the website? Host: http://www.example.com I was wondering if this method will work and suggest to Googlebot that the HTTPS URLs are a mirror of this website. Thanks for all of the great responses! Regards,
Technical SEO | TJC.co.uk
Ramendra
How can I remove 404 errors (WordPress)?
Hello, I am getting 404 errors on some pages of my WordPress site http://engineerbabu.com/. Those pages are permanently removed. Is there any plugin to fix this problem, or any way to make sure Google will not crawl these pages?
Technical SEO | mayankebabu
Can Googlebot read the content on our homepage?
Just for fun I ran our homepage through this tool: http://www.webmaster-toolkit.com/search-engine-simulator.shtml This spider seems to detect little to no content on our homepage. Interior pages seem to be just fine. I think this tool is pretty old. Does anyone here have a take on whether or not it is reliable? Should I just ignore the fact that it can't seem to spider our homepage? Thanks!
Technical SEO | danatanseo
What can I do to get Google to visit my site more often?
Hi, I am having serious problems since I upgraded my website from Joomla 1.5 to 3.0. We have dropped down the rankings from page one for the phrase "lifestyle magazine", and we have dropped in rankings for other very important words, including "gastric band hypnotherapy", and I am starting to regret having the site upgraded. I am finding that Google is taking its time visiting my site; I know this for two reasons: one, I have checked the cache and it is showing the 2nd of July, and two, I have checked articles that we have written and they are still not showing. For example, if I put this article name in word for word, it does not come up: "Carnival Divert Ships In The Caribbean Due To Bad Weather" - this was an article that was done yesterday. In the old days before the upgrade, that would have been in Google by now. These problems are costing us a great deal of traffic - we have lost around 70% of our traffic since the upgrade - and I would be grateful if people could give me advice on how to turn things around. We add articles all the time; each day we add a number of articles. I was considering changing the front page in the middle and having a few paragraphs of the latest story to get Google to visit more often. I know this would look messy, but I am running out of ideas. Any help would be great.
Technical SEO | ClaireH-184886
How does Progressive Loading, aka what Facebook does, impact proper search indexation?
My client is planning on integrating progressive loading into their main product-level pages (those pages most important to conversions and revenue). I am not skilled in "progressive loading" but was told this is what Facebook does. Currently, the site's pages are tabbed and use Ajax. Is there any negative impact from switching to progressive loading? If anyone can help me understand what this is and how it might impact a site from an SEO perspective, please let me know. Thanks a ton!! Janet
Technical SEO | ACNINTERACTIVE
Can I Get Penalized for 301 Redirects (Too Many or In Any Scenario)?
A client of ours owns several domain names that are keyword-similar to the domain they actually use to run their site. They are asking us if we should 301 redirect all of these websites to the domain they use. However, I don't want this to work against them and have their site penalized later. I have heard buying out competitors and redirecting their domains to yours is frowned upon and penalized when you get caught (they did not do this). We are also wondering if there is a limit to how many domains you can 301 redirect, what types are acceptable (keyword-similar, misspellings, .nets, etc.), and whether you are penalized after too many (i.e., >50). All of the domains in question are keyword/brand-name similar only and do not exist as actual websites. We just want to do the right thing. Thank you for your help.
Technical SEO | JCunningham
Has anyone used paid services to help improve their site?
Hi, I am getting lots of spam in my mailbox about how companies can help you get more traffic, and I see on lots of sites tools that claim to bring you more traffic and help improve your site, and I am just wondering if anyone has tried any of these services or products to help promote their site. For example, I keep getting offers about submitting my site to over 200 directories or search engines, and I am wondering if these are a waste of time.
Technical SEO | ClaireH-184886