Eliminate render-blocking JavaScript and CSS recommendation?
-
Our site's last red-flag issue is the "eliminate render-blocking JavaScript and CSS" message. I don't know how to do that, and while I could probably spend hours or days cutting, pasting, and guessing until I made progress, I'd rather not. Does anyone know of a plugin that will just do this? Or, if not, how much would it cost to get a web developer to do it?
Also, if there is no plugin (and it didn't look like there was when I checked), how long do you think this would take someone who knows what they're doing to complete?
The site is: www.kempruge.com
Thanks for any tips and/or suggestions,
Ruben
-
Yes, it's over a month, and for most of that month our page speed score averaged 66 with load times around 3 seconds. Now, with the adjustments I've made and a switch to a new hosting company, we're at an 81 as of this morning. So, if 3 seconds at a 66 isn't terrible, we'll probably be in an acceptable range following the improvements.
Either way, thanks so much for the industry stats and the article. It's easy to find "how to make your website faster" info, but MUCH more difficult to find an article I can trust. Thanks for the tip!
Ruben
-
Hi Ruben,
That analytics data covers a month or so, right? Just to make sure we're not talking about an unusually fast or slow day!
3 seconds is not too bad; it can depend a lot on the type of site you have or the industry. Check this for a recent rundown of stats by country/industry, and also check out this article for a good rundown of tactics for reducing load times.
I would look at doing some of the easier fixes from the article above (if you haven't already) before you try to address the script-rendering issues, especially if you don't have an in-house person who is comfortable doing it. If you've already done all of that, then it's really a matter of how much effort it will take to find someone to diagnose and make the needed changes to the site code, versus how much load/rendering time that will shave off. Personally, I think it might not be worth it, but others may disagree.
-
Thanks Lynn! Yes, they are from Google PageSpeed Insights. Attached are our page speed times from GA. Unfortunately, I'm not sure whether they're okay or not; I just don't know enough, other than that faster is usually better.
Your thoughts?
Thanks,
Ruben
-
Hi,
Are you getting this flag from Google PageSpeed Insights? Render-blocking scripts are basically scripts that are called at the beginning of the page (usually in the head) but aren't really needed for that page, or for the content of that page that is immediately visible, so downloading them first delays the rendering of the page. Depending on the structure of your site and code, the plugins used, etc., fixing this could be as simple as moving a couple of lines of code in the template, or quite complicated indeed.
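To illustrate the "simple" end of that spectrum (a generic sketch, not code from Ruben's site; the file names are made up for the example): a script or stylesheet referenced plainly in the head blocks rendering, while the `defer`/`async` attributes on scripts, or a `media` attribute on non-critical stylesheets, let the browser paint the page first.

```html
<head>
  <!-- Render-blocking: the browser pauses parsing to fetch and execute this -->
  <script src="analytics.js"></script>

  <!-- Non-blocking alternatives: -->
  <!-- defer: download in parallel, execute after the document is parsed -->
  <script src="analytics.js" defer></script>
  <!-- async: download in parallel, execute as soon as it arrives -->
  <script src="analytics.js" async></script>

  <!-- CSS: a media query means this only blocks rendering when it matches -->
  <link rel="stylesheet" href="print.css" media="print">
</head>
```

Moving scripts to just before the closing `</body>` tag has a similar effect to `defer`, which is why the fix can sometimes be a couple of lines in the theme template.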
What do your page load times in Google Analytics look like? I had a look at your page and it seemed to load pretty fast, so I would check load times in GA and see whether the problem is really as urgent as you think. The PageSpeed Insights tool will flag everything it sees, but sometimes it can give you false positives, and other times it mechanically recommends things that are not a huge issue in the grand scheme of things!