Eliminate render-blocking JavaScript and CSS recommendation?
-
Our site's last red-flag issue is the "Eliminate render-blocking JavaScript and CSS" message. I don't know how to fix that, and while I could probably spend hours or days cutting, pasting, and guessing until I made progress, I'd rather not. Does anyone know of a plugin that will just do this? Or, if not, how much would it cost to get a web developer to do it?
Also, if there is no plugin (and it didn't look like there was when I checked), how long do you think this would take someone who knows what they're doing to complete?
The site is: www.kempruge.com
Thanks for any tips and/or suggestions,
Ruben
-
Yes, it's over a month, and for most of that month our PageSpeed score averaged 66 with roughly 3-second load times. Now, with the adjustments I've made and the switch to a new hosting company, we're at 81 as of this morning. So if 3 seconds at a 66 isn't terrible, we should be in an acceptable range after the improvements.
Either way, thanks so much for the industry stats and the article. It's easy to find "how to make your website faster" info, but MUCH more difficult to find an article that I can trust. Thanks for the tip!
Ruben
-
Hi Ruben,
That analytics data covers a month or so, right? Just to make sure we're not talking about an unusually fast or slow day!
3 seconds is not too bad. It can depend a lot on the type of site you have and the industry. Check this for a recent rundown of stats by country/industry, and check out this article for a good rundown of tactics for reducing load times.
I would start with some of the easier fixes in the article above (if you haven't already) before trying to tackle the script-rendering issues, especially if you don't have an in-house person who is comfortable doing it. If you've already done all of that, then it's really a matter of weighing how much effort it will take to find someone to diagnose and change the site code against how much load/rendering time that will actually shave off. Personally, I suspect it might not be worth it, but others may disagree.
-
Thanks, Lynn! Yes, they're from Google PageSpeed Insights. Attached are our page speed times from GA. Unfortunately, I'm not sure whether they're okay or not. I just don't know enough, other than that faster is usually better.
Your thoughts?
Thanks,
Ruben
-
Hi,
Are you getting this flag from Google PageSpeed Insights? Render-blocking scripts are scripts called near the top of the page (usually in the head) that aren't actually needed for that page, or at least not for the content that is immediately visible, so downloading them first delays the rendering of the page. Depending on the structure of your site and code, the plugins used, etc., fixing this could be as simple as moving a couple of lines of code in the template or... quite complicated indeed.
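To illustrate what "moving a couple of lines" usually means (a hypothetical sketch, not your actual markup or file names), the common fix is to add `defer` or `async` to head scripts, or move them to the bottom of the body:

```html
<!-- Before: blocks rendering while it downloads and executes -->
<script src="/js/widgets.js"></script>

<!-- After: downloads in parallel, executes once the HTML is parsed -->
<script src="/js/widgets.js" defer></script>

<!-- Alternatively, move the tag to just before </body>
     so the visible content renders first -->
```

For CSS, the equivalent tactic is inlining the small amount of "critical" CSS needed for above-the-fold content and loading the rest of the stylesheet afterwards.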
What do your page load times in Google Analytics look like? I had a look at your page and it seemed to load pretty fast, so I would check load times in GA and see whether the problem is really as urgent as you think. The PageSpeed Insights tool will flag everything it sees, but sometimes it gives you false positives, and other times it mechanically recommends things that aren't a big issue in the grand scheme of things!
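If you want a rough idea of which scripts the tool is likely flagging, you can list the external head scripts that lack `defer`/`async`. This is just an illustrative sketch using Python's standard library, run against whatever HTML you fetch (the sample markup below is made up):

```python
from html.parser import HTMLParser

class RenderBlockingFinder(HTMLParser):
    """Collect external <head> scripts with no defer/async attribute."""
    def __init__(self):
        super().__init__()
        self.in_head = False
        self.blocking = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "head":
            self.in_head = True
        elif tag == "script" and self.in_head:
            # A head script with a src and no defer/async blocks rendering
            if "src" in attrs and "defer" not in attrs and "async" not in attrs:
                self.blocking.append(attrs["src"])

    def handle_endtag(self, tag):
        if tag == "head":
            self.in_head = False

sample_html = """
<html><head>
<script src="/js/blocking.js"></script>
<script src="/js/ok.js" defer></script>
</head><body><script src="/js/footer.js"></script></body></html>
"""
finder = RenderBlockingFinder()
finder.feed(sample_html)
print(finder.blocking)  # only the head script without defer/async
```

Anything it prints is a candidate for the `defer`/`async` treatment, though whether that's worth the effort still comes back to what GA says about real load times.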