Deferred JavaScript loading
-
Hi! This follows on from my last question.
I'm trying to improve the page load speed for http://www.gear-zone.co.uk/.
Currently, Google rates the page speed of the GZ site at 91/100, with JavaScript being the only area where points are deducted. The only problem is, the JS relates to the Trustpilot widget and the social links at the bottom of the page, neither of which works when deferred.
Normally we would add the defer attribute to the script tags, but doing so waits until the page is fully loaded before executing the scripts. As both of the scripts I mentioned (reviews and buttons) use document.write(), deferring them writes their output at the end of the document, out of place from where it should appear.
Anyone have any ideas?
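One common workaround (a sketch, not Trustpilot's official API): run the widget after the load event, but temporarily override document.write so its output is buffered and injected into a placeholder where the widget should sit. The placeholder id and the `runWidget` callback below are illustrative assumptions:

```javascript
// Sketch: defer a document.write-based widget without its markup landing at
// the end of the page. We buffer document.write calls while the widget runs,
// then inject the buffered HTML into a placeholder div.
function renderBufferedWrites(doc, placeholderId, runWidget) {
  const buffer = [];
  const originalWrite = doc.write;
  doc.write = (markup) => { buffer.push(markup); }; // capture instead of writing
  try {
    runWidget(); // the third-party code that calls document.write()
  } finally {
    doc.write = originalWrite; // always restore the real document.write
  }
  const html = buffer.join("");
  const target = doc.getElementById(placeholderId);
  if (target) target.innerHTML = html;
  return html;
}

// In the browser you would call this after load, e.g.:
// window.addEventListener("load", () =>
//   renderBufferedWrites(document, "trustpilot-box", loadTrustpilotWidget));
```

This keeps the widget out of the critical rendering path while still placing its output where the placeholder sits, rather than where a deferred document.write would dump it.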
-
I've run your site through the Page Speed tool here and you get 94/100 (which is awesome!).
No idea on the JS, sorry.
I'd be more than happy with 94/100!
Related Questions
-
Load Balancer issues on Search Console
The top linked domains in Search Console are coming from our load balancer setup. Does anyone know how to remove these as unique sites pointing back to our primary domain? I was told Google is smart enough to ignore these as duplicate domains, but if that was the case, why would they be listed as the top linked domains in Search Console? Most concerned....
Intermediate & Advanced SEO | DonFerrari21690
-
Javascript and SEO
I've done a bit of reading and I'm having difficulty grasping it. Can someone explain it to me in simple language?

What I've gotten so far: JavaScript can block search engine bots from fully rendering your website. If bots are unable to render your website, they may not see important content and may leave that content out of their index. To know if bots can render your site, check the following:
- Google Search Console Fetch and Render
- Turn off JavaScript in your browser and see whether any site elements disappear
- Use an online tool
- Technical SEO Fetch and Render
- Screaming Frog's Rendered Page
- GTmetrix results: if it recommends "Defer parsing of JavaScript", that means there are elements being blocked from rendering (???)

Using our own site as an example, I ran it through all the tests listed above. Results:
- Google Search Console: rendered only the header image and text. Anything below wasn't rendered. The resources Googlebot couldn't reach include Google Ad Services, Facebook, Twitter, our call tracker and Sumo. All "Low" or blank severity.
- Turn off JavaScript: shows only the logo and navigation menu. Anything below didn't render/appear.
- Technical SEO Fetch and Render: our page rendered fully on Googlebot and Googlebot Mobile.
- Screaming Frog: the Rendered Page tab is blank. It says "No Data".
- GTmetrix: defer parsing of JavaScript was recommended.

From all these results and across all the tools I used, how do I know what needs fixing? Some tests didn't render our site fully while some did. With varying results, I'm not sure where to go from here.
Intermediate & Advanced SEO | nhhernandez1
-
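One rough way to triage mixed results like these is to diff the raw server HTML against the rendered page text: whatever appears only after JavaScript runs is the content at risk for crawlers that fail to render. A toy sketch (the whitespace tokenisation is deliberately crude and purely illustrative):

```javascript
// Sketch: list words that appear in the rendered page but not in the raw
// server HTML, i.e. content that only exists once JavaScript has run.
// Naive tokenisation - for quick triage only, not a real diff tool.
function jsOnlyContent(rawHtml, renderedText) {
  const stripTags = (s) => s.replace(/<[^>]*>/g, " ");
  const tokens = (s) =>
    new Set(stripTags(s).toLowerCase().split(/\s+/).filter(Boolean));
  const raw = tokens(rawHtml);
  return [...tokens(renderedText)].filter((word) => !raw.has(word));
}
```

Feed it the "view source" HTML and the text you copy out of the rendered page (or a fetch-and-render screenshot transcription); anything it returns is content you should consider server-rendering or otherwise making available without JavaScript.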
Redirection: Load balancer or CNAME?
We have a bunch of domains which no longer have sites tied to them, but many have decent links pointing to them. In most cases we have other relevant content on live sites that we can redirect these URLs to. We have been given the choice of redirecting through the load balancer or directly as a CNAME on our CDN. I only have experience of 301s – what would be the preferred choice from an SEO perspective? Thanks, Sam
Intermediate & Advanced SEO | Samsam00000
-
How good is Google at reading geo-targeted dynamic content – JavaScript?
We are using a single page application for a section of our website where it generates content based on the user's geographical location. Because Google's Search Console is searching from Virginia (where we don't have any content), we are not able to see anything render in Google Search Console. How good is Google at reading geo-targeted dynamic content? Do we have anything to worry about in terms of indexing the content because it's being served through JS?
Intermediate & Advanced SEO | imjonny1230
-
How would Google reach internal pages on Zales with Lazy Load?
Hi, I encountered the following page on Zales:
http://engagementring.theprestigediamondcollection.com/NewEngagementRing/NewEring.aspx
As you scroll down, more items pop up (the well-known Pinterest style). Would Googlebot be able to enter the product pages? I don't assume the bot "scrolls"... Thanks
Intermediate & Advanced SEO | BeytzNet
-
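The usual safeguard (an assumption about how such a site could be built, not what Zales actually does) is to render plain pagination links in the HTML alongside the scroll behaviour, so a bot that never scrolls can still follow a real URL to every product batch:

```javascript
// Sketch: crawlable pagination to back up infinite scroll. Each batch of
// items gets a real URL a bot can follow; the scroll script can fetch the
// same URLs. The "page" query parameter name is an assumption.
function pageUrl(baseUrl, page) {
  if (page <= 1) return baseUrl; // page 1 is the plain listing URL
  const sep = baseUrl.includes("?") ? "&" : "?";
  return `${baseUrl}${sep}page=${page}`;
}

function paginationLinks(baseUrl, totalItems, perPage) {
  const pages = Math.max(1, Math.ceil(totalItems / perPage));
  return Array.from({ length: pages }, (_, i) =>
    `<a href="${pageUrl(baseUrl, i + 1)}">${i + 1}</a>`);
}
```

The `<a href>` list stays in the initial HTML for crawlers, while the scroll handler requests the same page URLs and appends the results for users.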
Dilemma: Should we use pagination or a 'Load More' function?
In the interest of pleasing Google with their recent updates, clamping down on duplicate content, and giving a higher preference to pages with rich data, we had a small dilemma that might help others too. We have a directory-like site, very similar to TripAdvisor or Yelp. Would it be best to:
A) have paginated content almost 40 pages deep, or
B) display 20 results per page with a "Load More" function at the bottom that feeds in more data only once it's clicked?
The problem we are having now is that deep pages are getting indexed and it's doing us no good; most of the juice and page value is on the first page, not the inner pages. Wondering what the schools of thought are on this one. Thanks
Intermediate & Advanced SEO | danialniazi0
-
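Whichever UI wins, one school of thought is to keep the deep pages out of the index while still letting their links be followed, so the value stays concentrated on page 1. A sketch of per-page meta decisions (the /directory/ path is an illustrative assumption, not the poster's site):

```javascript
// Sketch: index page 1 of a listing, keep deeper pages noindex,follow so
// link equity still flows through to the listed items without thin deep
// pages competing in the index.
function listingMeta(page) {
  if (page <= 1) {
    return { robots: "index,follow", canonical: "/directory/" };
  }
  return { robots: "noindex,follow", canonical: `/directory/?page=${page}` };
}
```

The server (or template) would emit `listingMeta(page).robots` into the meta robots tag and the matching canonical URL for each paginated view.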
Static homepage content and JavaScript – is this technique obsolete?
Hi. Apologies beforehand for any minor forum transgressions – this is my first post. I'm redesigning my blog and I have a question re static homepage content.

It used to be common practice in the online gambling sector (and possibly others) to have a block of 'SEO copy' at the footer of the homepage. To 'trick Google' into thinking it was directly underneath the header, web devs would use JavaScript to instruct the HTML to load the div with the SEO copy first. The logic was that this allowed the prime real estate of the page to be used for conversion and sales, while still having a block of relevant copy to tell the spiders what the page was about, and to provide deep links into the site.

I attended a seminar just over a year ago at which some notable SEOs said that Google had probably worked this one out, but it was impossible to tell. However, I've recently noticed that Everest Poker has what I think is the code commented out, and on PokerStars I can't find it at all (even in the includes). I would be happy to post the Everest code but, while I've read the etiquette, I'm not 100% sure whether this is allowed.

So my question is... for the blog I'm redesigning, do I still need to follow this practice? I would prefer search engines to see some static intro text describing the site, rather than the blog posts, the excerpts of which will probably be canonicalized to the actual post pages to avoid duplication issues. But I would prefer this static content to appear below the fold. What is current best practice here? Alex
Intermediate & Advanced SEO | alextanner0
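For reference, the pattern the question describes boils down to something like this: the copy block sits early in the HTML source, and a script moves it below the fold once the DOM is ready. The element ids here are assumptions, and the general consensus now is that Google renders JavaScript well enough that this trick buys nothing and risks looking manipulative:

```javascript
// Illustration only: the old "SEO copy first in source, moved down by JS"
// trick. The block starts near the top of the HTML; once the DOM is ready
// it is re-appended inside the footer so users see it below the fold.
function moveToFooter(doc, blockId, footerId) {
  const block = doc.getElementById(blockId);
  const footer = doc.getElementById(footerId);
  if (block && footer) footer.appendChild(block); // appendChild moves the node
  return footer;
}

// Browser usage (assumed ids):
// document.addEventListener("DOMContentLoaded", () =>
//   moveToFooter(document, "seo-copy", "site-footer"));
```

The modern alternative is simply to keep the static intro text in normal source order (or position it with CSS) rather than relying on script-driven reordering.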