Deferred JavaScript loading
-
Hi! This follows on from my last question.
I'm trying to improve the page load speed for http://www.gear-zone.co.uk/.
Currently, Google rates the page speed of the GZ site at 91/100, with JavaScript being the only area where points are deducted. The only problem is that the JS relates to the Trustpilot widget and the social links at the bottom of the page, neither of which works when deferred.
Normally we would add the defer attribute to the script tags, but doing so waits until the page is fully loaded before executing the scripts. As both of the scripts I mentioned (reviews and buttons) use the document.write command, adding defer would write their output off the page, out of place from where it should appear.
Anyone have any ideas?
-
I've run your site through the Page Speed tool here and you get 94/100 (which is awesome!).
No idea on the JS, sorry.
I'd be more than happy with 94/100!
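One common workaround for document.write-based widgets is to reserve a placeholder element where the widget should render, then inject the third-party script there once the rest of the page has loaded, instead of using defer on the original tag. The sketch below is illustrative only: the function name, element id, and script URL are placeholders, not Trustpilot's actual embed API, and it only helps if the widget script can run without calling document.write (many vendors offer an async embed for exactly this reason).

```javascript
// Inject a third-party widget script next to a placeholder element
// instead of letting it document.write itself into the page.
// The element id and script URL below are illustrative placeholders.
function loadWidget(placeholderId, scriptSrc) {
  var placeholder = document.getElementById(placeholderId);
  if (!placeholder) return null;

  var script = document.createElement("script");
  script.src = scriptSrc;
  script.async = true;             // download without blocking parsing
  placeholder.appendChild(script); // widget renders inside the placeholder
  return script;
}

// Usage in the page, once the rest of the document has loaded:
// window.addEventListener("load", function () {
//   loadWidget("trustpilot-placeholder", "https://example.com/widget.js");
// });
```

Because the script is injected after load, it no longer blocks parsing and the PageSpeed deduction should go away, while the widget still renders in the right spot.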
Related Questions
-
JavaScript and SEO
I've done a bit of reading and I'm having difficulty grasping it. Can someone explain it to me in simple language? What I've gotten so far: JavaScript can block search engine bots from fully rendering your website. If bots are unable to render your website, they may not see important content and may discount that content from their index. To know whether bots can render your site, check the following:
- Google Search Console Fetch and Render
- Turn off JavaScript in your browser and see whether any site elements disappear
- An online tool, such as Technical SEO's Fetch and Render
- Screaming Frog's Rendered Page tab
- GTmetrix results: if it recommends "Defer parsing of JavaScript", that means there are elements being blocked from rendering (???)
Using our own site as an example, I ran it through all the tests listed above. Results:
- Google Search Console: rendered only the header image and text; anything below wasn't rendered. The resources Googlebot couldn't reach include Google Ad Services, Facebook, Twitter, our call tracker, and Sumo, all "Low" or blank severity.
- JavaScript turned off: shows only the logo and navigation menu; anything below didn't render or appear.
- Technical SEO Fetch and Render: our page rendered fully for Googlebot and Googlebot Mobile.
- Screaming Frog: the Rendered Page tab is blank; it says "No Data".
- GTmetrix: "Defer parsing of JavaScript" was recommended.
From all these results, across all the tools I used, how do I know what needs fixing? Some tests didn't render our site fully while others did. With varying results, I'm not sure where to go from here.
Intermediate & Advanced SEO | nhhernandez1
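One of the checks listed above, browsing with JavaScript turned off, can be roughly automated: fetch the raw HTML of a page and scan it for the phrases that matter, since anything that only appears after scripts execute is invisible to a bot that doesn't run JavaScript. A minimal sketch, with illustrative phrases and function name:

```javascript
// Return the phrases that do NOT appear in the raw (pre-JavaScript) HTML.
// Content that only exists after scripts run won't be found here, which
// mirrors the "turn off JavaScript in your browser" check.
function missingFromRawHtml(rawHtml, phrases) {
  var haystack = rawHtml.toLowerCase();
  return phrases.filter(function (phrase) {
    return haystack.indexOf(phrase.toLowerCase()) === -1;
  });
}
```

Fetch the HTML with curl or your language's HTTP client, then compare the result against what you see in a rendered browser tab; any phrase reported missing is content a non-rendering bot cannot see.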
Lazy Loading of Blog Posts and Crawl Depths
Hi Moz Fans, We are looking at our blog and improving the content as much as we can for SEO purposes, but we have hit a bit of a blank in terms of lazy-loading implications and issues with crawl depth. We introduced lazy loading on the blog home page to increase site speed, and it works well with infinite scroll, but we were wondering whether this would cause any issues for SEO. A lot of the resources online seem to conflict and some are very outdated, so some clarification on what is best in terms of lazy loading and crawl depth for blogs would be fantastic! I hope someone can help and give us some up-to-date insights. If you need any more information, I'll reply ASAP.
Intermediate & Advanced SEO | Victoria_0
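A common way to reconcile infinite scroll with crawl depth is to keep real paginated URLs behind the lazy loader, so every batch of posts is also reachable at a plain link that bots can follow. A minimal sketch of generating those fallback URLs (the base path and "?page=" parameter are illustrative choices, not a prescription):

```javascript
// Build crawlable pagination URLs to back an infinite-scroll blog index.
// Each URL should serve that batch of posts as plain HTML; the "?page="
// parameter name here is an illustrative choice.
function paginationUrls(totalPosts, postsPerPage, basePath) {
  var pageCount = Math.max(1, Math.ceil(totalPosts / postsPerPage));
  var urls = [];
  for (var page = 1; page <= pageCount; page++) {
    urls.push(page === 1 ? basePath : basePath + "?page=" + page);
  }
  return urls;
}
```

The infinite scroll then becomes an enhancement on top of these pages rather than the only way to reach older posts, which limits how deep a crawler has to go.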
JavaScript to fetch the page title for every webpage: is it good?
We have a Zend Framework site that is complex to program, if you ask me, and since we have 20k+ pages that need proper titles and meta descriptions, I need to ask about using JavaScript to handle page titles (the previous programming team had not set page titles at all). I need to pull proper page titles from an h1 tag within each page. The course of action we can easily implement is to fetch the page title from the h1 tag used throughout all pages with JavaScript. But doesn't this make it difficult for engines to actually read the page title, since it's being fetched with JavaScript code we have added? I had doubts; has anyone been in a similar situation before? If yes, I need some help! Update: I tried the JavaScript way and here is what it looks like: http://islamicencyclopedia.org/public/index/hadith/id/1/book_id/106. I know Google won't read JavaScript the way we have done it on the website, but I need help on how we can work around this issue, knowing we don't have other options.
Intermediate & Advanced SEO | SmartStartMediacom0
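For reference, the client-side approach described above boils down to something like the sketch below (function name illustrative). The caveat in the question stands: a crawler that doesn't execute JavaScript still sees the original, empty title, so this is at best a stopgap until titles can be set server-side in the Zend layer.

```javascript
// Copy the first <h1>'s text into the document title. A bot that does
// not execute JavaScript will never see the result.
function titleFromHeading(doc) {
  var h1 = doc.querySelector("h1");
  if (h1 && h1.textContent) {
    doc.title = h1.textContent.trim();
  }
  return doc.title;
}

// In the page: titleFromHeading(document);
```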
Enormous 7-page drop after switching servers and adding load balancers. Thoughts?
Hello everyone, I'm a longtime Moz user, but I had to switch accounts after switching jobs. I was hoping someone might be able to give me some insight on what's going on. Our startup had a first-page position for our most valuable keyword, "crowdfunding real estate", for about 6 or 7 months. Once we launched and switched to a production server behind load balancers, we dropped almost overnight to the 7th page, and we've been there for about a month. We don't have many links yet, and some of the ones we do have are kind of spammy (no idea where they came from; we're in the process of trying to get them removed), but we thought it was strange to see that massive drop. We are even pages below a competitor who has no links and basically zero content on the page. We don't have any notifications in WMT about a manual penalty or anything. I'd really appreciate any advice. If anyone has any ideas, the page is at PatchofLand.com. Thanks, Jason
Intermediate & Advanced SEO | PatchofLand0
Varying Internal Link Anchor Text with Each New Page Load
I'm asking for people's opinions on varying internal anchor text. Before you jump in and say, "Oh yes, varying your anchor text is always a good idea", let me explain. I'm not talking about varying anchor text on different links scattered throughout a site. We all know that is a wise thing to do for a variety of reasons that have been covered in many places.

What I'm talking about is including semi-useful links below the fold and then varying the anchor text with each page load. Each time Googlebot crawls a page, it sees different anchor text for each link. That way, Googlebot is seeing, for example, 'san diego bars', 'taverns in san diego', 'san diego clubs', and 'pubs in san diego' all pointing to a San Diego bar/tavern/club/pub page.

I'm wondering if there is value in this approach. Will it help a site rank well for multiple search queries? Could it potentially be better than static anchor text, as it may help Google better understand the targeted page? Is it a good way to protect a large site with a huge number of internal links from Penguin?

To summarize, we're talking about the impact of varying the anchor text on a single page with each page load, as opposed to varying the anchor text on different pages. Thoughts?
Intermediate & Advanced SEO | RyanOD0
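The rotation described above is trivial to implement; whether it is wise is exactly the question being asked. A minimal sketch that picks one variant per page load (the variant list comes from the examples in the question; the function name and random-number parameter are illustrative, the latter only there to make the choice testable):

```javascript
// Pick one anchor-text variant per page load. Passing the random number
// in makes the choice testable; in production you'd pass Math.random().
function pickAnchorText(variants, random) {
  if (variants.length === 0) return null;
  var index = Math.floor(random * variants.length) % variants.length;
  return variants[index];
}

var variants = [
  "san diego bars",
  "taverns in san diego",
  "san diego clubs",
  "pubs in san diego"
];
// On each page load: pickAnchorText(variants, Math.random());
```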
404 with a JavaScript redirect to the index page...
I have a client who wants me to issue a 404 on her links that are no longer valid, show a custom 404 page, pause for 10 seconds, then redirect to the root page (or whatever other redirect logic she wants). To me, trying to game Googlebot this way seems like a bad idea. Can anyone confirm or deny, or offer a better suggestion?
Intermediate & Advanced SEO | JusinDuff0
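For what it's worth, the behaviour described (custom 404, 10-second pause, then redirect to the root) would look like the sketch below on the 404 page itself. Note the page still returns a 404 status; the client-side hop is the part the question suspects may look like gaming Googlebot. The window parameter is passed in only to make the function testable.

```javascript
// Schedule a client-side redirect after a delay, e.g. from a custom 404
// page back to the home page. Returns the timer so it can be cancelled.
function scheduleRedirect(target, delayMs, win) {
  return setTimeout(function () {
    win.location.replace(target);
  }, delayMs);
}

// On the custom 404 page: scheduleRedirect("/", 10000, window);
```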
Static homepage content and javascript - is this technique obsolete?
Hi, Apologies beforehand for any minor forum transgressions; this is my first post. I'm redesigning my blog and I have a question about static homepage content.

It used to be common practice in the online gambling sector (and possibly others) to have a block of 'SEO copy' at the footer of the homepage. To 'trick Google' into thinking it was directly underneath the header, web devs would use JavaScript to make the HTML load the div with the SEO copy first. The logic was that this allowed the prime real estate of the page to be used for conversion and sales, while still having a block of relevant copy to tell the spiders what the page was about, and to provide deep links into the site.

I attended a seminar just over a year ago at which some notable SEOs said that Google had probably worked this one out, but it was impossible to tell. However, I've recently noticed that Everest Poker has what I think is the code commented out, and on PokerStars I can't find it at all (even in the includes). I would be happy to post the Everest code but, while I've read the etiquette, I'm not 100% sure whether this is allowed.

So my question is: for the blog I'm redesigning, do I still need to follow this practice? I would prefer search engines saw some static intro text describing the site, rather than the blog posts, the excerpts of which will probably be canonicalized to the actual post pages to avoid duplication issues. But I would prefer this static content to appear below the fold. What is current best practice here? Alex
Intermediate & Advanced SEO | alextanner0
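The old trick described above amounts to keeping the copy early in the HTML source and then repositioning it visually, for example by moving the node after load. A sketch of the DOM move (the element ids are illustrative, and the document is passed in only for testability); as the post itself suggests, this technique is likely obsolete and arguably risky now.

```javascript
// Move the intro-copy block to the bottom of the page after load, so the
// copy sits early in the HTML source but renders below the fold.
function moveBelowFold(doc, copyId, footerId) {
  var copy = doc.getElementById(copyId);
  var footer = doc.getElementById(footerId);
  if (copy && footer) {
    footer.appendChild(copy); // appendChild relocates an existing node
    return true;
  }
  return false;
}

// In the page: moveBelowFold(document, "seo-copy", "footer");
```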
Can JavaScript be SEO friendly?
Is some JavaScript SEO friendly? I know that Google's Webmaster Guidelines say you should avoid the use of JavaScript (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769), but does anyone know whether Google can read some JavaScript, or generally not?
Intermediate & Advanced SEO | nicole.healthline0