JS loading blocker
-
Is there a tool or Chrome extension I can use to load a page, identify the .js files on the page, 'uncheck' selected .js files, and load the page again to check it still loads correctly? Even better would be the ability to defer scripts, or move them to the end of the file, to test.
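For the defer part, one low-tech way to experiment outside any extension is to save a copy of the page, rewrite its script tags, and reload it locally. A rough sketch in plain JavaScript; the regex and file name are illustrative only, not a real tool:

```javascript
// Hypothetical sketch: add `defer` to every external <script src=...> tag
// in a saved page's HTML, so scripts run after parsing and you can re-test
// load behaviour with JS moved out of the critical path.
function deferScripts(html) {
  return html.replace(/<script\b([^>]*\bsrc=[^>]*)>/gi, (tag, attrs) =>
    // Leave tags alone if they already have `defer`; inline scripts
    // (no src) are not matched, since `defer` only affects external files.
    /\bdefer\b/i.test(attrs) ? tag : `<script defer${attrs}>`
  );
}

const page = '<head><script src="/js/global.js?v=1.2"></script></head>';
console.log(deferScripts(page));
// → <head><script defer src="/js/global.js?v=1.2"></script></head>
```

Crude, but it lets you compare before/after load behaviour for individual files without touching the live site.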
-
Thanks for checking in, Mick!
-
Sorry for the delay. I got sidetracked on another project, and this client decided they would leave the .js as-is for the time being, so I haven't really tested. Initially I couldn't get the Chrome extension to do what I wanted, and I need to look at Firefox.
-
Hi Mick, did you find what you were looking for? We'd love an update. Thanks!
Christy
-
Thanks. I'll give it a try and let you know.
-
Hey Mick,
I use Firebug. There's a version for Chrome, but it was originally built for Firefox.
Full JavaScript debugging: breakpoints, conditional breakpoints, watches, stepping, and profiling.
Chrome Version Here: https://getfirebug.com/releases/lite/chrome/
Hope this helps,
Don
-
I've found this discussion about the same subject, if you want to have a look:
stackoverflow.com/questions/9698059/disable-single-javascript-file-with-addon-or-extension
Sorry, but I can't help you more than this.
Good luck
-
Thanks, that's quite handy, but not what I need in this case. This tool seems to switch off JavaScript for the whole page. I'm looking for something where I can cherry-pick the .js files on the page I want to block, or ideally move.
-
Hi,
You can find what you're looking for here: https://chrome.google.com/webstore/detail/quick-javascript-switcher/geddoclleiomckbhadiaipdggiiccfje
Hope it helps you.
Related Questions
-
I want to load my ecommerce site XML via CDN
Hello Experts. My ecommerce site: abcd.com. My ecommerce site sitemap: abcd.com/sitemap.xml. My subdomain: xyz.abcd.com (this is a blank page, but it returns status 200 and is served from the CDN). My ecommerce site sitemap abcd.com/sitemap.xml contains only one link, to the subdomain sitemap xyz.abcd.com/sitemap.xml, and that sitemap contains all the category and product links of abcd.com. So my query is: is the above configuration okay? In Search Console I will add a new property, xyz.abcd.com, and submit the sitemap xyz.abcd.com/sitemap.xml, so Google will be able to report errors for my website abcd.com. Purpose: I want to serve my XML sitemap from a CDN, which is why I created the subdomain xyz.abcd.com. Hope you understood my query. Thanks!
Technical SEO | micey123
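One detail worth checking for a cross-host setup like this: search engines only trust a sitemap on one host that lists URLs on another host if the arrangement is authorised, e.g. by submitting it in Search Console for a verified property, or by referencing it from robots.txt on the main domain. A minimal sketch, using the asker's example hostnames:

```
# robots.txt on abcd.com (hostnames are the asker's examples)
User-agent: *
Allow: /

# Cross-host sitemap reference: the Sitemap directive may point at
# another host, which establishes the CDN subdomain as authorised
Sitemap: https://xyz.abcd.com/sitemap.xml
```

With that in place, submitting xyz.abcd.com/sitemap.xml in Search Console should surface errors for abcd.com URLs as intended.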
Target load time on ecommerce websites in 2017
I have a client that is redeveloping their website in Magento and is interested to know what their target page load time should be. I've read some material on this that's over a year old, and I'm curious whether anyone has a consensus on what the averages are or what we should aim for. I know the simple answer is "as fast as it can be", but I'm wondering if anyone has additional insight. Thanks!
Technical SEO | aedesignco
How fast should a page load to get a green light in Google's PageSpeed?
So, I'm trying to get a big e-commerce site to work on their page loading issues. Their question left me without an answer: how fast should a site be so that it gets a green light in Google's PageSpeed test? Is there a number in seconds? Do we know that?
Technical SEO | ziiiva123
Loading images below the fold? Impact on SEO
I got this from my developers. Does anyone know if this will be an SEO issue? "We hope to lazy-load images below the fold where possible, to increase render speed - are you aware of any potential issues with this approach from an SEO point of view?"
Technical SEO | KatherineWatierOng
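Lazy-loading below-the-fold images is generally fine as long as crawlers can still discover the image URLs (e.g. via a noscript fallback, or in modern browsers the native loading="lazy" attribute). The fold check itself is simple. A toy sketch in JavaScript, with hard-coded positions standing in for what getBoundingClientRect().top would return in a real page; the image names are made up:

```javascript
// An element is "below the fold" if its top edge sits at or past the
// bottom of the viewport on initial render.
function isBelowFold(rectTop, viewportHeight) {
  return rectTop >= viewportHeight;
}

// Images below the fold get their real URL swapped in later (e.g. from a
// data-src attribute), so the initial render ships fewer bytes.
const images = [
  { src: 'hero.jpg', top: 0 },     // visible immediately: load eagerly
  { src: 'footer.jpg', top: 2400 }, // far below an 800px viewport: lazy-load
];
const lazy = images
  .filter(img => isBelowFold(img.top, 800))
  .map(img => img.src);
console.log(lazy); // → [ 'footer.jpg' ]
```

The SEO caveat is simply that whatever swaps data-src in must leave the final URLs discoverable to a crawler that may not scroll or fire your events.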
How to get only the most-needed CSS for faster loading?
I have been using the Firefox duster app to clean up my CSS so that only the page-rendering CSS is loaded when my page loads, but it doesn't seem to be working now. Does anyone know of another tool that will do this for me?
Technical SEO | RoxBrock
Page Load Timings: How accurate is Google Analytics Data?
Hello guys, what are your experiences? How accurate is Google Analytics data regarding page load times? I know that one of my sites has trouble with page load times, especially in India and the USA. We are based in central Europe, and according to the GA data we have a page load time of about 2 seconds here in central Europe, about 4 seconds in the USA, and 10 seconds in India. I therefore decided to test a CDN for a few pages (on these pages, all static files are served over the CDN). However, the first GA data indicates that the page load times are getting even worse! But when I test it, for example with Pingdom (http://tools.pingdom.com/fpt/), and compare it with an old landing page without the CDN implementation, the tool says it's faster. The CDN provider (MaxCDN) also sent me some reports which indicate that the page load time should be faster... That's the reason why I'm asking about your experience with the GA page load time data, because personally I get the impression you cannot trust it. Thanks for your help! Cheers
Technical SEO | _Heiko_
Late-loading content via AJAX - impact on bots
Hi, In an attempt to reduce the latency of our site, we are planning on late-loading all content below the fold via AJAX. My concern is Googlebot won't see this content and won't index it properly (our site is very large and we have lots of content). What is good for our users is not necessarily good for bots. Will late loading AJAX content be read by Googlebot? Thoughts on how to balance speed vs search engine crawl-ability?
Technical SEO | NicB1
OK to block the /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU), but what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we iterate the version number: ?v=1.1... 1.2... 1.3... etc. And the legacy versions show up in Google Webmaster Tools as 404s. For example:
http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the /js/ folder altogether? Isn't that what robots.txt was made for? Just to be clear: we are NOT doing any sneaky redirects or other dodgy JavaScript hacks. We're just trying to power our content and UX elegantly with JavaScript. What do you guys say: obey Matt, or run the JavaScript gauntlet?
Technical SEO | AndreVanKets
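For reference, the block itself is a two-line robots.txt rule, though Google's guidance warns against blocking JS and CSS that pages need for rendering, since it prevents Google from rendering the page as users see it. A sketch using the asker's own folder path:

```
# robots.txt sketch: stop crawlers fetching anything under /js/
# (caution: Google advises NOT blocking JS/CSS required for rendering,
# so this trades render fidelity for crawl budget)
User-agent: *
Disallow: /js/
```

A gentler alternative for the versioned-URL 404s is to leave /js/ crawlable and serve the legacy ?v= URLs with long cache headers, since a 404 on an old asset version is harmless once nothing references it.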