JS loading blocker
-
Is there a tool or Chrome extension I can use to load a page, identify the .js files on it, 'uncheck' selected files, and reload the page to check it still loads correctly? Even better would be the ability to defer selected scripts, or move them to the end of the file, for testing.
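For the first step (identifying every external script on a page), a console one-liner will do. Below is a sketch: the listing logic is kept in a plain function so it does not depend on running in a browser, and the names are purely illustrative.

```javascript
// List every external script URL on a page. Pass in `document` from the
// browser console; the function only relies on querySelectorAll, so any
// document-like object works.
function listScripts(doc) {
  return Array.from(doc.querySelectorAll('script[src]')).map(s => s.src);
}

// On a real page, open DevTools (F12) and run:
//   console.table(listScripts(document));
```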
-
Thanks for checking in, Mick!
-
Sorry for the delay. I got sidetracked on another project, and this client decided to leave the .js as is for the time being, so I have not really tested. Initially I couldn't get the Chrome extension to do what I wanted, and I need to look at Firefox.
-
Hi Mick, did you find what you were looking for? We'd love an update. Thanks!
Christy
-
Thanks. I'll give it a try and let you know.
-
Hey Mick,
I use Firebug. There is a version for Chrome, but it was originally built for Firefox. It offers full JavaScript debugging: breakpoints, conditional breakpoints, watch expressions, stepping, and profiling.
Chrome version here: https://getfirebug.com/releases/lite/chrome/
Hope this helps,
Don
-
I've found this discussion about the same subject if you want to have a look:
stackoverflow.com/questions/9698059/disable-single-javascript-file-with-addon-or-extension
Sorry, but I can't help you more than this.
Good luck
-
Thanks, that's quite handy but not what I need in this case. This tool seems to switch JavaScript off for the whole page. I'm looking for something where I can cherry-pick the .js files on the page I want to block, or ideally move.
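If no extension fits, the cherry-picking itself can be scripted. The sketch below assumes a userscript environment (e.g., Tampermonkey set to run at document-start): it watches the DOM as the page is parsed and neutralizes any script whose URL matches a blocklist by rewriting its type attribute, so the browser never executes it. The BLOCKLIST entries are hypothetical file names; the matching logic is a pure function so it can be tested on its own.

```javascript
// Substring patterns for the scripts to block; these names are illustrative.
const BLOCKLIST = ['analytics.js', 'tracker.js'];

// Pure matching logic: does this script URL hit any blocklist pattern?
function matchesBlocklist(src, blocklist) {
  return blocklist.some(pattern => src.includes(pattern));
}

// Browser-only part: as <script> tags are inserted by the parser, flip the
// type of any match to an unknown value, which stops it from executing.
if (typeof MutationObserver !== 'undefined') {
  new MutationObserver(mutations => {
    for (const m of mutations) {
      for (const node of m.addedNodes) {
        if (node.tagName === 'SCRIPT' && node.src &&
            matchesBlocklist(node.src, BLOCKLIST)) {
          node.type = 'javascript/blocked'; // unknown type => not executed
        }
      }
    }
  }).observe(document.documentElement, { childList: true, subtree: true });
}
```

Note that blocking this way is much more reliable than deferring: by the time the observer sees a parser-inserted script, adding a defer attribute is generally too late to change when it runs.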
-
Hi,
You can find what you're looking for here: https://chrome.google.com/webstore/detail/quick-javascript-switcher/geddoclleiomckbhadiaipdggiiccfje
Hope it helps.