Can someone please help with a technical question?
-
I have noticed that our website's quote tool does not work with active scripting disabled. Is this bad?
How many people have this disabled?
-
Hi Bob,
I want to apologize for the misinformation I gave you; I misread your question, and I'm sorry for that.
Paul,
I apologize if I misled Bob. I misread the question, and all I can do at this point is apologize.
I put the work in because, when I read the question, I thought they were asking about something else; that's why I spent the time trying to answer it. I've spoken to Keri privately, and while I can't say I will never misinterpret something again, I am definitely taking steps to prevent that from happening.
All the best,
Thomas
-
Thomas, why put so much work into another answer that doesn't have anything to do with the original question, and that could actually serve to badly confuse the original poster?
-
Hey Bob, you're in luck. Your own Government Digital Service has run a test that provides the exact answer to your question for UK audiences. According to their experiment in October 2013, approximately 1.1% of visitors to the GOV.UK home page were missing out on JavaScript enhancement.
Of particular interest is the fact that some users didn't get the JavaScript even though they didn't actually have it disabled, as a result of slow connections, network or browser errors, etc.
As the article mentions, this percentage can vary depending on the type of target user, but it's a good general yardstick. It also tallies well with a similar study done by Yahoo in 2010 where the figure was 1.3% for UK users.
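If you want a figure for your own audience rather than GOV.UK's, the same kind of measurement is easy to run yourself. The snippet below is just a rough sketch of the approach, not the GDS code; the /beacon paths are hypothetical logging endpoints you would swap for your own.

```html
<!-- A rough sketch of the measurement approach, not the GDS code.
     The /beacon paths are hypothetical logging endpoints. -->
<img src="/beacon/pageview.gif" alt="" width="1" height="1">
<script>
  // This second beacon is only requested if JavaScript actually runs,
  // so it also excludes users whose scripts failed to load or execute.
  var jsBeacon = new Image();
  jsBeacon.src = '/beacon/js-enabled.gif';
</script>
```

Comparing the two counts in your server logs over a week or so gives you roughly the share of visits where JavaScript actually ran, which is the figure GDS reported.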
That said though - you absolutely don't want the primary conversion mechanism on your site to be entirely dependent on active scripting. Done properly, the form should work at a basic level even without JavaScript, with additional functionality provided for those with scripting enabled (form validation, etc.). This is the "progressive enhancement" to which the above article refers.
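To make that concrete, here is a minimal sketch of a progressively enhanced quote form; the action URL, field name and validation rule are made up for illustration, not taken from your site.

```html
<!-- Minimal sketch: the form works as a plain server round trip, and the
     script only layers extra behaviour on top when scripting is available.
     The action URL and field name are placeholders. -->
<form id="quote-form" action="/quote" method="post">
  <label for="postcode">Postcode</label>
  <input id="postcode" name="postcode" type="text">
  <button type="submit">Get a quote</button>
</form>

<script>
  // Enhancement layer: client-side validation that saves a round trip.
  // With scripting disabled this never runs, and the server still
  // validates the input and returns the quote page as normal.
  document.getElementById('quote-form').addEventListener('submit', function (e) {
    var postcode = document.getElementById('postcode').value.trim();
    if (postcode === '') {
      e.preventDefault();
      alert('Please enter your postcode before requesting a quote.');
    }
  });
</script>
```

The point is that the form's round trip works on its own; the script only spares users with JavaScript an extra trip to the server.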
Hope that answers your question?
Paul
-
Hi Keri,
Thank you for bringing that to my attention.
OP,
In terms of how many people actually have it disabled in their browser, I think that's extremely hard to find out. I linked to builtwith.com, which can tell you how many of the top million sites need JavaScript to operate correctly.
http://trends.builtwith.com/javascript
I looked around for a number and really couldn't find one. I believe that because Chrome is the biggest browser on the web right now, and it is recommended to leave JavaScript on, many people are browsing with active JavaScript enabled.
http://mashable.com/2012/05/21/chrome-is-tops/
There is no doubt, though, that by using active JavaScript you open your computer up to attacks, whether it's a Mac, a PC, Linux, whatever.
Here's some information on why you might want to turn it off.
http://nakedsecurity.sophos.com/2012/08/30/how-turn-off-java-browser/
This link has information on the dangers as well, even though it is written from the perspective of enabling it:
http://www.alanwood.net/demos/enabling-javascript.html
If you're worried about this, one thing I would recommend is using a service that blocks sites known to serve malware, whether it arrives via JavaScript or in other forms.
I personally use http://dyn.com/labs/dyn-internet-guide/
You can also use:
http://www.neustar.biz/enterprise/dns-services/recursive-dns-faqs
It is simply a matter of changing your recursive DNS name servers.
These services use their own algorithms, along with Barracuda's malware technology, to show you a "this site is known for malware/spyware" screen. If you go to a bad site, or a site tries to redirect you to one, they will stop it.
I would use namebench, a Google tool that will tell you which recursive name servers are the quickest for whatever region of the world you're in.
After that, I would choose a service that blocks these types of attacks. Google's public DNS, for instance, will not be of any assistance here: you might get the fastest lookups using it, but you don't want to switch your DNS servers over to it, because you won't get the benefit of malware/spyware blocking.
The only two that I can tell you I have used with success are OpenDNS and Dyn.
A better explanation of exactly how to set this up on your computer is in this link from OpenDNS:
http://use.opendns.com/
Dyn's instructions are here:
http://dyn.com/labs/dyn-internet-guide/
Whether you use Dyn or OpenDNS, the setup is identical, so follow the instructions in either link above. You can also, of course, set this up on your router; that's what I prefer, because then everything on your network is protected.
Dyn Setup For DNS Veterans
Replace your current DNS resolvers with the following:
- resolver1.dyndnsinternetguide.com – 216.146.35.35
- resolver2.dyndnsinternetguide.com – 216.146.36.36
OpenDNS
The URL and setup instructions for OpenDNS are in the link above; its resolver addresses are:
- 208.67.222.222
- 208.67.220.220
Another excellent service that does the same things as the others, and has a very good guide to implementing recursive DNS, is Neustar:
http://www.neustar.biz/enterprise/dns-services/free-recursive-dns
Neustar DNS Advantage addresses:
- 156.154.70.1
- 156.154.71.1
I would always recommend that, no matter what system you run, you use an antivirus program. I use Macs, and some people say running antivirus on a Mac is a waste of time and money; I don't agree, and would recommend a standard antivirus program no matter what type of computer you're running. However, to block most JavaScript-based threats you can do a lot on the network side with DNS setups like the ones I've described.
One last thing: Akamai's CDN has an issue with anycast recursive DNS servers, meaning some websites that use the newest version of Akamai's content delivery network will load slightly more slowly. You can get by without noticing it if you have at least a cable connection. I just thought I would let you know.
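If you want to confirm which resolvers your machine actually ended up using after making the change, here is a quick sketch you could run with Node.js (this assumes Node is installed; the address list is simply the set of resolvers mentioned above):

```javascript
// Quick sanity check, assuming Node.js is installed: prints the recursive
// resolvers your operating system is currently handing to applications,
// so you can confirm the switch to OpenDNS / Dyn / Neustar actually took.
const dns = require('dns');

// Addresses listed earlier in this answer.
const knownResolvers = {
  '216.146.35.35': 'Dyn Internet Guide',
  '216.146.36.36': 'Dyn Internet Guide',
  '208.67.222.222': 'OpenDNS',
  '208.67.220.220': 'OpenDNS',
  '156.154.70.1': 'Neustar DNS Advantage',
  '156.154.71.1': 'Neustar DNS Advantage'
};

dns.getServers().forEach(function (ip) {
  console.log(ip + ' -> ' + (knownResolvers[ip] || 'not one of the filtering resolvers listed above'));
});
```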
Sincerely,
Thomas
-
The OP is actually asking how many other users have it disabled, not how to personally enable or disable it.
-
Well, without it you cannot do quite a few things. Here is an article talking about how to enable it, if that's what you wish to do. In order to get your website's quote tool to work, you should follow these instructions, or check out:
http://activatejavascript.org/en/
I have it enabled personally; I don't know how many others do. builtwith.com would be a great place to find out that type of information.
Sincerely,
Thomas