Can someone please help with a technical question?
-
I have noticed that our website's quote tool does not work with active scripting disabled. Is this bad?
How many people have this disabled?
-
Hi Bob,
I want to apologize for the misinformation I gave you; I misread your question.
Paul,
I apologize if I misled Bob. I misread the question, and all I can do at this point is apologize.
I put the work in because, when I read the question, I thought they were asking about something else; that's why I spent the time trying to answer it. I've spoken to Keri privately, and while I can't promise I will never misinterpret something again, I am definitely taking steps to prevent it from happening.
All the best,
Thomas
-
Thomas, why put so much work into an answer that has nothing to do with the original question, and that could actually confuse the original poster?
-
Hey Bob, you're in luck. Your own Government Digital Service has run a test that provides the exact answer to your question for UK audiences. In their October 2013 experiment, approximately 1.1% of visitors to the GOV.UK home page were missing out on JavaScript enhancement.
Of particular interest is the fact that some users didn't get the JavaScript even though they didn't actually have it disabled - as a result of slow connections, network or browser errors etc.
As the article mentions, this percentage can vary depending on the type of target user, but it's a good general yardstick. It also tallies well with a similar study done by Yahoo in 2010 where the figure was 1.3% for UK users.
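To put that percentage in concrete terms, here's a quick back-of-the-envelope calculation; the monthly traffic figure is hypothetical, so substitute your own analytics numbers:

```python
# Rough impact estimate: what a ~1.1% no-JavaScript rate means for a
# quote form that breaks entirely without scripting.

NO_JS_RATE = 0.011          # GOV.UK measurement, October 2013
monthly_visits = 50_000     # hypothetical example traffic

lost_visitors = monthly_visits * NO_JS_RATE
print(f"Visitors who cannot use the form: ~{lost_visitors:.0f} per month")
```

Even at a modest conversion rate, several hundred blocked visitors a month is real money for a site whose primary conversion mechanism is that form.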
That said, you absolutely don't want the primary conversion mechanism on your site to be entirely dependent on active scripting. Done properly, the form should work at a basic level even without JavaScript, with additional functionality (instant form validation, etc.) layered on for those with scripting enabled. This is the "progressive enhancement" the article above refers to.
Hope that answers your question.
Paul
-
Hi Keri,
Thank you for bringing that to my attention.
OP,
In terms of how many people actually have it disabled in their browser, I think that's extremely hard to find out. I linked to builtwith.com, which can tell you how many of the top million sites need JavaScript to operate correctly.
http://trends.builtwith.com/javascript
I looked around for a number and couldn't find one. I believe that because Chrome is currently the biggest browser on the web, and it ships with JavaScript enabled, most people are browsing with active scripting turned on.
http://mashable.com/2012/05/21/chrome-is-tops/
Without a doubt, though, there is a risk: enabling active scripting opens your computer up to attacks, whether it's a Mac, a PC, Linux, or anything else.
Here's some information on why you might want to turn it off.
http://nakedsecurity.sophos.com/2012/08/30/how-turn-off-java-browser/
This link also has information on the dangers, even though it's about enabling JavaScript:
http://www.alanwood.net/demos/enabling-javascript.html
If you're worried about this, one thing I would recommend is using a service that blocks sites known to distribute malware, whether via JavaScript or in other forms.
I personally use http://dyn.com/labs/dyn-internet-guide/
You can also use:
http://www.neustar.biz/enterprise/dns-services/recursive-dns-faqs
It is simply a matter of changing your recursive DNS name servers. Dyn uses its own algorithms, along with Barracuda's malware technology, to show you a "this site is known for malware/spyware" screen; if you go to a bad site, or a site tries to redirect you, it will stop the request.
I would use namebench, a Google tool that tells you which recursive name servers are the quickest for whatever region of the world you're in.
After that, I would choose a service that blocks these types of attacks. Google's Public DNS, for instance, will not be of any assistance here: you might get the fastest lookups using it, but you won't get the benefit of malware/spyware blocking, so I wouldn't switch your DNS servers over to it.
The only two I can tell you I have used with success are OpenDNS and Dyn.
A better explanation of exactly how to set this up on your computer is in this link from OpenDNS:
http://use.opendns.com/
Dyn's equivalent is here:
http://dyn.com/labs/dyn-internet-guide/
However, whether you use Dyn or OpenDNS, the setup is identical, so follow the instructions in the links above to set your computer up. You can also, of course, set it up on your router; that's what I prefer, because everything on your network is then protected.
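Once you've switched, a quick way to confirm the change took effect, assuming a Unix-like system where /etc/resolv.conf lists the active nameservers (on routers or managed systems the file may be overwritten, so check your network settings UI instead), is to read the nameserver lines back out:

```python
def nameservers(resolv_conf_text: str) -> list[str]:
    """Extract nameserver addresses from resolv.conf-style text."""
    servers = []
    for line in resolv_conf_text.splitlines():
        parts = line.split()
        if len(parts) >= 2 and parts[0] == "nameserver":
            servers.append(parts[1])
    return servers

# Example resolv.conf contents after switching to OpenDNS:
sample = """# generated by the network manager
nameserver 208.67.222.222
nameserver 208.67.220.220
"""
print(nameservers(sample))  # the OpenDNS resolvers are active
```

If the addresses printed are your ISP's rather than the ones you configured, the change hasn't propagated to the machine yet.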
Dyn Setup For DNS Veterans
Replace your current DNS resolvers with the following:
resolver1.dyndnsinternetguide.com – 216.146.35.35
resolver2.dyndnsinternetguide.com – 216.146.36.36
OpenDNS Setup
The URL for OpenDNS, with instructions on how to set it up, is given above. Replace your current DNS resolvers with the following:
- 208.67.222.222
- 208.67.220.220
Another excellent service that does the same things as the others, with very good instructions on implementing recursive DNS, is:
http://www.neustar.biz/enterprise/dns-services/free-recursive-dns
Neustar DNS Advantage addresses:
156.154.70.1
156.154.71.1
I would always recommend that, no matter what system you run, you use an antivirus program. I use Macs, for instance, and some people say Mac users are wasting time and money on antivirus; I don't agree, and I'd recommend a standard antivirus program no matter what type of computer you're running. However, to prevent most JavaScript-based threats, you can do a lot on the network side with DNS setups like the ones I've described.
One last thing: Akamai's CDN has an issue with anycast recursive DNS servers, meaning some websites that use the latest version of Akamai's content delivery network will be slightly slower. You can get by without noticing it if you have at least a cable connection; I just thought I would let you know.
Sincerely,
Thomas
-
The OP is actually asking how many other users have it disabled, not how to personally enable or disable it.
-
Well, without it you cannot do quite a few things. Here is an article on how to enable it, if that's what you wish to do; to get your website's quote tool to work, you would follow these instructions.
or check out
http://activatejavascript.org/en/
I have it enabled personally. I don't know how many others do; builtwith.com would be a great place to find that type of information.
Sincerely,
Thomas