Googlebot crawl error: JavaScript method is not defined
-
Hi All,
I have a problem that has been a pain in the ****. My logs show tons of crawl errors from Googlebot saying a specific JavaScript method is not defined. When I then open the affected page in a web browser, it works without any JavaScript errors.
Can someone help with resolving this issue?
Thanks in advance.
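For what it's worth, one common cause of this pattern (page clean in a normal browser, errors only for Googlebot) is that Googlebot's rendering engine may lack an API your scripts assume exists. A minimal, hypothetical sketch of guarding such calls; the function and method names here are illustrative, not taken from the affected site:

```javascript
// Hypothetical sketch: Googlebot's renderer has historically lagged
// behind current browsers (it was based on Chrome 41 for years), so a
// method that exists in your desktop browser can be undefined when
// Googlebot renders the page. Feature-detecting before calling avoids
// the crawl-time "method is not defined" error.
function safeSmoothScroll(element) {
  // scrollIntoView with an options object is missing in some older
  // engines; skip the call when the method is not a function.
  if (element && typeof element.scrollIntoView === "function") {
    element.scrollIntoView({ behavior: "smooth" });
  }
}

// Same pattern for a global API a page script might assume exists:
const hasObserver =
  typeof window !== "undefined" &&
  typeof window.IntersectionObserver === "function";
```

The idea is simply that a missing method degrades gracefully instead of throwing during the render.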
-
Can you post a log file?
If you don't want the domain shown, search & replace it with example.com.
Or show us a screenshot of the problem?
Or use Screaming Frog to run a JavaScript rendering test?
-
I agree with Effectdigital; we would need to see a copy of the page to be able to help with that.
It's more common than you think.
Maybe share the page from Google Search Console?
Hope this helps,
Tom
-
I think, with this being such a niche query, we'd really need to see an example of a page that is triggering the error to even attempt to help!