How much JavaScript does Googlebot read?
-
We have a site with certain navigational links that exist solely for the human user. These links improve the user experience and lead to pages we don't need crawled by Googlebot. We generate these links with JavaScript, so with JavaScript disabled they are invisible. Will these links be considered cloaking, even though our intention is not to cloak but to save our Google crawl budget for the pages we do want indexed?
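A minimal sketch of the pattern the question describes: navigation links that only exist once JavaScript runs, so a non-executing crawler never sees them in the markup. The function name, element id, and link targets below are hypothetical, purely for illustration.

```javascript
// Build the user-only navigation as a string of anchors.
// Because the markup is created client-side, a crawler that does not
// execute JavaScript never encounters these links at all.
function renderUserNavLinks(links) {
  return links
    .map(({ href, label }) => `<a href="${href}">${label}</a>`)
    .join("\n");
}

// On the live page this would be injected after load, e.g.:
// document.addEventListener("DOMContentLoaded", () => {
//   document.getElementById("user-nav").innerHTML =
//     renderUserNavLinks(navLinks);
// });

const html = renderUserNavLinks([
  { href: "/account/settings", label: "My settings" },
  { href: "/help/faq", label: "FAQ" },
]);
console.log(html);
```

With JavaScript disabled the `user-nav` container simply stays empty, which is exactly the behaviour the question describes.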
-
Hi CruiseControl, if you want to see how Google views your website you can download a tool called Lynx. Lynx is a text-based browser, which is very similar to how Google's crawler views your website.
-
Thank you all for your input.
-
I wrote up a reply, then decided to investigate a point and found a good interview with Matt Cutts from 2010. The relevant quotes are:
Matt Cutts: For a while, we were scanning within JavaScript, and we were looking for links. Google has gotten smarter about JavaScript and can execute some JavaScript. I wouldn't say that we execute all JavaScript, so there are some conditions in which we don't execute JavaScript.
Eric Enge: If someone did choose to do that (JavaScript encoded links or use an iFrame), would that be viewed as a spammy activity or just potentially a waste of their time?
Matt Cutts: I am not sure that it would be viewed as a spammy activity, but the original changes to NoFollow to make PageRank Sculpting less effective are at least partly motivated because the search quality people involved wanted to see the same or similar linkage for users as for search engines. In general, I think you want your users to be going where the search engines go, and that you want the search engines to be going where the users go.
Article link: http://www.stonetemple.com/articles/interview-matt-cutts-012510.shtml
-
There are circumstances where you are allowed to use 'cloaking', as some very influential websites have done. However, in your particular situation, a nofollow tag on the link and a noindex tag on the target page would be the 'normal' procedure.
Personally, I think it is a grey area. You are not using the JavaScript to hide content as such, and provided you are clearly not trying to manipulate the system, there should be no reason why you would be penalised for it.
-
I would say yes, they are cloaked links. I would suggest using plain HTML links for maximum link juice and to avoid angering Googlebot: serving different content to users with and without JavaScript is a no-no. As for your crawl budget, best practice is to put a nofollow tag on the link and a noindex on the target page if you don't want it in the SERPs.
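The nofollow-plus-noindex combination this answer recommends can be sketched as follows. The helper names are hypothetical; in practice you would simply write the attribute and meta tag straight into your templates.

```javascript
// Render a link with rel="nofollow" so crawlers are asked not to
// follow it (and not to pass link equity through it).
function nofollowLink(href, label) {
  return `<a href="${href}" rel="nofollow">${label}</a>`;
}

// Meta tag for the <head> of the target page, asking search engines
// to keep that page out of their index (the SERPs).
function noindexMeta() {
  return '<meta name="robots" content="noindex">';
}

console.log(nofollowLink("/internal/tools", "Internal tools"));
console.log(noindexMeta());
```

The nofollow on the link discourages crawling through it, while the noindex on the target page keeps it out of the results even if Google reaches it some other way.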