Different user experience with JavaScript on/off
-
I was wondering whether a site serving a different user experience when JS is disabled counts as a sort of cloaking.
-
Yes, Dan is correct. As long as the intent is not malicious, you should be fine. It is also a common practice to display a JS overlay before the actual site is served. For example, adult and liquor-related websites show an age gate using a JS overlay so that human visitors must confirm their age before accessing the site, while search engine bots (like Googlebot) do not see the JS overlay and can access the content directly. With this kind of setup in place, there is nothing to worry about regarding a different experience being served to visitors and bots; it is definitely not considered cloaking. Hope it helps.
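A minimal sketch of the overlay pattern described above (the element ID and cookie name are hypothetical): the full content is always in the served HTML, so a crawler that never runs the script still sees it, and the gate is only layered on top for human visitors.

```javascript
// Hypothetical age-gate sketch. The page's real content is always present in
// the served HTML; this script only decides whether to display the overlay
// that visually covers it, based on a confirmation cookie.
function shouldShowAgeGate(cookieString) {
  // Show the gate unless the visitor has previously confirmed their age.
  return !/(^|;\s*)ageConfirmed=true(;|$)/.test(cookieString || "");
}

function confirmAge(doc) {
  // Called by the overlay's hypothetical "I am over 18" button.
  doc.cookie = "ageConfirmed=true; max-age=31536000; path=/";
  const gate = doc.getElementById("age-gate");
  if (gate) gate.style.display = "none";
}

if (typeof document !== "undefined") {
  const gate = document.getElementById("age-gate");
  if (gate && shouldShowAgeGate(document.cookie)) {
    gate.style.display = "block"; // overlay is hidden by default in the CSS
  }
}
```

Because the gating happens purely client-side, a bot that doesn't execute (or persist cookies for) this script simply indexes the underlying page.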
Best regards,
Devanur Rafi
-
This is probably not cloaking as long as it's not malicious. For the most part, engines default to a non-JS view of the page, so that is the version they will see anyway. You can check Google's text-only cache of the page to see how they are seeing it.
Related Questions
-
Googlebot crawl error: JavaScript method is not defined
Hi All, I have a problem that has been a pain in the ****. I get tons of crawl errors in my logs from Googlebot saying a specific JavaScript method does not exist. I then go to the affected page, test it in a web browser, and the page works without any JavaScript errors. Can someone help with resolving this issue? Thanks in advance.
Technical SEO | | FreddyKgapza
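One common cause of this symptom is that a crawler fetched the page but not every script on it (a JS file blocked by robots.txt, or one that timed out), so a method defined in another file genuinely is undefined at crawl time even though the page works in a normal browser. A defensive sketch, with hypothetical names:

```javascript
// Hypothetical sketch: guard calls to functions defined in other script files
// instead of assuming they exist, since a crawler may have skipped the file
// that defines them.
function safeCall(scope, name, ...args) {
  if (typeof scope[name] === "function") {
    return scope[name](...args);
  }
  return undefined; // method missing, e.g. its script was never fetched
}
```

Checking which scripts robots.txt blocks, and testing the page with a crawler-style fetch rather than only a browser, is usually the next step.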
Heroku, Subdomains, redirecting to different servers & SEO
Hi all, I've got a big question. I'm helping out with digital marketing at a non-profit that I volunteer with. We're working on developing an on-site blog, because we've realized we need that middle-of-the-funnel content. The current system has been taking up much of the developers' time while we work out its kinks, so I'm trying to alleviate that with a lightweight blog solution. We're considering installing a WordPress site on a separate server from the main site and sending traffic over to it. The primary questions: Do we negatively impact our main site by having a link to a different domain in the top nav? Do we positively impact the main site by developing SEO content on a redirected and masked URL? What are the ramifications of redirecting a subdomain to another server? Would we be better off using something like Ghost? A good example would be:
Visit theMainDomain
Click: blog on topnav
Redirect to: theSecondDomain/blog [masked as theMainDomain/blog]
Click: home on topnav
Redirect to: theFirstDomain/blog
Technical SEO | | Alternatively.marketing
Different breadcrumbs for each product page
Hi all, I have a question related to breadcrumbs. We have an e-commerce site, and there is a difference in the breadcrumb when navigating to our products versus browsing directly to the product URL. When you navigate to the product, the breadcrumb looks like this (also in the source code):
Home > Sand > Sandpit sand > Bigbag Sandpit sand type xyz
When you visit the product URL directly, the breadcrumb looks like this (also in the source code):
Home > Bigbag Sandpit sand type xyz
It looks to me like this can be confusing for a search engine, since it is unclear what the site's structure/hierarchy is (and it is confusing for a user too, of course). Is that true? If yes, does this have a big direct negative impact on SEO? Thanks in advance!
Technical SEO | | AMAGARD
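One common way to give search engines a single, consistent view of the hierarchy regardless of how the visitor arrived is BreadcrumbList structured data, generated from the product's category position rather than the navigation path taken. A minimal sketch; the domain, paths, and product names below are placeholders, not the real site's:

```javascript
// Hypothetical sketch: emit one canonical BreadcrumbList per product page,
// built from the category hierarchy, independent of how the visitor navigated.
function breadcrumbJsonLd(baseUrl, trail) {
  return {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    itemListElement: trail.map((crumb, i) => ({
      "@type": "ListItem",
      position: i + 1,       // positions are 1-based
      name: crumb.name,
      item: baseUrl + crumb.path,
    })),
  };
}

// Example trail for a product like the one above (placeholder URLs):
const jsonLd = breadcrumbJsonLd("https://www.example.com", [
  { name: "Home", path: "/" },
  { name: "Sand", path: "/sand" },
  { name: "Sandpit sand", path: "/sand/sandpit-sand" },
  { name: "Bigbag Sandpit sand type xyz", path: "/sand/sandpit-sand/bigbag-xyz" },
]);
```

The resulting object would be serialized into a script tag of type application/ld+json on the product page, so the markup stays the same whichever route the visitor took.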
Setting up a site with different extensions (.co.uk and .com)
Hi, I am setting up a new site but have bought two domains to cover those who may type the wrong version, so I have regionwithchildren.co.uk and regionwithchildren.com. I am setting up both on my WordPress host with a coming-soon page (to include social links and a sign-up form), but I had a few questions: as the main site is .co.uk, should I just set up a redirect from the .com to the .co.uk? The root folders of the two would be the same (regionwithchildren), and the host can't have two identical folders, so what should I change the .com one to? Any other considerations for this kind of setup would be much appreciated. Thanks, Neil
Technical SEO | | neilhenderson
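The redirect logic itself is small; a sketch of the host check, assuming a Node front end (on most hosts this would instead be a one-line 301 rule, and whether the www variants exist is an assumption here):

```javascript
// Hypothetical sketch: compute the 301 target so every request on the .com
// host lands on the same path on the canonical .co.uk host.
function redirectTarget(host, path) {
  const comHosts = ["regionwithchildren.com", "www.regionwithchildren.com"];
  if (comHosts.includes(host)) {
    return "https://regionwithchildren.co.uk" + path;
  }
  return null; // already on the canonical host; no redirect needed
}
```

Redirecting path-for-path (rather than sending everything to the .co.uk homepage) preserves deep links if the .com ever gets linked to directly.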
JavaScript tabbed navigation and duplicate content
I'm working on a site that has four primary navigation links, each with a tabbed navigation system for second-tier items. The primary link page loads the content for all tabs, which are JavaScript-controlled. Users click a primary navigation item such as "Our Difference" (http://www.holidaytreefarm.com/content.cfm/Our-Difference) and get several options, with each tab's content in a separate section. Each second-tier tab is also available via sitemap/direct link (e.g. http://www.holidaytreefarm.com/content.cfm/Our-Difference/Tree-Logistics) without the JS navigation, so the content on that page is specific to the tab, not all tabs. In this scenario, will there be duplicate content issues? And what is the best way to remedy this? Thanks for your help!
Technical SEO | | Total-Design-Shop
Different version of site for "users" who don't accept cookies considered cloaking?
Hi, I've got a client with lots of content that is hidden behind a registration form; if you don't fill it out, you cannot proceed to the content. As a result it is not being indexed. No surprises there. They are only doing this because they feel it is the best way of capturing email addresses, rather than because they need to "protect" the content. Currently, visitors arriving on the site are redirected to the form unless a "this user is registered" cookie was set previously. If the cookie is set, they aren't redirected and get to see the content. I am considering changing this logic to only redirect visitors to the form if they accept cookies but haven't got the "this user is registered" cookie. The idea is that search engines would then not be redirected and would index the full site rather than the dead-end form. From the client's perspective this would mean only very few non-registered visitors would avoid the form, yet search engines are arguably not being treated as a special case. So my question is: would this be considered cloaking/put the site at risk in any way? (They would prefer not to go down the First Click Free route as this will lower their email sign-ups.) Thank you!
Technical SEO | | TimBarlow
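The proposed logic can be sketched as a small decision function (cookie names are hypothetical). Whether it is safe is exactly the question above, but this is the behaviour being described: a test cookie set on the first response reveals on later requests whether the client accepts cookies at all, and clients that never send it back, which includes most crawlers, fall through to the content:

```javascript
// Hypothetical sketch of the proposed gate. "testCookie" is set on the first
// response; a client that accepts cookies will send it back on subsequent
// requests, while most crawlers will not.
function gateDecision(cookies) {
  const acceptsCookies = cookies.includes("testCookie=1");
  const isRegistered = cookies.includes("registered=true");
  if (acceptsCookies && !isRegistered) {
    return "form"; // human visitor who hasn't registered yet
  }
  return "content"; // registered visitor, or a client that ignores cookies
}
```

Note the rule is cookie behaviour, not user-agent sniffing, which is the basis of the "search engines aren't a special case" argument in the question.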
Page Analysis Difference Between Root and Subdomain
I have a site where the canonical version is the www subdomain, with a permanent redirect to ensure this is so. When I do a page analysis from the MozBar for the domain, I see that www and *.domain are both displayed, with the numbers from *.domain shown by default in the MozBar. Does the MozBar show *.domain numbers by default, and do I correctly understand that the (higher) www numbers displayed in the page analysis for www are valid and a result of my canonical strategy?
Technical SEO | | waynekolenchuk
Different Results in Chrome, Firefox and IE?
I clear the cache and log out of any accounts, and I still get different results for the same keyword in different browsers. Any idea what's going on? And which browser would show my true ranking?
Technical SEO | | musillawfirm