Is there such a thing as white hat cloaking?
-
We are near the end of a site redesign and have just discovered that the site is built in JavaScript and isn't search-engine friendly. Our IT team's fix is to show crawlable content to Googlebot and other crawlers by detecting their user agents. I told them this is cloaking and that I'm not comfortable with it. They said that, based on their research, it's an acceptable way to cloak as long as the content is essentially the same. About 90% of the content will be identical between the version served to regular users and the version served to Googlebot. Does anyone have experience with this? Are there any recent articles or best practices on the subject?
Thanks!
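For context, the approach the IT team is describing boils down to branching on the User-Agent header before deciding what HTML to return. A minimal sketch of that mechanism (the crawler tokens and template names here are illustrative assumptions, not a recommendation — this is exactly the behavior Google's cloaking guidelines scrutinize):

```python
import re

# Illustrative crawler tokens only; robust crawler verification would also
# confirm Googlebot via reverse DNS rather than trusting the header alone.
CRAWLER_PATTERN = re.compile(r"googlebot|bingbot|slurp|duckduckbot", re.IGNORECASE)

def is_crawler(user_agent: str) -> bool:
    """Return True when the User-Agent header matches a known crawler token."""
    return bool(CRAWLER_PATTERN.search(user_agent or ""))

def select_template(user_agent: str) -> str:
    """Serve a static HTML fallback to crawlers and the JS app to everyone else."""
    return "static_fallback.html" if is_crawler(user_agent) else "js_app.html"
```

Whether this counts as cloaking hinges entirely on the two templates rendering the same content; the mechanism itself is identical to black-hat cloaking.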
-
We have the same issue with our site HelloCoin: it's pure AJAX/JavaScript, so we build a second, no-JavaScript version of every page for Googlebot to crawl. We keep it as similar as possible to the original (user) version. Just don't hide anything, and show everything as it is. Some functionality might not work, but that isn't a problem; Google just wants to see how the page looks to the user, not how it works.
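Google's (since-deprecated) AJAX crawling scheme formalized exactly this kind of parallel no-JavaScript version: pages using `#!` URLs were fetched by Googlebot at an `_escaped_fragment_` URL, where the server returned a static HTML snapshot. A rough sketch of that URL mapping, assuming hash-bang URLs:

```python
from urllib.parse import quote

def escaped_fragment_url(hashbang_url: str) -> str:
    """Map a #! URL to the _escaped_fragment_ URL a crawler requests for the
    static HTML snapshot under the (deprecated) AJAX crawling scheme."""
    if "#!" not in hashbang_url:
        return hashbang_url  # not an AJAX-crawlable URL; fetched as-is
    base, fragment = hashbang_url.split("#!", 1)
    separator = "&" if "?" in base else "?"
    # The fragment state is percent-encoded into a query parameter.
    return f"{base}{separator}_escaped_fragment_={quote(fragment, safe='')}"
```

The server then keys the static snapshot off the `_escaped_fragment_` parameter instead of sniffing user agents, which keeps user and crawler URLs explicitly paired.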
-
It is acceptable and quite common. Imagine you had a 100% Flash site. The bots can figure out some of the content, but not much, so they actually need you to serve up a different version of your site so that they know what's there and can index you properly. As long as the content is the same, it shouldn't be an issue.
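Since "the content is the same" is the whole defense here, it's worth actually measuring that overlap before shipping. A rough sketch of a sanity check, assuming you've already extracted the visible text from each rendered version (the ~0.9 threshold mirrors the 90% figure mentioned in the question and is an assumption, not an official cutoff):

```python
import difflib

def content_similarity(user_text: str, crawler_text: str) -> float:
    """Word-level similarity (0.0 to 1.0) between the text users see and the
    text served to crawlers; values well below ~0.9 deserve a closer look."""
    user_words = user_text.split()
    crawler_words = crawler_text.split()
    return difflib.SequenceMatcher(None, user_words, crawler_words).ratio()
```

Running this across a sample of page pairs gives you a concrete number to point to if the "same content" claim is ever questioned.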