Showing Different Content to Members & Non-Members/Google: Cloaking Risk?
-
How do we safely show logged-in members (and Google) one type of content on a page, and logged-out non-members another kind of content, without getting slammed for cloaking?
Right now we show Google everything on the page, but new visitors see only partial forum comments plus a pitch to sign up and see the full comments. So far, we have not gotten into trouble for this.
The new idea is to show non-members a lot of marketing messages and one kind of navigation, and then, once they sign up and are logged in, show different (or no) marketing messages and a different kind of navigation.
How do we stay out of trouble with this? Where is the cloaking line drawn? It's got me kinda nervous.
Thanks... Darcy
-
Wow... I didn't know this! Thanks, Dirk, for putting me in the 5,000 Moz points club!
-
Hi Marie,
Couldn't resist liking this - I noticed that you were only one like away from reaching the Moz Valhalla...
Congrats,
Dirk
-
I agree with Dirk. This sounds like cloaking. It would be best to only show Google the content that non-members can see.
If you show Google content that a non-member can't see, then that is cloaking and could get you penalized. Even if it doesn't get you penalized, it could get you into Panda trouble. Let's say I am searching for something and I see a Google result showing that your site has the answer to my query. I click through and realize that I can only see this content if I'm a member. I don't want to become a member, so I click away and find another site to read. If enough users do this, it's a signal to Google (and likely to Panda) that readers don't like your site.
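If it helps, here is a minimal sketch of the safe pattern in Python/Flask (the route, template, and data-access names are hypothetical): gate the full content on the login session, never on the user agent, so Googlebot and logged-out visitors always receive the identical teaser page.

# Minimal membership-gating sketch that avoids cloaking (Flask;
# route, template, and data-access names are hypothetical).
from flask import Flask, render_template, session

app = Flask(__name__)
app.secret_key = "change-me"

def load_thread(thread_id):
    # Placeholder for your forum's data layer.
    return {"title": "Example thread", "comments": ["c1", "c2", "c3"]}

@app.route("/thread/<int:thread_id>")
def thread(thread_id):
    data = load_thread(thread_id)
    if session.get("member_id"):
        # Logged-in members see every comment.
        return render_template("thread_full.html", thread=data)
    # Everyone else -- Googlebot included -- sees the same teaser:
    # the first comment plus a sign-up pitch. No user-agent checks.
    teaser = dict(data, comments=data["comments"][:1])
    return render_template("thread_teaser.html", thread=teaser)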
-
Hi Darcy,
If you apply Google's strict definition, you are "inserting text or keywords into a page only when the user agent requesting the page is a search engine, not a human visitor" - even if you don't do it with the intention of tricking search engines (the inserted text here being the text that is invisible to non-registered users).
Is there a way to show the same content to both bots and humans, while still keeping the page
- attractive enough for search engines
- teasing enough for humans to register?
It's difficult to guess the level of risk you're running - but once penalised, the traffic drop is huge and recovery normally takes a long time (with no guarantee of full recovery).
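For illustration, the pattern that definition describes would look something like this (a hypothetical Python/Flask sketch - i.e., the thing to avoid):

# Serving extra text only to search-engine user agents -- cloaking by
# the strict definition, whatever the intent. (Hypothetical sketch.)
from flask import Flask, request

app = Flask(__name__)

FULL_TEXT = "...all forum comments..."       # visible to members only
TEASER_TEXT = "...first comment + pitch..."  # what non-members see

@app.route("/thread/<int:thread_id>")
def thread(thread_id):
    ua = request.headers.get("User-Agent", "")
    if "Googlebot" in ua:
        # Text inserted only when the requesting user agent is a
        # search engine: exactly the definition quoted above.
        return FULL_TEXT
    return TEASER_TEXT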
rgds
Dirk
-
Hi Dirk,
Thanks for the response. Folks coming from Google do not see the full page that Google saw. They see a snippet of the comments and a pitch to log in or register to see the full comments (it's a forum). They don't see different content right now... they see less content, but it's the same content Google saw. Is that clearer?
Thanks... Darcy
-
Hi Darcy,
When people click on the results in Google, do they see the normal page (the one that Googlebot saw) or the version for "new" users? If it's the second case, you are indeed cloaking according to Google's definition (https://support.google.com/webmasters/answer/66355).
If you're listed in Google News, you could participate in "First Click Free" (https://support.google.com/news/publisher/answer/40543?hl=en), which basically allows you to hide your content behind a registration wall and still be indexed, as long as you allow visitors coming from Google at least 5 free pages (articles) per day.
Not all First Click Free participants play by the rules (http://searchengineland.com/google-fails-enforce-first-click-free-203078) - but I guess your site isn't the Financial Times.
You could continue what you're doing now, but in my opinion you certainly run the risk of a penalty.
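One way to sanity-check where you stand is to fetch a page twice - once as a normal browser and once with a Googlebot user-agent string - and compare what comes back. A rough sketch in Python (an approximation only: a server that verifies Googlebot by reverse DNS won't be fooled by a spoofed UA string):

# Rough self-audit: does the returned HTML change with the user agent?
# (Sketch only; treat results as indicative, not proof.)
import requests

URL = "https://www.example.com/some-thread"  # hypothetical page
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

as_browser = requests.get(URL, headers={"User-Agent": BROWSER_UA}).text
as_google = requests.get(URL, headers={"User-Agent": GOOGLEBOT_UA}).text

if as_browser == as_google:
    print("Same HTML for both user agents - no UA-based cloaking here.")
else:
    print("HTML differs by user agent ({} vs {} bytes) - worth reviewing."
          .format(len(as_browser), len(as_google)))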
rgds,
Dirk
Related Questions
-
Does link position matter in the content/HTML code?
My question is: if I have several links going to different landing pages, will the one at the top of the content pass more value than the ones at the bottom? (Assume there is no more than one instance of the same link in the content.) The ultimate question is whether link position in the content/HTML code makes a difference in how much value a link passes. This question comes in response to this Whiteboard Friday: https://www.youtube.com/watch?v=xAH762AqUTU. Rand talks about how, if there are two links going to the same URL from the same content page, Google will only count the anchor text of the first link on the page, not both - meaning Google treats the second link as if it doesn't exist. There are lots of resources showing this was true, but there isn't much content newer than 2010 saying it's still true, and we all know things have changed a lot since then. Does that make sense?
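(If you want to see which anchor text would count under that first-link rule on your own pages, here is a quick hedged sketch in Python - it simply lists duplicate links to the same URL in document order, assuming the rule still applies:)

# List duplicate links to the same URL in document order, so you can
# see which anchor text would count if only the first link matters.
import requests
from bs4 import BeautifulSoup
from collections import defaultdict

html = requests.get("https://www.example.com/page").text  # hypothetical URL
soup = BeautifulSoup(html, "html.parser")

links = defaultdict(list)
for a in soup.find_all("a", href=True):
    links[a["href"]].append(a.get_text(strip=True))

for href, anchors in links.items():
    if len(anchors) > 1:
        print("{} linked {} times; first anchor text: '{}'"
              .format(href, len(anchors), anchors[0]))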
Intermediate & Advanced SEO | 97th_Floor
-
The images on our site are not being found/indexed; it's been recommended we change how they are presented to Googlebot - could this create a cloaking issue?
Hi,
We have an issue with images on our site not being found or indexed by Google. We have an image sitemap, but the images are served on the Sitecore-powered site as CSS backgrounds within divs, which Google can't read. The developers have suggested the solution below.
For Googlebot, a plain image tag:
<img class="header-banner__image" src="/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx" />
For non-Googlebot visitors, a lazy-loading div with a noscript fallback:
<noscript class="noscript-image">
  <div role="img"
       aria-label="Arctic Safari Camp, Arctic Canada"
       title="Arctic Safari Camp, Arctic Canada"
       class="header-banner__image"
       style="background-image: url('/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx?mw=1024&hash=D65B0DE9B311166B0FB767201DAADA9A4ADA4AC4');">
  </div>
</noscript>
<div aria-label="Arctic Safari Camp, Arctic Canada"
     title="Arctic Safari Camp, Arctic Canada"
     class="header-banner__image image"
     data-src="/~/media/images/accommodation/arctic-canada/arctic-safari-camp/arctic-cafari-camp-david-briggs.ashx"
     data-max-width="1919"
     data-viewport="0.80"
     data-aspect="1.78"
     data-aspect-target="1.00">
</div>
Is this something that could be flagged as potential cloaking, though, since we are effectively serving different code depending on whether the user agent is Googlebot? The devs have said that, via their contacts, Google advised them that the original setup is the most efficient and considered approach for the end user; however, they acknowledge that Googlebot is not sophisticated enough to recognise it. Is the above solution the most suitable?
Many thanks,
Kate
Intermediate & Advanced SEO | KateWaite
-
Case Sensitive URLs, Duplicate Content & Link Rel Canonical
I have a site where URLs are case sensitive. In some cases the lowercase URL is being indexed and in others the mixed-case URL is being indexed. This is leading to duplicate content issues on the site. The site is using rel=canonical to specify a preferred URL in some cases; however, there is no consistency as to whether those URLs are lowercase or mixed case. On some pages the rel=canonical tag points to the lowercase URL, on others it points to the mixed-case URL. Ideally I'd like to update all rel=canonical tags and internal links throughout the site to use the lowercase URL; however, I'm apprehensive! My question is as follows: if I were to specify the lowercase URL across the site, in addition to updating internal links to use lowercase URLs, could this have a negative impact where the mixed-case URL is the one currently indexed? Hope this makes sense! Dave
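(One common approach, sketched here in Python/Flask on the assumption that no path segment on the site is case-significant: 301-redirect every mixed-case request to its lowercase equivalent, so the redirects, canonicals, and internal links all agree on one version.)

# 301-redirect any mixed-case path to its lowercase twin so only one
# version can be indexed. (Sketch; safe only if no path segment is
# case-significant, e.g. no base64-style IDs in URLs.)
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_lowercase_path():
    path = request.path
    if path != path.lower():
        query = request.query_string.decode()
        target = path.lower() + ("?" + query if query else "")
        return redirect(target, code=301)  # permanent redirect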
Intermediate & Advanced SEO | allianzireland
-
Doing large-scale visual link/content analysis
Hi, I currently have a list of about 5,000 URLs I want to visually check quickly to identify decent content. I'm currently opening 200 at a time with Firefox; with more than 200 it gets really choppy and slow, as you would expect. I was wondering if anyone knew any other ways of opening a large number of web pages. It would be sweet if there was a tool that could scan a list, add the web pages to a PDF/PowerPoint, and send them back to you for analysis. Kind regards, Chris
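(One way to avoid the 200-tab problem, sketched with Selenium in Python - the file names are hypothetical: screenshot each URL in the list, then flip through the images instead of browser tabs.)

# Screenshot a list of URLs one at a time instead of opening 200 tabs.
# (Sketch; assumes a urls.txt file with one URL per line.)
from selenium import webdriver

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

driver = webdriver.Firefox()
driver.set_window_size(1280, 1024)
try:
    for i, url in enumerate(urls):
        driver.get(url)
        driver.save_screenshot("shot_{:04d}.png".format(i))
finally:
    driver.quit()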
Intermediate & Advanced SEO | Mikey008
-
Does blocking foreign-country IP traffic to my site hurt my SEO / US Google rankings?
I have a website that is only of interest to US visitors. At least 99% of its AdSense income is from the US. But I'm getting constant attempts by hackers to log in to my admin account. I have countermeasures to combat that and am initiating others. But here's my question: I am considering disallowing all non-US (or at least non-North American) traffic to the site via a WordPress plugin that does this. I know it will not affect my business negatively in any direct way. However, are there any ramifications if the Google bots of these blocked countries are unable to access my site? Does it affect the rankings of my site in US Google searches? At the very least I could block China, Russia, and some Eastern European countries.
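(If you do go down this road, the key SEO safeguard is to never block a verified crawler. A hedged sketch in Python, using the geoip2 library plus the reverse-DNS check Google documents for verifying Googlebot - the GeoLite2 database file is an assumed local download:)

# Block non-North-American visitors but never a verified Googlebot.
# (Sketch; assumes a local GeoLite2-Country.mmdb database file.)
import socket
import geoip2.database
import geoip2.errors

reader = geoip2.database.Reader("GeoLite2-Country.mmdb")

def is_verified_googlebot(ip):
    # Google's documented check: reverse DNS, then forward-confirm.
    try:
        host = socket.gethostbyaddr(ip)[0]
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return socket.gethostbyname(host) == ip
    except (socket.herror, socket.gaierror):
        return False

def should_block(ip):
    if is_verified_googlebot(ip):
        return False  # always let verified crawlers through
    try:
        country = reader.country(ip).country.iso_code
    except geoip2.errors.AddressNotFoundError:
        return True  # unknown origin; block conservatively
    return country not in ("US", "CA")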
Intermediate & Advanced SEO | bizzer
-
Duplicate Content From Indexing of Non-File-Extension Page
Google has somehow indexed a page of mine without the .html extension. They indexed www.samplepage.com/page, so I am showing duplicate content because Google also sees www.samplepage.com/page.html. How can I force Google or Bing or whoever to only index and see the page including the .html extension? I know people say not to use the file extension on pages, but I want to, so please, anybody... HELP!!!
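(The usual fix, sketched here in Python/Flask since server setups differ - on Apache or nginx the same 301 rule would go in the server config instead: permanently redirect the extensionless URL to the .html version so only one can be indexed.)

# 301 the extensionless URL to its .html twin so only one version
# gets indexed. (Flask sketch; the page body is a placeholder.)
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/<path:slug>")
def page(slug):
    if not slug.endswith(".html"):
        return redirect("/{}.html".format(slug), code=301)
    return "contents of {}".format(slug)  # placeholder for real page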
Intermediate & Advanced SEO | WebbyNabler
-
News sites & Duplicate content
Hi SEOmoz, I would like to know, in your opinion and according to industry best practice, how do you get around duplicate content on a news site if all news sites buy their news from a central place? Let me give you some more insight into what I am talking about. My client has a website that is purely focused on news - local news in one of the African countries, to be specific. What we noticed over the past few months is that the site is not ranking to its full potential. We investigated: we checked our keyword research, our site structure, interlinking, site speed, code-to-HTML ratio - you name it, we checked it. What we did pick up when looking at duplicate content is that the site is flagged by Google as duplicated, BUT so are most news sites, because they all get their content from the same place. News is sold by big companies (I'm not from the US, so I can't say specifically where they're based), and these content pieces usually come with disclaimers saying you can't change the headline or story significantly. We do have quite a few journalists who rewrite the news stories; they try to keep them as close to the original as possible, but they still change them to fit our targeted audience - which brings me to my second point. Even though the content is duplicated, our site is more relevant to what our users are searching for than the bigger news websites, because we do hyper-local everything: news, jobs, property, etc. All we need to do is get past this duplicate content issue. In general we rewrite content completely to make it unique if a site has duplication problems, but on a media site I'm a little bit lost, because I haven't dealt with something like this before. Would like to hear some thoughts on this. Thanks, Chris Captivate
Intermediate & Advanced SEO | 360eight-SEO
-
How to get the 'show map of' tag/link in Google search results
I have two clients that have apparently random examples of the 'show map of' link in their Google search results. The maps/addresses are accurate and are for airports. Both clients are aggregators; they service the airports (e.g. 'LAX airport shuttle' - not an actual example), BUT they DO NOT have Google Places listings for these pages, either manually created OR auto-populated by Google, and they DO NOT have the map or address info on the pages that are returned in the search results with the map link. Does anyone know how this is the case? It's great that this happens for them, but I'd like to know how/why so I can replicate it across all their appropriate pages. My understanding was that for this to happen you HAD to have Google Places pages for the appropriate pages (which they can't do, as they are aggregators). Thanks in advance, Andy
Intermediate & Advanced SEO | AndyMacLean