Indexing content behind a login
-
Hi,
I manage a website within the pharmaceutical industry where only healthcare professionals are allowed to access the content. For this reason most of the content is behind a login.
My challenge is that we have a massive amount of interesting and unique content available on the site and I want the healthcare professionals to find this via Google!
At the moment, if a user tries to access this content they are prompted to register/login. My question: if I look for the Googlebot user agent and allow it to access and index the content, will this be classed as cloaking? I'm assuming it will.
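For clarity, the check being described would look something like this. This is a minimal sketch with hypothetical names, not working advice: serving different content to Googlebot than to human visitors at the same URL is exactly what Google defines as cloaking.

```python
def serve_article(headers, full_text):
    """Sketch of user-agent gating: the crawler gets the full article,
    humans get bounced to registration. Shown only to illustrate why
    this counts as cloaking -- same URL, different content per visitor."""
    ua = headers.get("User-Agent", "")
    if "Googlebot" in ua:
        return full_text              # crawler sees everything
    return "302 -> /register"         # human visitors hit the login wall

# Same request, different responses depending on who asks:
bot_view = serve_article({"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"}, "Full article text")
human_view = serve_article({"User-Agent": "Mozilla/5.0 (Windows NT 10.0)"}, "Full article text")
```

Note also that the user-agent string is trivially spoofed, so this wouldn't even keep the content away from patients reliably.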
If so, how can I get around this? We have a number of open landing pages, but we're limited in what indexable content we can put on them!
I look forward to all of your suggestions as I'm struggling for ideas now!
Thanks
Steve
-
Thanks everyone... It's not as restrictive as patient records. Basically, because of the way our health service works in the UK, we are not allowed to promote material about our medicines to patients; it must be restricted to HCPs only. If we are seen to be actively promoting to patients, we run the risk of a heavy fine.
For this reason we need to take steps to ensure that we only target this information towards HCP's and therefore we require them to register before being able to access the content...
My issue is that HCPs may search for a brand that we supply, but we have to be very careful about what brand information we provide outside of the login. The content we can include on landing pages can't really be optimised for the keywords they are searching for, which is why I want the content behind the login indexed but not easily available without registering...
It's a very difficult place to be!
-
I guess I was just hoping for a magic answer that doesn't exist! It's VERY challenging to optimise a site with these kinds of restrictions, but I guess I just need to put what I can on the landing pages and optimise as best I can with the content I can show!
We also have other websites aimed at patients where all the content is open, so I guess I'll just have to enjoy optimising those instead.
Thanks for all your input!
Steve
-
Steve,
Yes that would be cloaking. I wouldn't do that.
As Pete mentioned below, your only real options at this point are to make some of the content, or new content, available for public use. If you can't publish at least abstracts, then you'll have to invest in copywriting content that can legally be shown to the public, get traffic that way, and do your best to convert those visitors into subscribers.
-
Hi Steve
If it can only legally be viewed by health practitioners who are members of your site, then it seems to me you don't have an option: putting any of this content into the public domain via Google, by whatever method, will be deemed illegal by whichever body oversees it.
Presumably you also cannot publish short 250-word summaries of the content?
If not, then I think you need to create pages that are directly targeted at marketing the site to health practitioners. Whilst the pages won't be able to contain the content you want to have Google index, they could still contain general information and the benefits of becoming a subscriber.
Isn't that the goal of the site anyway, i.e. to be a resource for health practitioners? So, without being able to make the content public, you have to market to them through your SEO, or use some other form of indirect or direct marketing, to encourage them to come to the site and sign up.
I hope that helps,
Peter
-
Thanks all... Unfortunately it is a legal requirement that the content is not made publicly available, but the challenge then is how people find it online!
I've looked at First Click Free and pretty much every other option I could think of, and have yet to find a solution.
My only option is to allow Googlebot through the authentication, which would let it index the content, but my concern is that this is almost certainly cloaking...
-
Please try looking at "First Click Free" by Google
https://support.google.com/webmasters/answer/74536?hl=en
I think this is along the lines of what you are looking for.
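Roughly, the First Click Free policy linked above works like this (a simplified sketch with a hypothetical helper; a real implementation would also have to verify genuine search referrals and cap free clicks per visitor per day, as the policy describes):

```python
def respond(headers, has_session):
    """Sketch of the First Click Free idea: a visitor clicking through
    from a Google search result gets the full article on that first
    click; anyone else without a subscription hits the registration wall."""
    if has_session:
        return "full-article"            # logged-in subscriber
    referrer = headers.get("Referer", "")
    if "google." in referrer:
        return "full-article"            # first click from search is free
    return "registration-wall"           # direct visit, no session
```

Whether showing even that first click is compatible with the legal restriction on patient access is a separate question, of course.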
-
Hi Steve
As you already know, if a page is not crawlable, it's not indexable. I don't think there is any way around this without changing the strategy of the site. You said, "We have a number of open landing pages but we're limited to what indexable content we can have on these pages." Is that limitation imposed by a legal requirement or something like that, or by the site owners because they don't want to give free access?
If the marketing strategy for the site is to grow the membership, then as a content service it has to give potential customers a sample of its wares.
I think there are two possible solutions.
(1) Increase the amount of free content available on the site, to give the search engines more content to crawl and make available to searchers; or
(2) Provide a decent-sized excerpt, say the first 250 words of each article, as a taster for potential customers, and put the site login at the "read more" point. That way you give the search engines something of decent length to get their teeth into, and it's also a decent-sized teaser to give potential customers an appetite to subscribe.
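Option (2) can be sketched in a few lines. This is a minimal illustration with a hypothetical login URL, not a drop-in implementation:

```python
def make_teaser(article_text, max_words=250):
    """Build the public teaser described in option (2): the first 250
    words of the article, followed by a login-gated "read more" link."""
    words = article_text.split()
    if len(words) <= max_words:
        return article_text           # short pieces need no gate
    teaser = " ".join(words[:max_words])
    return teaser + ' ... <a href="/login?next=article">Read more</a>'
```

The teaser version is what the open landing page serves to everyone, crawlers included, so there is no cloaking: Google and visitors see the same thing.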
I hope that helps,
Peter