Indexing content behind a login
-
Hi,
I manage a website in the pharmaceutical industry where only healthcare professionals are allowed to access the content. For this reason, most of the content is behind a login.
My challenge is that we have a massive amount of interesting and unique content available on the site and I want the healthcare professionals to find this via Google!
At the moment, if a user tries to access this content they are prompted to register / login. My question is: if I look for the Googlebot user agent and allow it to access and index the content, will this be classed as cloaking? I'm assuming that it will.
If so, how can I get around this? We have a number of open landing pages, but we're limited in what indexable content we can have on these pages!
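Just to be concrete about the user-agent idea, what I have in mind is something along these lines (only a rough sketch with made-up names, not real code from our site):

```python
def choose_response(user_agent: str, full_article: str, login_prompt: str) -> str:
    """Return the full article to Googlebot but a login prompt to everyone else.

    The response varies purely on the User-Agent header, so the crawler and a
    human visitor see different content for the same URL.
    """
    if "Googlebot" in user_agent:
        return full_article   # crawler gets the gated content and can index it
    return login_prompt       # humans are asked to register / log in

# Same URL, different responses depending on who asks:
human_view = choose_response("Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
                             "Full article...", "Please register to continue")
crawler_view = choose_response("Mozilla/5.0 (compatible; Googlebot/2.1)",
                               "Full article...", "Please register to continue")
```

That mismatch between what the crawler sees and what a visitor sees is exactly what makes me think it would be treated as cloaking.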
I look forward to all of your suggestions as I'm struggling for ideas now!
Thanks
Steve
-
Thanks everyone... It's not as restrictive as patient records... Basically, because of the way our health service works in the UK, we are not allowed to promote material around our medicines to patients; it must be restricted to HCPs (healthcare professionals) only. If we are seen to be actively promoting to patients we run the risk of a heavy fine.
For this reason we need to take steps to ensure that we only target this information towards HCPs, and therefore we require them to register before being able to access the content...
My issue is that HCPs may search for a brand that we supply, but we have to be very careful what brand information we provide outside of the login. Therefore the content we can include on landing pages cannot really be optimised for the keywords they are searching for! That's why I want the content behind the login indexed but not easily available without registering...
It's a very difficult place to be!
-
I guess I was just hoping for that magic answer that doesn't exist! It's VERY challenging to optimise a site with these kinds of restrictions, but I guess I just need to put what I can on the landing pages and optimise as best I can with the content I can show!
We also have other websites aimed at patients where all the content is open, so I guess I'll just have to enjoy optimising those instead.
Thanks for all your input!
Steve
-
Steve,
Yes that would be cloaking. I wouldn't do that.
As Pete mentioned below, your only real options at this point are to make some of the content, or new content, available for public use. If you can't publish at least abstracts, then you'll have to invest in copywriting content that can legally be made public, get traffic that way, and do your best to convert those visitors into subscribers.
-
Hi Steve
If the content can legally be viewed only by health practitioners who are members of your site, then it seems to me you don't have an option: putting any of it into the public domain via Google, by whatever method, will be deemed illegal by whichever body oversees it.
Presumably you also cannot publish short 250-word summaries of the content?
If not, then I think you need to create pages that are directly targeted at marketing the site to health practitioners. Whilst the pages won't be able to contain the content you want to have Google index, they could still contain general information and the benefits of becoming a subscriber.
Isn't that the goal of the site anyway, i.e. to be a resource for health practitioners? So, without being able to make the content public, you have to market to them through your SEO, or use some other form of indirect or direct marketing, to encourage them to come to the site and sign up.
I hope that helps,
Peter -
Thanks all... Unfortunately it is a legal requirement that the content is not made publicly available, but the challenge then is: how do people find it online?
I've looked at First Click Free and pretty much every other option I could think of, and I have yet to find a solution.
My only option is to allow Googlebot through the authentication, which would let it index the content, but my concern is that this is almost certainly cloaking...
-
Please try looking at "First Click Free" by Google
https://support.google.com/webmasters/answer/74536?hl=en
I think this is along the lines of what you are looking for.
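In rough terms, First Click Free means that a visitor arriving from a Google search result gets the full article on that first click, and only follow-up clicks hit the login wall. Here is a minimal sketch of the gating logic (the function and parameter names are my own invention, and Google's actual policy has more conditions than this):

```python
def show_full_article(referrer: str, articles_viewed_today: int,
                      daily_free_limit: int = 1) -> bool:
    """Decide whether to show the full article or the registration wall.

    First Click Free sketch: a click through from a Google search result is
    honoured with the full article, up to a small daily quota per visitor.
    """
    came_from_google_search = referrer.startswith("https://www.google.")
    return came_from_google_search and articles_viewed_today < daily_free_limit
```

The key difference from user-agent sniffing is that everyone arriving from a search result, crawler and human alike, gets the same full page for that first click, so it isn't cloaking.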
-
Hi Steve
As you already know, if a page is not crawlable it's not indexable. I don't think there is any way around this without changing the strategy of the site. You said, _"We have a number of open landing pages but we're limited to what indexable content we can have on these pages"._ Is that limitation imposed by a legal requirement or something like that, or by the site owners because they don't want to give free access?
If the marketing strategy for the site is to grow the membership, then, as the site provides a content service to its members, it has to give potential customers a sample of its wares.
I think there are two possible solutions.
(1) Increase the amount of free content available on the site, to give the search engines more content to crawl and make available to people searching, or
(2) Provide a decent-sized excerpt, say the first 250 words of each article, as a taster for potential customers, and put the site login at the point of the "read more". That way you give the search engines something of a decent length to get their teeth into, but it's also a decent-sized teaser to give potential customers an appetite to subscribe.
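To illustrate (2), the split could be as simple as this sketch (the function name and the 250-word figure are just placeholders to adapt):

```python
def split_for_teaser(article_text: str, teaser_words: int = 250):
    """Split an article into a public, indexable teaser and a gated remainder."""
    words = article_text.split()
    teaser = " ".join(words[:teaser_words])
    gated = " ".join(words[teaser_words:])
    return teaser, gated

# The teaser is rendered in the public page HTML; the remainder is shown
# only after the "read more" login.
teaser, gated = split_for_teaser("word " * 1000)
```

Because the teaser is real article text served identically to crawlers and visitors, it's indexable without any cloaking risk.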
I hope that helps,
Peter