Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12 December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Membership/subscriber (/customer) only content and SEO best practice
-
Hello Mozzers, I was wondering whether there's any best-practice guidance out there on how to deal with membership/subscriber (existing-customer) only content on a website, from an SEO perspective.
A few SEOs have told me to make some of the content visible to Google for SEO purposes, yet I'm really not sure whether this is acceptable or manipulative, and I don't want to upset Google (or users, for that matter!)
Thanks in advance, Luke
-
I'd say it's mostly transferable, as plenty of content is found in both News and the main index. News is more of a service overlay that attempts to better handle user expectations around the frequency and speed of response for news items. Still, old news gets into the main index and is treated like content from almost any site, so if you have a subscription-based model that aligns with what they're recommending for more news-oriented sites, you're at least fitting a form of what they outline.
-
Everything I could find was related to Google News, not the main index. Is it directly transferable? Especially given it's the *oldest* content that's going to end up being paid-for in my example.
-
As an example, the New York Times does this by tracking how many full articles a user reads while allowing Googlebot full access to its articles. Sites that use this method apply "noarchive" for Google so the articles can't be read from the cache, and then use various forms of tracking to ensure users are being counted correctly. Here are some thoughts on this, and more from Google's side, that might help you out: https://support.google.com/news/publisher/answer/40543. Cheers!
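As a rough sketch of how a metered page like this can be marked up, combining the noarchive directive mentioned above with Google's structured data for paywalled content (the headline and the `.paywall` CSS class below are placeholders):

```html
<!-- Ask Google not to show a cached copy of the article -->
<meta name="robots" content="noarchive">

<!-- Declare the gated section so Googlebot-visible, user-gated text
     isn't treated as cloaking; ".paywall" is whatever CSS class
     wraps the subscriber-only portion of the article -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example article headline",
  "isAccessibleForFree": false,
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".paywall"
  }
}
</script>
```

The structured-data part postdates some of this discussion, but it is Google's documented way of distinguishing a legitimate paywall from cloaking.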
-
Don't want to hijack this thread at all, but I was looking for something very similar and wonder if we're thinking of the same thing?
A blog wants to make its older content available only to premium members, but still keep a snippet of that content (perhaps the first few paragraphs; the posts are quite long) visible to search engines, thus allowing traffic to arrive on the site from the content without necessarily being able to view it.
I saw that as being against the spirit of what Google wants to do, but was hoping for a little clarity on that. I wonder if the OP was thinking of something similar?
-
As Leonie states, the search engines are for public-facing content. If your site is completely private then you'd be more interested in making sure it's not found by anyone other than members; however, it sounds like you have some aspects of the site that could be public or created to attract new members. Typically in these cases you pull small topical samples from the site that show how it benefits members and help articulate why membership is valuable. It may be a matter of having what is practically two sites: the public-facing, membership-recruitment site, and the private, non-indexed membership site. Cheers!
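A minimal sketch of that two-site split, assuming the members-only area lives under a hypothetical /members/ path:

```
# robots.txt - discourage crawling of the members-only area
User-agent: *
Disallow: /members/
```

Note that robots.txt only discourages crawling; URLs discovered elsewhere can still end up indexed, so member pages should also carry `<meta name="robots" content="noindex">` or, better, sit fully behind the login, while the public recruitment pages stay open to crawlers.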
-
Hi, if your whole website is for members and behind a login and password, search engines can't index it, and so it won't be visible to anyone other than your members.
If you want other people to find your website, you'll need a public part, which you can optimize for your users and for search engines.
The question is: do you want people other than your members to find the website? If yes, you'll need content that search engines can crawl. If no, you can keep the whole website behind a login and password.
I manage a website where one part is members-only. That part is not optimized and sits behind a login and password. The rest of the site is public and needs to be found in the search engines, so it is optimized for on-page and off-page SEO.
Greetings, Leonie
Related Questions
-
Is it best practice to have a canonical tag on all pages?
The website I'm working on has no canonical tags. There is duplicate content, so rel=canonical needs adding to certain pages, but is it best practice to have a tag on every page?
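As a sketch of what that looks like (URLs are placeholders): a duplicate page points its canonical at the preferred version, and the preferred version can carry a harmless self-referencing tag:

```html
<!-- On the preferred page: a self-referencing canonical -->
<link rel="canonical" href="https://www.example.com/widgets/">

<!-- On any duplicate (e.g. the same content under a parameterized URL),
     the canonical points at the preferred page, not at itself -->
<link rel="canonical" href="https://www.example.com/widgets/">
```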
Intermediate & Advanced SEO | ColesNathan
-
Submitting Same Press Release Content to Multiple PR Sites - Good or Bad Practice?
I see some PR (press release) sites that distribute the same content to many different sites, giving the source link at the end. Is that good SEO practice or bad? If it's good practice, how do Google Panda or other algorithms treat it?
Intermediate & Advanced SEO | KaranX
-
Would changing the file name of an image (not the alt attribute) have an effect on the SEO/ranking of that image, and thus the site?
Would changing the file name of an image, but not the alt attribute nor the image itself (so it would be exactly the same, just with a new name), have any effect on: a) the site's SEO ranking, or b) the individual image's SEO ranking? (Although I guess if b) were true it would have an effect on a), even if a small one.) This is the sort of change I would be thinking of making: [example image filenames removed].
Intermediate & Advanced SEO | Sam-P
-
Best practice for H1 on site without H1 - Alternative methods?
I have recently set up a men's style blog - the site is made up of articles pulled in from a CMS and I want to keep the design as clean as possible, so no text other than the articles. This makes it hard to get an H1 tag into the page - are there any solutions or alternatives that would be good for SEO? The site is http://www.iamtheconnoisseur.com/. Thanks
Intermediate & Advanced SEO | SWD.Advertising
-
International SEO - cannibalisation and duplicate content
Hello all, I look after (in house) three domains for one niche travel business across three TLDs: .com, .com.au, and .co.uk, plus a fourth domain on a .co.nz TLD which was recently removed from Google's index.

Symptoms: For the past 12 months we have been experiencing cannibalisation in the SERPs (namely the .com.au site being rendered in .com results) and Panda-related ranking devaluations between our .com and .com.au sites. Around 12 months ago the .com TLD was hit hard (an 80% drop in target KWs) by Panda (probably), and we began to action the changes below. Around 6 weeks ago our .com TLD saw big overnight increases in rankings (to date a 70% average increase). However, by almost the same percentage, we suffered significant drops in our .com.au rankings. Basically, Google seemed to switch its attention from the .com TLD to the .com.au TLD. Note: each TLD is over 6 years old, we've never proactively gone after links (Penguin), and we have always aimed for quality in an often spammy industry.

**Have done:**
- Added hreflang markup to all pages on all domains
- Each TLD uses local vernacular, e.g. the .com site is American
- Each TLD has pricing in the regional currency
- Each TLD has details of the respective local offices, and the copy references the location; we have significant press coverage in each country, like The Guardian for our .co.uk site and the Sydney Morning Herald for our Australian site
- Targeted each site to its respective market in WMT
- Each TLD's core pages (within 3 clicks of the primary nav) are 100% unique
- We're continuing to rewrite and publish unique content to each TLD on a weekly basis
- As the .co.nz site drove so little traffic, instead of rewriting we added noindex, and that TLD has almost completely disappeared (16% of pages remain) from the SERPs
- XML sitemaps
- A Google+ profile for each TLD

**Have not done:**
- Hosted each TLD on a local server
- Around 600 pages per TLD are duplicated across all TLDs (roughly 50% of all content); these are way down the IA but still duplicated
- Images/video sourced from local servers
- Added address and contact details using schema markup

Any help, advice, or just validation on this subject would be appreciated! Kian
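For reference, hreflang markup of the kind mentioned in the "Have done" list typically looks like the sketch below (domains and paths are illustrative); every regional version must carry the same reciprocal set of tags, including one pointing at itself:

```html
<!-- In the <head> of each regional version of a given page -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/tours/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/tours/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/tours/" />
<!-- Fallback for users matching none of the above -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/tours/" />
```

If the annotations are not reciprocal across all three TLDs, Google may ignore them, which can leave the wrong regional version ranking.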
Intermediate & Advanced SEO | team_tic
-
How do 302 redirects from Akamai content targeting impact SEO?
How do 302 redirects from Akamai content targeting impact SEO? I'm using Akamai content targeting to get people from different countries and languages to the right place (e.g. www.abc.123 redirecting to www.abc.123/NL-nl/default.aspx, where folks from the Netherlands get their localized site in Dutch), served from the edge server closest to them. As far as I know, Akamai doesn't allow me to use anything but a 302. Has anyone run across this? Is this 302 a problem? I did a fetch-as-Googlebot on my main domain and all I see is the Akamai 302. I can't imagine this is the first time Akamai has run across this, but I would like to know for sure.
Intermediate & Advanced SEO | Positec
-
Duplicate Content on WordPress Because of Pagination
On my recent crawl, there were a great many duplicate-content warnings. The site is http://dailyfantasybaseball.org. The issue is: there's only one post per page. Therefore, because of WordPress's (or Genesis's) pagination, a page gets created for every post, leaving basically every piece of content I write flagged as a duplicate. I feel like the engines should be smart enough to figure out what's going on, but if not, I will get hammered. What should I do moving forward? Thanks!
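Two common remedies from that era, sketched with illustrative page URLs: rel prev/next annotations to declare the paged archives a series, or a noindex on the thin paginated pages:

```html
<!-- On an intermediate archive page (e.g. /page/3/):
     declare it part of a paginated series -->
<link rel="prev" href="https://dailyfantasybaseball.org/page/2/" />
<link rel="next" href="https://dailyfantasybaseball.org/page/4/" />

<!-- Alternatively, keep thin paginated archives out of the index
     while still letting crawlers follow their links -->
<meta name="robots" content="noindex, follow">
```

Google has since said it no longer uses rel prev/next as an indexing signal, so the noindex-follow approach on archive pages is the more durable of the two.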
Intermediate & Advanced SEO | Byron_W
-
/%category%/%postname%/ Permalink structure
Almost everyone seems to agree that /%category%/%postname%/ is the best blog permalink structure. I'm thinking of changing my structure to that, because right now it's structured by date, which is bad. But almost all of my posts are assigned to more than one category. Won't this create duplicate pages?
Intermediate & Advanced SEO | UnderRugSwept