Logged-In-Only Content Made Available to Googlebot
-
Hi guys,
On this page, http://www.jobiness.sg/changi-airport-group/work-reviews/id-18180200170/?page=2, I require my users to sign up to be able to view the content.
I would like to make this available to search engine crawlers.
Also, are there any general guidelines for this type of optimization? Is it considered acceptable under Google's guidelines?
From my research, there seem to be three ways to go about this:
- Creating an account for the bots so that they are treated as 'logged-in users'
- Adding checks on the server to detect Googlebot's HTTP user agent (see the sketch below)
- Google First Click Free (I haven't done much research into this yet)
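For the user-agent option, here's roughly what I have in mind. This is only a minimal sketch assuming a Python backend, and the function name is made up; Google's documented advice is to confirm the crawler with a reverse/forward DNS lookup rather than trusting the user-agent string alone:

```python
import socket

def is_verified_googlebot(user_agent: str, client_ip: str) -> bool:
    """Return True only if the request claims to be Googlebot AND its IP
    reverse-resolves to a googlebot.com / google.com host that also
    forward-resolves back to the same IP."""
    if "Googlebot" not in user_agent:
        return False
    try:
        host, _, _ = socket.gethostbyaddr(client_ip)          # reverse DNS lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return client_ip in socket.gethostbyname_ex(host)[2]  # forward-confirm
    except OSError:
        return False
```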
-
If the content gets indexed, then it's no longer protected from others; they could simply find the rest of it by searching Google.
There's no point in blocking users while allowing Google to index the content. Instead, you could build pages with excerpts of the content, so Google AND users can see them, and then users can decide whether to proceed as a logged-in user or not. There's a rough sketch of that approach below.
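To illustrate the excerpt idea, here's a minimal sketch assuming a Python backend; the review structure and names are all hypothetical:

```python
def render_review(review: dict, user_is_logged_in: bool) -> dict:
    """Serve the same URL to everyone: members get the full text, while
    everyone else (including crawlers) gets a teaser plus a sign-up prompt."""
    if user_is_logged_in:
        return {"title": review["title"], "body": review["body"]}
    # First ~300 characters, trimmed back to the last whole word
    teaser = review["body"][:300].rsplit(" ", 1)[0] + "..."
    return {
        "title": review["title"],
        "body": teaser,
        "cta": "Sign up to read the full review",
    }
```

That way the indexed page and the page a first-time visitor sees are the same, which avoids the cloaking problem of showing Googlebot something users can't see.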
Google does offer site authentication, but only for AdSense purposes: you can give its crawler the login (POST) details it needs to access that content, but that won't change how the pages are indexed for search.
Hope that helps!