Getting content on internal (registration-only) pages indexed by Google
-
Hello,
we have a large amount of content on internal pages that can only be accessed as a registered member.
What are the different options to get this content indexed by Google?
In certain cases we might be able to show a preview to visitors. In other cases this is not possible for legal reasons.
Somebody told me that there is an option to send the content of pages directly to Google for indexing. Unfortunately he couldn't give me more details. I only know that this is possible for URLs (via a sitemap). Is there really a way to do this for the entire content of a page without giving Google access to crawl that page?
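(For context: a standard XML sitemap, as defined by the sitemaps.org protocol, only lists URLs and metadata such as last-modified dates; it never carries the page content itself. The URL below is a placeholder. A minimal example:)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each entry points Google at a URL; the content at that URL
       still has to be crawlable for it to be indexed. -->
  <url>
    <loc>https://www.example.com/members/article-1</loc>
    <lastmod>2011-09-01</lastmod>
  </url>
</urlset>
```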
Thanks
Ben
-
The issue is that Google won't, and shouldn't, index pages that are restricted.
This is best for user experience: most people won't sign in just to view content.
You basically have to create two sections. One that is visible to everyone, including Google, where you show a preview; the other that is protected.
-
Thanks, I will check whether this meets the legal requirements (see my reply to Brent's answer).
-
As I mentioned, we have two cases.
In the first case, we can show a preview.
In the second case, we can only show the content to a certain audience (a legal question), so the free registration is a legal requirement. People will still be looking for this content via Google, and since it is useful for a fairly large audience, why wouldn't we want Google to index the pages? Of course, without Google knowing that there is relevant content on those pages, it will neither index nor properly rank them.
-
I found this information for you, but you should definitely check that it doesn't violate any of Google's guidelines before adding it to your website.
This is a simple snippet that lets bots bypass the password check on password-protected pages:
$allow_inside = $is_logged_in || substr_count($_SERVER['HTTP_USER_AGENT'], 'Googlebot') > 0;
http://davidwalsh.name/google-password-protected-areas
The reference post is older, so this code may have been updated since.
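A caveat on the snippet above: user-agent strings are trivial to spoof, so anyone claiming to be Googlebot could read the protected content. Google's documented way to verify a real Googlebot is a reverse DNS lookup on the visitor's IP, a check that the hostname ends in Google's crawler domains, then a forward lookup to confirm it maps back to the same IP. A minimal sketch in Python (the helper names are my own, and the exact suffix list should be checked against Google's current documentation):

```python
import socket

# Hostname suffixes Google documents for its crawlers.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")


def is_google_hostname(hostname: str) -> bool:
    """Check whether a hostname falls under Google's crawler domains."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)


def verify_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain, then forward-resolve
    the hostname and confirm it maps back to the same IP."""
    try:
        hostname, _aliases, _ips = socket.gethostbyaddr(ip)  # reverse DNS
        if not is_google_hostname(hostname):
            return False
        forward_ips = socket.gethostbyname_ex(hostname)[2]   # forward DNS
        return ip in forward_ips
    except (socket.herror, socket.gaierror):
        return False
```

This is slower than string matching (two DNS lookups per request), so in practice you would cache the result per IP.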
-
Fetch as Googlebot has a "Submit to index" option, which is why I believe it could work for this.
-
I guess my question is: why would you want Google to index something that is only available to registered users?
In order for content to be indexed, it has to be open to everyone.
You will have to figure out what can be shown as a preview and what can't. If you want something to be indexed, you will have to create a separate section for your preview content (since Google won't index your protected content).
-
Hi Istvan,
"The Fetch as Googlebot tool lets you see a page as Googlebot sees it."
Since Googlebot has no access to the site (login required), it will probably not display anything (I just tried it while logged in, and it would not display any of the content). How could this tool theoretically help us index the content of the internal pages?
Ben
-
Hi Ben,
Maybe Fetch as Googlebot can be a solution to your issue, but I'm not 100% sure of this.
Gr.,
Istvan