Can Google read content/see links on subscription sites?
-
If an article is published on The Times (for example), can Google bypass the subscription sign-in to read the content and index the links in the article?
Example: http://www.thetimes.co.uk/tto/life/property/overseas/article4245346.ece
In the above article there is a link to the resort's website, but you can't see it unless you subscribe. I checked the source code of the page while the subscription prompt was showing, and the link isn't there.
Do these sites treat search engines differently from other user agents so that the content can still be crawled and indexed?
-
Hey Matt,
The best way to tell which approach a news organization or site is using is to turn off JavaScript or view the Google cache to see how Google "sees" the page.
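If you'd rather check it from a script than in the browser, here is a minimal sketch that pulls both the live page and Google's cached copy for comparison. This assumes Python 3 with only the standard library, and that Google's cache is reachable at the usual webcache.googleusercontent.com URL pattern; Google may block or captcha automated cache requests, so treat it purely as illustrative:

```python
# Illustrative sketch: compare the live page with Google's cached copy to get
# a rough idea of what Googlebot was served. The cache URL pattern and the
# "needle" string below are assumptions; Google may refuse scripted requests.
import urllib.parse
import urllib.request

ARTICLE_URL = "http://www.thetimes.co.uk/tto/life/property/overseas/article4245346.ece"

def fetch(url: str) -> str:
    """Fetch a URL with a browser-like User-Agent and return the HTML."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read().decode("utf-8", errors="replace")

cache_url = ("https://webcache.googleusercontent.com/search?q=cache:"
             + urllib.parse.quote(ARTICLE_URL, safe=""))

live_html = fetch(ARTICLE_URL)
cached_html = fetch(cache_url)

# Crude check: is the hidden text (e.g. the resort link) present in what
# Google cached but missing from the page a non-subscriber sees?
needle = "resort"  # replace with the anchor text or href you are looking for
print("in live page:   ", needle in live_html.lower())
print("in Google cache:", needle in cached_html.lower())
```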
This page is using the second option from the article I mentioned: snippets. Here is what that article has to say about it:
"If you prefer this option, please display a snippet of your article that is at least 80 words long and includes either an excerpt or a summary of the specific article." -
Thanks Dan. It doesn't look like the example article is using First Click Free, so I guess the answer is no: Google can't read the hidden content in this example?
-
Great question! Yes, Google has had a way of dealing with this since 2007. Publishers have three options: First Click Free, a subscription designation, or disallowing the content from being crawled. Here is Google's official support article on it:
https://support.google.com/news/publisher/answer/40543?hl=en
Here is a quote from the help article:
"To summarize, we will crawl and index your site to the extent that you allow Googlebot to access it. In order to provide the best possible user experience and help more users discover your content, we encourage you to try First Click Free. If you prefer to limit access to your site to subscribers only, we will respect your decision and show a “subscription” label next to your links on Google News."Here is what Matt Cutts said about it in an interview with Search Engine Land:
"First Click Free originated with Google News, but you can use the same way of handling content in web search (show the same page to users and Googlebot, then if the user clicks to read a different article, then you can show them the registration or pay page). Because the same page is presented to users and to Googlebot, it’s not cloaking. So First Click Free is a great way if you have premium content to surface it in Google’s web index without cloaking. Hope that makes sense."It is possible to allow the Googlebot to access the content and simultaneously NOT provide it for free to non-subscribers. The above help article above should answer all of your questions. Hope this helps!
-
I would say no. The article content beyond what you can already see is not in the source code. They could be serving something different to Googlebot, but if they did, that would be cloaking, which is against Google's guidelines: https://support.google.com/webmasters/answer/66355?hl=en
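If you want to sanity-check that yourself, one rough approach is to request the page twice, once with an ordinary browser User-Agent and once claiming to be Googlebot, and compare what comes back. A minimal sketch, again assuming Python 3 and nothing beyond the standard library; note that a site which verifies Googlebot by IP (as Google recommends) will serve both requests the same page, so this is only a weak signal:

```python
# Illustrative check for user-agent-based cloaking: fetch the same URL with a
# browser UA and with Googlebot's UA and compare the responses. This cannot
# prove anything on its own, since legitimate setups verify the crawler's IP.
import urllib.request

URL = "http://www.thetimes.co.uk/tto/life/property/overseas/article4245346.ece"

BROWSER_UA = "Mozilla/5.0"
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_with_ua(url: str, user_agent: str) -> str:
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read().decode("utf-8", errors="replace")

as_browser = fetch_with_ua(URL, BROWSER_UA)
as_googlebot = fetch_with_ua(URL, GOOGLEBOT_UA)

print("length as browser:  ", len(as_browser))
print("length as Googlebot:", len(as_googlebot))
print("identical responses:", as_browser == as_googlebot)
```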
Related Questions
-
My site shows a 503 error to Googlebot, but I can see the site fine. Not indexing in Google. Help
Hi, this site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk. Looking into it, it seems to be giving a 503 error to Googlebot. I can see the site fine, I have checked the source code, and I have checked robots.txt. It did have a sitemap param, but I removed it for testing. GWMT shows 'unreachable' if I submit a sitemap or try a fetch. Any ideas on how to remove this error? Many thanks in advance.
Intermediate & Advanced SEO | | SolveWebMedia0 -
Content Aggregation Site: How much content per aggregated piece is too much?
Let's say I set up a section of my website that aggregates content from major news outlets and bloggers around a certain topic. For each aggregated piece, is there a word-count range that would be considered bad, fair, or good? I've been mulling over both the SEO (duplicate content) issues and the copyright issues, trying to work out best practice. Any ideas about what is considered best practice in this situation? Also, are there any other issues to consider that I didn't mention?
Intermediate & Advanced SEO | | kdaniels0 -
Site Redesign - Inbound Links
Hello all. What are some best practices or good resources on redesigning a site while maintaining inbound links? We would hate for the natural, organic links we have earned over the past 3 years to suddenly become broken. The domain is not changing, but the URL structure very well may. For example, www.domain dot com/blog/postabouttopic, which has many inbound links, may move to www.domain dot com/news/blog/postabouttopic. Is it simply a matter of using 301 redirects from the old pages to the new pages? Are there any issues to be aware of when having hundreds of 301 redirects? Is there a best practice, or a good site that explains this in detail? Thank you for your time! Have a great day!
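For what it's worth, the kind of mapping being asked about can be as simple as an explicit old-path-to-new-path table that issues 301s, whether in the web server config or in application code. A minimal illustrative sketch, assuming a Flask app and hypothetical paths rather than the poster's actual setup:

```python
# Hypothetical sketch: permanent (301) redirects from the old /blog/ URLs to
# the new /news/blog/ structure so existing inbound links keep resolving.
from flask import Flask, redirect

app = Flask(__name__)

# Explicit old-path -> new-path map; in practice this might be generated from
# the CMS or expressed as a rewrite rule at the web-server level instead.
REDIRECTS = {
    "/blog/postabouttopic": "/news/blog/postabouttopic",
}

@app.route("/blog/<path:slug>")
def old_blog(slug: str):
    old_path = f"/blog/{slug}"
    # Fall back to a catch-all rewrite when the structure maps one-to-one.
    new_path = REDIRECTS.get(old_path, f"/news/blog/{slug}")
    return redirect(new_path, code=301)  # 301 = moved permanently
```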
Intermediate & Advanced SEO | | S2RSolutions0 -
Main content - javascript/ajax
Hi, on most of our pages JavaScript is what displays our main content, so it doesn't show up in the page source and I assume isn't being crawled by Google to the best of its ability. It's also not showing up in Moz's page grader and crawl results, making analysis and testing harder. What's the easiest way, without having to completely redo our website, to have this content crawled by search engines and Moz?
Intermediate & Advanced SEO | | S.S.N0 -
Site wide links Concept
Hi all, are all types of site-wide links bad in Google's eyes, or does it depend on other factors as well? For example, GoDaddy and other service providers place their links in the footers of other websites; in that situation, will Google harm their rankings or not? Please also elaborate on the best practices for site-wide links.
Intermediate & Advanced SEO | | RuchiPardal0 -
Link juice site structure?
If we have a top nav with Contact Us, About Us, Delivery, FAQ, Gallery, How to Order etc., none of which we want to rank, and then we have the usual left-hand nav: are we wasting link juice with the top nav? Would we be better off either removing it and putting those links further down the page, or consolidating them and adding an extra Products tab so that the product pages come first?
Intermediate & Advanced SEO | | BobAnderson0 -
Should I link my similar sites together?
Hi, I currently have two sites within exactly the same market, and I've just purchased a third website from someone. Should I link these sites together? (i.e. in the page header, should I cross-link them or point two of them to the third?) If I do this, will it harm them if they are on the same C-class IP blocks? Is using private domains and different hosting companies considered dodgy in any way? Basically I'm a big wimp and don't want to do anything that might potentially hurt my rankings ;)
Intermediate & Advanced SEO | | Blendfish0 -
Linking to bad sites
Hi, I just have a quick question. Is it very negative to link to "bad" sites, such as online pharmacies, dating, adult sites, that sort of stuff? How much does linking to a "bad" site negatively affect a "good" site? Thank you.
Intermediate & Advanced SEO | | salvyy0