Googlebot on paywall made with cookies and local storage
-
My question is about paywalls made with cookies and local storage. We are changing a website with free content to an open paywall with a five-article weekly view limit.
The paywall works with cookies and local storage. Article views are stored in local storage, but cookies must be enabled to read the free articles. If cookies are disabled, we serve an error page (otherwise the paywall would be easy to bypass).
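For illustration, the metering described above boils down to a weekly counter. Here is a minimal sketch (in Python for readability; the real implementation would run client-side against local storage, and the `store` dict and function name are hypothetical stand-ins):

```python
import time

WEEKLY_LIMIT = 5
WEEK_SECONDS = 7 * 24 * 60 * 60

def can_read(store: dict, now: float) -> bool:
    """Return True if another free article view is allowed this week.

    `store` stands in for the browser's local storage; it keeps the
    timestamp when the current weekly window opened and the number of
    articles viewed inside that window.
    """
    window_start = store.get("window_start", 0)
    if now - window_start >= WEEK_SECONDS:
        # Weekly window expired: start a new window and reset the counter.
        store["window_start"] = now
        store["views"] = 0
    if store.get("views", 0) >= WEEKLY_LIMIT:
        return False  # Limit reached: show the paywall.
    store["views"] = store.get("views", 0) + 1
    return True
```

Because the counter lives entirely in the browser, clearing local storage resets it, which is why the cookie check (and the error page when cookies are disabled) is needed as a backstop.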
How does this affect SEO? We would still like Google to index all the article pages it indexes now.
Would it be cloaking if we treated Googlebot differently, so that it could still index the page even when it does not have cookies enabled?
-
Thank you for your answer!
Yes, that is exactly the case.
We have been testing this, and it seems that Googlebot doesn't hit the wall at all with our normal settings. Given these results, it seems we don't need to treat Googlebot differently, because it doesn't appear to persist any cookie or local storage data.
Tech-savvy users can bypass the paywall by other means as well, so that's not a big concern for us.
-
To make sure I'm understanding your question correctly: you want Google to crawl and index all your content, but you want visitors to go through an open paywall that shows 5 free articles and then asks for payment.
Yes, it would be treated as cloaking, but you have a legitimate reason for doing so, and intent matters a great deal. You could check for a search engine user-agent string such as "Googlebot" and then serve the full content. This would ensure that your content is still crawled and indexed.
The only downside is that any tech-savvy individual can spoof the User-Agent request header by setting it to "Googlebot" and bypass your paywall.
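On the spoofing point: Google's documented counter-measure is to verify the caller with a double DNS lookup rather than trusting the User-Agent alone. Reverse-resolve the requesting IP (the hostname should be under googlebot.com or google.com), then forward-resolve that hostname and check it maps back to the same IP. A minimal sketch, with the resolvers passed in as callables so the logic is shown without live DNS (in production you would plug in `socket.gethostbyaddr` and `socket.gethostbyname`):

```python
from typing import Callable

def is_verified_googlebot(
    ip: str,
    reverse_dns: Callable[[str], str],
    forward_dns: Callable[[str], str],
) -> bool:
    """Verify a claimed Googlebot request by double DNS lookup.

    1. Reverse-resolve the IP; the hostname must be under
       googlebot.com or google.com.
    2. Forward-resolve that hostname; it must map back to the same
       IP (otherwise the PTR record could simply be forged).
    """
    try:
        host = reverse_dns(ip)
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return forward_dns(host) == ip
    except OSError:
        return False
```

Requests that pass this check can safely be served the full content; everything else gets the normal metered paywall, which closes the User-Agent spoofing hole.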
Related Questions
-
Googlebot on steroids... Why?
We launched a new website (www.gelderlandgroep.com). The site contains 500 pages, but some pages (like https://www.gelderlandgroep.com/collectie/) contain filters, so there are a lot of possible URL parameters. Last week we noticed a tremendous amount of traffic (25 GB!!) and CPU usage on the server:
2017-12-04 16:11:57 W3SVC66 IIS14 83.219.93.171 GET /collectie model=6511,6901,7780,7830,2105-illusion&ontwerper=henk-vos,foklab 443 - 66.249.76.153 HTTP/1.1 Mozilla/5.0+(Linux;+Android+6.0.1;+Nexus+5X+Build/MMB29P)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/41.0.2272.96+Mobile+Safari/537.36+(compatible;+Googlebot/2.1;++http://www.google.com/bot.html) - - www.gelderlandgroep.com 200 0 0 9445 501 312
We found out that "Googlebot" was firing many, many requests. First we did an nslookup on the IP address, and it actually does seem to be Googlebot. Then we visited Google Search Console, and I was really surprised... Googlebot on steroids? Googlebot requested 922,565 different URLs, trying combinations of every filter/parameter on the site. Why? The sitemap.xml contains 500 URLs... The authority of the site isn't very high, and there is no other signal that this is a special website... Why so many Google resources? Of course we will exclude the parameters in Search Console, but I have never seen this much Googlebot activity on a small website before! Does anybody have any clue? Regards, Olaf
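The scale in the question above is less surprising once you count the combinations: with faceted navigation, every independently togglable filter option doubles the crawlable URL space, so a handful of filters can dwarf a 500-page sitemap. A toy calculation (the filter sizes are hypothetical):

```python
def url_count(filter_sizes: list[int]) -> int:
    """Number of distinct filter-combination URLs when each filter
    allows any subset of its options (each option is independently
    on or off, including the all-off case)."""
    total = 1
    for n in filter_sizes:
        total *= 2 ** n
    return total

# Three filters with 10, 6 and 4 options already yield
# 2**10 * 2**6 * 2**4 = 2**20 = 1,048,576 crawlable URLs.
```

This is why excluding the parameters (in Search Console, via robots.txt, or with canonical tags) matters so much on faceted sites: the URL space grows exponentially even when the real content does not.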
Intermediate & Advanced SEO | Olaf
-
Google Bot / SEO and Cookies
Hi, I'm trying to monetise my website via a paid subscription / paid content strategy. The plan is: after the user has been on the website for 1 min 30 secs, has clicked 3 map markers, or has visited 3+ pages, a popup will appear asking for a signup + payment. A cookie will then be set for 6 months; if a user returns and the cookie is detected (PHP), they will be redirected to the sign-up page. My site relies heavily on organic SEO, so my questions are: will Googlebot be presented with this sign-up stuff? Does Googlebot set cookies? Will everything be indexed properly? And what are the effects on SEO?
Intermediate & Advanced SEO | thinkLukeSEO
-
Googlebot being redirected but not users?
Hi, we seem to have a slightly odd issue. We noticed that a number of our location category pages were slipping off page 1 and onto page 2 in our niche. On inspection, we noticed that our Arizona page had started ranking in place of a number of other location pages - Cali, Idaho, NJ, etc. Weirdly, the pages they had replaced were no longer indexed, and remained so despite being fetched, tweeted, etc. One test was to see when the dropped-out pages had last been crawled, or at least cached. When running 'cache:domain.com/category/location' on these pages, we were getting 301 redirected to, you guessed it, the Arizona page. Very odd. However, the dropped-out pages were serving 200 OK when run through header-checker tools, Screaming Frog, etc. On the face of it, it would seem Googlebot is getting redirected when it hits a number of our key location pages, but users are not. Has anyone experienced anything like this? The theming of the pages is quite different in terms of content, meta, etc. Thanks.
Intermediate & Advanced SEO | Sayers
-
Local SEO - Do I need it if I don't do business locally?
Super confused about this. Our office is located in Los Angeles, but it is not a storefront, and our clients are from all over the country... and our business involves travel to other countries. So there is nothing "local" about us. But everything I read seems to say we should be doing local SEO. How to approach this?
Intermediate & Advanced SEO | benenjerry
-
After Receiving a "Googlebot can't access your site" would this stop your site from being crawled?
Hi Everyone,
A few weeks ago I received a "Googlebot can't access your site..... connection failure rate is 7.8%" message from Webmaster Tools. I have since fixed the majority of these issues, but I've noticed that all pages except the main home page now have a PageRank of N/A, while the home page still has a PageRank of 5. Have the connectivity issues reduced the PageRanks to N/A, or is it something else I'm missing? Thanks in advance.
Intermediate & Advanced SEO | AMA-DataSet
-
How come I get different rankings on same word in local search results of Google?
Dear fellow Mozzers, for one of my clients I get different local results in Google. My client is a real-estate broker, and when I search for "real-estate agent" + the city name, we are on top. So whoohoo, you would say, BUT when Firefox has determined the exact city name as my location and I search only for "real-estate agent", I also get the local results, but we are listed at number 8?? Hope anyone can give me insights, as I have no idea what's causing this. Thanks in advance for your help!
Intermediate & Advanced SEO | newtraffic
-
Why does Google Claimed Local Listing Ranking Drop?
I have two Google Places listings that were unclaimed. Both listings were ranking in the blended-search 7-pack. Once I claimed the listings for the business, both listings' rankings dropped, and one has totally vanished from the search rankings. Is this normal? It appears that unclaimed local places are ranking higher than claimed ones.
Intermediate & Advanced SEO | VivaArturo
-
Using exact keyword domains for local SEO
The website is for an attorney who serves several nearby cities. The main page is optimized for the biggest central city. I have several options for how to go after the smaller surrounding cities:
1. Create optimized pages inside the main domain.
2. Get more or less exact keyword domains for each city, e.g. for the city ABC get yourABClawyer.com, and then:
a) use 1-page websites that use the same template as the main website and link all the menu items to the main website;
b) use a 1-page website with a link "for more information go to our main website";
c) point the exact keyword domains to the optimized pages within the main domain.
Which option would be the best in terms of SEO and user experience? Would people freak out if they click a menu item and land on a different domain, even though it uses the same template (option 2a)? Would I get more bounces with option 2b, in your opinion? Would option 2c have any positive SEO effect? Or should I not bother with exact keyword domains and go with option 1?
Intermediate & Advanced SEO | SirMax