Googlebot on paywall made with cookies and local storage
-
My question is about paywalls built with cookies and local storage. We are changing a website with free content over to an open (metered) paywall with a limit of 5 free article views per week.
The paywall works with cookies and local storage: article views are counted in local storage, but cookies must be enabled in order to read the free articles. If cookies are disabled, we serve an error page instead (otherwise the paywall would be easy to bypass).
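In rough terms, the metering works like the sketch below (heavily simplified; the storage key, the Monday-based weekly reset, and the function names are illustrative rather than our exact code):

```typescript
// Simplified sketch of the metering logic (names and keys are illustrative).
const FREE_ARTICLES_PER_WEEK = 5;
const STORAGE_KEY = "articleViews";

interface ViewRecord {
  weekStart: number; // epoch ms of the start of the week the counter belongs to
  count: number;     // articles read so far this week
}

function cookiesEnabled(): boolean {
  // Write a test cookie and read it back; if it doesn't stick, cookies are blocked.
  document.cookie = "cookietest=1; SameSite=Lax; path=/";
  const enabled = document.cookie.includes("cookietest=1");
  document.cookie = "cookietest=1; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/";
  return enabled;
}

function startOfCurrentWeek(now: Date = new Date()): number {
  const d = new Date(now);
  const daysSinceMonday = (d.getDay() + 6) % 7; // Monday = 0
  d.setDate(d.getDate() - daysSinceMonday);
  d.setHours(0, 0, 0, 0);
  return d.getTime();
}

function registerArticleView(): "show-article" | "show-paywall" | "show-error" {
  if (!cookiesEnabled()) {
    // Without cookies the meter can't be enforced, so we fall back to an error page.
    return "show-error";
  }

  const raw = window.localStorage.getItem(STORAGE_KEY);
  let record: ViewRecord = raw
    ? JSON.parse(raw)
    : { weekStart: startOfCurrentWeek(), count: 0 };

  // Reset the counter when a new week starts.
  if (record.weekStart < startOfCurrentWeek()) {
    record = { weekStart: startOfCurrentWeek(), count: 0 };
  }

  if (record.count >= FREE_ARTICLES_PER_WEEK) {
    return "show-paywall";
  }

  record.count += 1;
  window.localStorage.setItem(STORAGE_KEY, JSON.stringify(record));
  return "show-article";
}
```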
How does this affect SEO? We would still like Google to index all the article pages it indexes now.
Would it be cloaking if we treated Googlebot differently, so that it could still crawl and index the pages even when it doesn't have cookies enabled?
-
Thank you for your answer!
Yes, that is exactly the case.
We have been testing this, and Googlebot doesn't seem to hit the wall at all with the normal settings on. Based on these results it looks like we don't need to treat Googlebot differently, because it doesn't appear to store any cookie or local storage data.
Tech-savvy users can bypass the paywall by other means anyway, so that's not a big concern for us.
-
To make sure I'm understanding your question correctly: you want Google to crawl and index all of your content, but you want visitors to go through an open, metered paywall that shows 5 free articles and then blocks further reading.
Yes, it would technically be treated as cloaking, but you have a legitimate reason for doing so, and intent matters a great deal. You could check for a search engine user-agent string such as "Googlebot" and then serve the full content. This would ensure that your content is still crawled and indexed.
The only downside is that any tech-savvy individual can spoof the request by setting their User-Agent header to "Googlebot" and bypass your paywall.
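If you do go that route, a rough sketch of the check could look like the following (shown as Express middleware purely for illustration; the crawler regex and the render helpers are assumptions, not a recommendation of a specific stack):

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Illustrative (not exhaustive) list of crawler user-agent substrings.
const CRAWLER_UA = /googlebot|bingbot|duckduckbot/i;

// Flag crawler requests so route handlers can skip the paywall for them.
app.use((req: Request, res: Response, next: NextFunction) => {
  const ua = req.headers["user-agent"] ?? "";
  res.locals.isCrawler = CRAWLER_UA.test(ua);
  next();
});

app.get("/articles/:slug", (req: Request, res: Response) => {
  if (res.locals.isCrawler) {
    // Serve the full article so it can be crawled and indexed.
    // Caveat: anyone can send "User-Agent: Googlebot", so this check alone
    // won't keep determined readers away from the full content.
    res.send(renderFullArticle(req.params.slug));
  } else {
    // Regular visitors get the metered page; the client-side
    // cookie/local-storage meter decides when to show the paywall.
    res.send(renderMeteredArticle(req.params.slug));
  }
});

// Placeholder renderers, just to keep the sketch self-contained.
function renderFullArticle(slug: string): string {
  return `<html><body>Full article: ${slug}</body></html>`;
}
function renderMeteredArticle(slug: string): string {
  return `<html><body>Metered article preview: ${slug}</body></html>`;
}

app.listen(3000);
```

The caveat applies regardless of the framework: the User-Agent header is supplied by the client, so a check like this identifies crawlers that announce themselves honestly, nothing more.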