The Moz Q&A Forum


    Moz Q&A is closed.

    After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.

    Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!

    International SEO
    Tags: international seo, crawling
    • MarkCanning

      Hi,

      I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed.
      Example: The games are not allowed in the USA, but they are allowed in Canada.

      Present Situation:
      Presently, when a user from the USA visits the site, they are redirected to a restricted-location page with the following message:

      RESTRICTED LOCATION
      Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!

      Because USA visitors are blocked, Googlebot, which primarily (but not always) crawls from the USA, is also blocked, so the company's webpages are not being crawled and indexed.

      Objective / What we want to achieve:

      The website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
      domain.com/en-ca [English Canada]
      domain.com/fr-ca [French Canada]
      domain.com/es-mx [Spanish Mexico]
      domain.com/pt-br [Portuguese Brazil]
      domain.co.in/hi [Hindi India]

      If a user from the USA or another restricted location tries to access our site, they should not have access but should instead see a restricted-access message.
      However, we still want Google to be able to access, crawl and index our pages.

      Can you suggest how we do this without being penalized for cloaking?

      Would this approach be ok? (please see below)

      We continue as in the present situation, showing visitors from the USA a restricted message.
      However, rather than redirecting these visitors to a restricted-location page, we just black out the page and show them a floating message as if it were a modal window,
      while Googlebot would be allowed to visit and crawl the website.

      I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and that it's a restricted page. All public pages are accessible, but only if the visitor is from a location that is not restricted.

      Any feedback and direction that can be given would be greatly appreciated, as I am new to this angle of SEO.

      Sincere thanks,

      • JackBen4100

        To ensure SEO compliance while restricting access to certain countries, follow these three steps; they are critical if you want to run a multinational, multilingual site:

        Page Blackout for Restricted Visitors: Instead of redirecting users, black out the content and display a message. For example, https://fifamobilefc.com/ shows a message to users from restricted countries while allowing Google to crawl the pages.
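
        A minimal client-side sketch of this blackout approach (the restricted-country list, overlay styling, and the `visitorCountry` parameter are all illustrative assumptions; in practice the country code would come from a server-side geo-IP lookup):

        ```javascript
        // Sketch: black out the page for restricted visitors instead of redirecting.
        const RESTRICTED_COUNTRIES = new Set(["US"]); // example list only

        function isRestrictedCountry(code) {
          return RESTRICTED_COUNTRIES.has(code.toUpperCase());
        }

        // Builds the full-screen overlay; runs only in the browser.
        function showRestrictionOverlay(message) {
          const overlay = document.createElement("div");
          overlay.style.cssText =
            "position:fixed;inset:0;background:rgba(0,0,0,0.92);color:#fff;" +
            "display:flex;align-items:center;justify-content:center;z-index:9999;";
          overlay.textContent = message;
          document.body.appendChild(overlay);
        }

        // Entry point: `visitorCountry` would be injected server-side from a
        // geo-IP lookup; the name is hypothetical.
        function applyGeoRestriction(visitorCountry) {
          if (isRestrictedCountry(visitorCountry)) {
            showRestrictionOverlay(
              "Due to licensing restrictions, we can't currently offer our services in your location."
            );
          }
        }
        ```

        Because the page body is still rendered underneath the overlay, the same HTML is served to everyone; only the overlay differs by location.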

        Implement Paywall Schema: Use paywall schema markup to signal to Google that content is restricted but not cloaked. This helps maintain transparency with search engines.
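
        For reference, schema.org paywalled-content markup repurposed for geo-restriction looks roughly like this (the `.geo-restricted` CSS selector is a hypothetical class wrapping the restricted content; note that a Google contact quoted later in this thread questions whether paywall markup fits this use case at all):

        ```javascript
        // Sketch of paywalled-content structured data (schema.org), here used
        // to mark the geo-restricted portion of a page.
        const paywallSchema = {
          "@context": "https://schema.org",
          "@type": "WebPage",
          "isAccessibleForFree": false,
          "hasPart": {
            "@type": "WebPageElement",
            "isAccessibleForFree": false,
            "cssSelector": ".geo-restricted", // hypothetical class name
          },
        };

        // This string would go inside <script type="application/ld+json"> in <head>.
        const jsonLd = JSON.stringify(paywallSchema, null, 2);
        ```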

        Geo-Targeting: Employ geo-targeting to identify and present the message to users from restricted countries, while still allowing Google to access the content.
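
        One way to keep such geo-targeting cloaking-safe is to branch only on the IP-derived country, never the user agent, and to always serve the full HTML. A sketch under those assumptions (`LICENSED_COUNTRIES` and the country codes are illustrative):

        ```javascript
        // Sketch: the server decides only whether to flag the overlay, based on
        // the IP-derived country. Googlebot crawling from the US gets the same
        // HTML and the same overlay flag as any US visitor, so nothing is cloaked.
        const LICENSED_COUNTRIES = new Set(["CA", "MX", "BR", "IN"]); // example list

        function geoDecision(countryCode) {
          return {
            serveFullContent: true, // the full page is always rendered
            showRestrictionOverlay: !LICENSED_COUNTRIES.has(countryCode),
          };
        }
        ```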

        By applying these methods, you can maintain SEO compliance while effectively restricting access to users from certain countries. Regular monitoring via Google Search Console ensures continued adherence to best practices.

        • Robert_Ripple @MarkCanning


          By blacking out the page for visitors from restricted locations while allowing Googlebot access, you're ensuring compliance without hindering indexing. Implementing paywall schema can further clarify to Google that the restriction is based on licensing rather than cloaking. Just ensure consistent implementation across all restricted pages and adhere to Google's guidelines to avoid any issues.

          • MarkCanning @George_Inoriseo

            @George_Inoriseo Hi George, I submitted a previous reply on here but can't see it anywhere.

            Firstly thank you for your feedback. I have some extra questions.

            Let's assume we have a Canadian version of the website and a US human visitor tries to visit that site or any page on it. They should be able to browse to the site, but an overlay would appear, meaning they cannot use the site or proceed any further. The overlay would say the site is restricted in their location. I see other companies doing this. How would Google handle this:

            1. Could Googlebot proceed to crawl the website, or would the JavaScript overlay prevent it from crawling and indexing?
            2. If Googlebot were to compare the hash of the page it sees to the hash of what a user sees, would they be the same? I believe a big difference in the hash is a signal for cloaking, because it shows the information / page size is substantially different.
            3. Would it be wise to avoid user-agent lookups in the code? Again, I believe this can signal to Google that manipulation is taking place.

            I heard from a Google official that paywall schema might not be a great method:
            "Paywall markup would not be suited here since there's no log-in or payment that can be done to get access when in the wrong country."

            Thanks

            • MarkCanning @George_Inoriseo

              @George_Inoriseo thanks very much George.

              The website will have a .com domain and then subfolders will branch off that for different countries / languages. So the structure would be like this:

              domain.com
              domain.com/en-ca (English Canada)
              domain.com/fr-ca (French Canada)

              The company has licenses for certain countries, and in countries where it doesn't have a license to operate (e.g. the USA), users visiting our sites from those countries should not be able to play. So on our Canadian website, if we detect a user is from the USA (where we don't have a license), the user should get a message telling them they can't play. They should be able to visit the site, but the website would sniff the location and tell them they can't play, with the website blacked out.

              As you suggested, we could have a JavaScript overlay that loads if the user is from the USA. I assume this would only look at the geolocation and not the user agent? Looking up the user agent would be a clear sign we are doing something different for users versus Googlebot, would it not? Would an overlay restrict Googlebot from crawling the site, and because the user is seeing something different to Googlebot, could this be perceived as cloaking?

              I spoke to someone at Google regarding paywall schema and the feeling was this: "paywall markup would not be suited since there is no log-in or payment that can be done to get access when in the wrong country".

              Thanks again George.

              • George_Inoriseo @MarkCanning

                @MarkCanning here is what I would do:

                Avoid Redirects for Blocked Regions: Instead of redirecting users from blocked regions to a different page, use a client-side overlay (like a modal window) to display the restricted access message. This method keeps all users on the same URL.

                Implement Paywall Schema: Applying the paywall schema is a smart move. It informs Google that your content restrictions are based on user location, not pay-to-access barriers, which helps avoid penalties for cloaking.

                Ensure Accessible Content for Googlebot: Allow Googlebot to crawl the original content. Ensure that your site’s robots.txt file permits Googlebot to access the URLs of region-specific pages.
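
                As a sketch, the robots.txt for the folder structure described in the question simply needs to avoid disallowing the region folders (the sitemap URL follows the example domain in the question):

                ```
                # Nothing here blocks the region folders; the geo restriction is
                # handled in the page itself, not at the crawler level.
                User-agent: *
                Disallow:

                Sitemap: https://domain.com/sitemap.xml
                ```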

                Use hreflang Tags for Multi-Region Sites: For multiple language and region versions, use hreflang tags to help Google understand the geographic and language targeting of your pages. This will also prevent duplicate content issues.
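
                For the folder structure in the question, the hreflang set on each page might look like this (every page lists all alternates plus itself; the x-default target pointing at the root is an assumption):

                ```html
                <!-- hreflang annotations; URLs follow the examples given in the question -->
                <link rel="alternate" hreflang="en-ca" href="https://domain.com/en-ca/" />
                <link rel="alternate" hreflang="fr-ca" href="https://domain.com/fr-ca/" />
                <link rel="alternate" hreflang="es-mx" href="https://domain.com/es-mx/" />
                <link rel="alternate" hreflang="pt-br" href="https://domain.com/pt-br/" />
                <link rel="alternate" hreflang="hi-in" href="https://domain.co.in/hi/" />
                <link rel="alternate" hreflang="x-default" href="https://domain.com/" />
                ```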

                Monitor and Adapt: Keep an eye on Google Search Console to monitor how these changes affect your site's indexing and adjust your strategies as needed.

                This strategy should help you manage SEO for restricted content effectively, while staying compliant with Google’s guidelines.

                Best of luck!
