Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!
-
Hi,
I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed.
Example: the games are not allowed in the USA, but they are allowed in Canada.
Present situation:
Presently, when a user from the USA visits the site they are redirected to a restricted-location page with the following message:
RESTRICTED LOCATION
Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!
Because USA visitors are blocked, Google, which primarily (but not always) crawls from the USA, is also blocked, so the company's webpages are not being crawled and indexed.
Objective / What we want to achieve:
The website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]
If a user from the USA or another restricted location tries to access our site, they should not have access but should get a restricted-access message.
However, we still want Google to be able to access, crawl and index our pages. Can you suggest how we do this without getting done for cloaking, etc.?
Would this approach be ok? (please see below)
We continue as in the present situation, showing visitors from the USA a restricted message.
However, rather than redirecting these visitors to a restricted-location page, we just black out the page and show them a floating message, as if it were a modal window, while Googlebot would be allowed to visit and crawl the website.
I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and that it's a restricted, paid page. All public pages are accessible, but only if the visitor is from a location that is not restricted.
Any feedback and direction that can be given would be greatly appreciated, as I am new to this angle of SEO.
Sincere thanks,
-
To ensure SEO compliance while restricting access to certain countries, follow these three steps. They are critical to follow if you want to work on a multinational, multilingual site:
Page Blackout for Restricted Visitors: Instead of redirecting users, black out the content and display a message. For example, https://fifamobilefc.com/ shows a message to users from restricted countries while allowing Google to crawl the pages.
Implement Paywall Schema: Use paywall schema markup to signal to Google that content is restricted but not cloaked. This helps maintain transparency with search engines.
Geo-Targeting: Employ geo-targeting to identify and present the message to users from restricted countries, while still allowing Google to access the content.
By applying these methods, you can maintain SEO compliance while effectively restricting access to users from certain countries. Regular monitoring via Google Search Console ensures continued adherence to best practices.
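Steps 1 and 3 above could be sketched roughly like this. This is a minimal sketch, not a definitive implementation: the `RESTRICTED_COUNTRIES` list and both function names are illustrative, and the country code is assumed to come from a server-side GeoIP lookup rather than the user agent.

```javascript
// Illustrative list of markets without a licence (assumption, not from the thread).
const RESTRICTED_COUNTRIES = ["US"];

// Decision is based only on the visitor's country code (e.g. from a GeoIP
// lookup), never on the user agent.
function isRestricted(countryCode) {
  return RESTRICTED_COUNTRIES.includes((countryCode || "").toUpperCase());
}

// The page content stays in the DOM for every visitor (including Googlebot);
// the overlay is layered on top rather than replacing the page or redirecting.
function showRestrictionOverlay(doc) {
  const overlay = doc.createElement("div");
  overlay.id = "geo-restriction-overlay";
  overlay.textContent =
    "Due to licensing restrictions, we can't currently offer our services in your location.";
  doc.body.appendChild(overlay);
}
```

Because the same HTML is served to everyone and only a client-side overlay differs by location, there is no separate "Googlebot version" of the page.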
-
By blacking out the page for visitors from restricted locations while allowing Googlebot access, you're ensuring compliance without hindering indexing. Implementing paywall schema can further clarify to Google that the restriction is based on licensing rather than cloaking. Just ensure consistent implementation across all restricted pages and adhere to Google's guidelines to avoid any issues.
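If you do try the paywall-markup route, a minimal sketch of the JSON-LD might look like the following, based on schema.org's paywalled-content pattern (`isAccessibleForFree` plus a `hasPart` selector). The `.restricted-content` selector is illustrative, and note that later in this thread a Google contact questions whether paywall markup fits this use case at all.

```javascript
// Sketch of paywalled-content structured data; the cssSelector is an
// assumed class name for the blocked region of the page.
const paywallSchema = {
  "@context": "https://schema.org",
  "@type": "WebPage",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".restricted-content",
  },
};

// This JSON would be embedded in the page head inside a
// <script type="application/ld+json"> tag.
const jsonLd = JSON.stringify(paywallSchema, null, 2);
```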
-
@George_Inoriseo Hi George, I submitted a previous reply on here but can't see it anywhere.
Firstly, thank you for your feedback. I have some extra questions.
Let's assume we have a Canadian version of the website and a US human visitor tries to visit that site or any page on it. They should be able to browse to the site, but an overlay would appear, meaning they cannot use the site or proceed any further. The overlay would say the site is restricted in their location. I see other companies doing this. How would Google handle this?
- Could it proceed to crawl the website, or would the JavaScript overlay prevent Googlebot from crawling and indexing?
- If Googlebot were to compare the hash of the page to the hash of what a user sees, would they be the same? I believe that if there is a big difference in the hash, this is a signal for cloaking, because it shows the information / page size is substantially different.
- Would it be wise to avoid user-agent lookups in the code? Again, I believe this can signal to Google that manipulation is taking place.
I heard from a Google official that paywall schema might not be a great method:
"Paywall markup would not be suited here since there's no log-in or payment that can be done to get access when in the wrong country."
Thanks
-
@George_Inoriseo thanks very much George.
The website will have a .com domain and then subfolders will branch off that for different countries / languages. So the structure would be like this:
domain.com
domain.com/en-ca (English Canada)
domain.com/fr-ca (French Canada)
The company has licences for certain countries, and in countries where they don't have a licence to operate (e.g. the USA), users visiting our sites from those countries should not be able to play. So on our Canadian website, if we detect a user is from the USA (where we don't have a licence), the user should get a message telling them they can't play. They should be able to visit the site OK, but the website would sniff the location and tell them that they can't play, with the website blacked out.
As you suggested, we could have a JavaScript overlay that loads if the user is from the USA. I assume this would only look at the geolocation and not the user agent? Looking up the user agent would be a clear sign we are doing something different for users and Googlebot, would it not? Would an overlay restrict Googlebot from crawling the site, and because the user is seeing something different to Googlebot, could this be perceived as cloaking?
I spoke to someone at Google regarding paywall schema and the feeling was this: "Paywall markup would not be suited since there is no log-in or payment that can be done to get access when in the wrong country."
Thanks again George.
-
@MarkCanning here is what I would do:
Avoid Redirects for Blocked Regions: Instead of redirecting users from blocked regions to a different page, use a client-side overlay (like a modal window) to display the restricted access message. This method keeps all users on the same URL.
Implement Paywall Schema: Applying the paywall schema is a smart move. It informs Google that your content restrictions are based on user location, not pay-to-access barriers, which helps avoid penalties for cloaking.
Ensure Accessible Content for Googlebot: Allow Googlebot to crawl the original content. Ensure that your site’s robots.txt file permits Googlebot to access the URLs of region-specific pages.
Use hreflang Tags for Multi-Region Sites: For multiple language and region versions, use hreflang tags to help Google understand the geographic and language targeting of your pages. This will also prevent duplicate content issues.
Monitor and Adapt: Keep an eye on Google Search Console to monitor how these changes affect your site's indexing and adjust your strategies as needed.
This strategy should help you manage SEO for restricted content effectively, while staying compliant with Google’s guidelines.
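For step 4, a minimal sketch of the hreflang annotations for the locale folders listed earlier in the thread might look like this. The `buildHreflangTags` helper and the `x-default` URL are illustrative assumptions; in practice each page would list its own alternates.

```javascript
// Locale-to-URL map taken from the folder structure described in the
// question; the x-default entry is an assumed fallback.
const ALTERNATES = {
  "en-ca": "https://domain.com/en-ca/",
  "fr-ca": "https://domain.com/fr-ca/",
  "es-mx": "https://domain.com/es-mx/",
  "pt-br": "https://domain.com/pt-br/",
  "hi-in": "https://domain.co.in/hi/",
  "x-default": "https://domain.com/",
};

// Builds one <link rel="alternate" hreflang="..."> tag per locale, for the
// page <head>; every regional version lists the same full set of alternates.
function buildHreflangTags(alternates) {
  return Object.entries(alternates)
    .map(([lang, url]) => `<link rel="alternate" hreflang="${lang}" href="${url}" />`)
    .join("\n");
}
```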
Best of luck!