
Welcome to the Q&A Forum

Browse the forum for helpful insights and fresh discussions about all things SEO.


  • international seo crawling

    Hi, I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in countries where its games and content are legally allowed.
    Example: the games are not allowed in the USA, but they are allowed in Canada.
    Present situation:
    Currently, when a user from the USA visits the site, they are redirected to a restricted-location page with the following message: "RESTRICTED LOCATION
    Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!"
    Because USA visitors are blocked, Google, which primarily (but not always) crawls from the USA, is also blocked, so the company's webpages are not being crawled or indexed.
    Objective / what we want to achieve:
    The website will have multiple region and language versions. Some of these will exist as standalone websites and others as folders on the domain. Examples below:
    domain.com/en-ca [English Canada]
    domain.com/fr-ca [French Canada]
    domain.com/es-mx [Spanish Mexico]
    domain.com/pt-br [Portuguese Brazil]
    domain.co.in/hi [Hindi India]
    If a user from the USA or another restricted location tries to access our site, they should not have access and should instead see a restricted-access message.
    However, we still want Google to be able to access, crawl, and index our pages. Can anyone suggest how we do this without being penalised for cloaking? Would the approach below be OK?
    We keep handling USA visitors much as we do today, showing them a restricted-location message. However, rather than redirecting these visitors to a separate restricted-location page, we simply black out the page and show them a floating message, as if it were a modal window, while Googlebot would still be allowed to visit and crawl the website.
    I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and that it is a restricted, paid page. All public pages would be accessible, but only if the visitor is from a location that is not restricted.
    Any feedback and direction would be greatly appreciated, as I am new to this angle of SEO. Sincere thanks,
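    P.S. To make the idea concrete, here is a rough sketch of what I have in mind, assuming a Flask server and a stubbed GeoIP lookup; the route, blocked-country list, and CSS class names are only placeholders, not our real setup. Every visitor (and Googlebot) would receive the same HTML, including the paywalled-content structured data (isAccessibleForFree with a hasPart/cssSelector), and only restricted visitors would see the overlay.

    ```python
    # Minimal sketch, assuming Flask and a stubbed GeoIP lookup; the route,
    # blocked-country list, and CSS class names are illustrative placeholders.
    from flask import Flask, request, render_template_string

    app = Flask(__name__)

    RESTRICTED_COUNTRIES = {"US"}  # assumption: the USA is on the blocked list

    PAGE = """<!doctype html>
    <html lang="en-CA">
    <head>
      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "WebPage",
        "isAccessibleForFree": "False",
        "hasPart": {
          "@type": "WebPageElement",
          "isAccessibleForFree": "False",
          "cssSelector": ".game-content"
        }
      }
      </script>
    </head>
    <body>
      <div class="game-content">Full page content, served to every visitor and to Googlebot.</div>
      {% if restricted %}
      <div class="restriction-overlay">
        RESTRICTED LOCATION: due to licensing restrictions, we can't currently
        offer our services in your location.
      </div>
      {% endif %}
    </body>
    </html>"""

    def country_from_ip(ip: str) -> str:
        """Placeholder for a real GeoIP lookup (e.g. a MaxMind database)."""
        return "CA"  # stubbed value for the sketch

    @app.route("/en-ca/")
    def en_ca_home():
        visitor_country = country_from_ip(request.remote_addr)
        # The same indexable HTML (including the structured data) is returned
        # to everyone; only the overlay flag differs by visitor location.
        return render_template_string(
            PAGE, restricted=visitor_country in RESTRICTED_COUNTRIES
        )
    ```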

    International SEO | | MarkCanning
    0
  • Unsolved

    crawl error crawling crawl

    Hello,
    I don't understand why Moz crawls only the homepage of our website https://www.modelos-de-curriculum.com. We added the website correctly and asked for all pages to be crawled, but the tool finds only the homepage. Why? We are testing the tool before subscribing, and we need to be sure it works for our website. Please help us if you can.

    Product Support | | Azurius
    0

  • indexing crawling noindex

    Our staging website got indexed by Google, and now Moz is showing all inbound links from the staging site. How should I remove those links and make the staging site noindexed? Note: we have already added a meta noindex tag in the head.
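    For reference, a server-level noindex directive could complement the meta tag already in place. Below is a minimal sketch, assuming the staging site runs on Flask; the hook and header value are illustrative and would need adapting to the real stack.

    ```python
    # Minimal sketch: send noindex as an HTTP header on every staging response.
    from flask import Flask

    app = Flask(__name__)

    @app.after_request
    def add_noindex_header(response):
        # X-Robots-Tag covers every response type (HTML, PDFs, images),
        # whereas the <meta name="robots"> tag only covers HTML pages.
        response.headers["X-Robots-Tag"] = "noindex, nofollow"
        return response

    @app.route("/")
    def staging_home():
        return "Staging homepage"
    ```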

    Intermediate & Advanced SEO | | Asmi-Ta
    0

  • indexing crawling

    Hi,
    Hope you all are doing great!
    I created a dog blog a few weeks ago that covers all things related to dogs (http://pawspulse.com/). I am publishing a couple of articles every day, each more than 5k words long and based on proper keyword research, but Google is still not indexing my content. My content is systematically organized into categories covering dog guides, nutrition, accessories, dog breeds, etc.
    Can anyone help me get the website fully indexed faster? Any help will be much appreciated. Thanks
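    If it helps with diagnosis, a quick check like the one below shows whether a few sample pages respond with 200 and are allowed by robots.txt. This is only a rough sketch; the paths are made-up examples, not the blog's real URLs.

    ```python
    # Rough sketch: check that a few blog URLs return 200 and are allowed by
    # robots.txt. The sample paths below are examples only.
    import urllib.request
    import urllib.robotparser

    SITE = "http://pawspulse.com"
    SAMPLE_PATHS = ["/", "/dog-breeds/", "/nutrition/"]  # hypothetical paths

    robots = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
    robots.read()

    for path in SAMPLE_PATHS:
        url = SITE + path
        allowed = robots.can_fetch("Googlebot", url)
        try:
            status = urllib.request.urlopen(url, timeout=10).status
        except Exception as exc:  # network errors and 4xx/5xx responses
            status = exc
        print(f"{url}  allowed_for_googlebot={allowed}  status={status}")
    ```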

    Content Development | | Aman0022
    0
