Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies.
Blocking domestic Googles in robots.txt
-
Hey,
I want to block Google.co.uk from crawling a site but want Google.de to crawl it.
I know how to configure robots.txt to block Google and other engines - is there a way to block only certain countries' crawlers?
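(For reference, the blanket block I mean is the standard robots.txt rule below - the question is whether it can be narrowed to one country's Google:)

```
# Blocks Googlebot entirely; all of Google's country domains
# (google.co.uk, google.de, ...) crawl with this same user agent.
User-agent: Googlebot
Disallow: /
```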
any ideas?
Thanks
B
-
Thanks, guys, for all of the help.
I think we will just implement cross-domain GeoIP redirects to ensure users land on the right location and currency.
Cheers
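A minimal sketch of that redirect logic, assuming a GeoIP lookup has already produced a country code (the domain names and country-to-site mapping here are hypothetical):

```python
# Hypothetical cross-domain GeoIP redirect: map the visitor's country
# (from a GeoIP lookup, not shown) to the site that should serve them,
# so each audience gets the right location and currency.
SITE_BY_COUNTRY = {
    "GB": "https://www.example.co.uk",  # UK visitors -> .co.uk site, GBP
    "DE": "https://www.example.de",     # German visitors -> .de site, EUR
}
DEFAULT_SITE = "https://www.example.de"

def redirect_target(country_code: str, path: str = "/") -> str:
    """Return the URL a visitor from `country_code` should be redirected to."""
    base = SITE_BY_COUNTRY.get(country_code.upper(), DEFAULT_SITE)
    return base + path

print(redirect_target("gb", "/pricing"))  # https://www.example.co.uk/pricing
```

In practice the same lookup would also drive the currency shown on-page, and Googlebot should be exempted from the redirect (or served via hreflang-annotated pages) so both sites stay crawlable.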
-
Are you having the issue of your .de pages ranking in .co.uk instead of your .co.uk pages?
If that's the case then I'd look towards using hreflang both on-page and in the XML sitemaps. That is going to provide Googlebot with a better view of the country-language targeting for the site.
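As an illustration (domain names hypothetical), the XML-sitemap form of hreflang pairs the UK and German URLs on every entry, so Googlebot knows which version targets which market:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.co.uk/page</loc>
    <xhtml:link rel="alternate" hreflang="en-GB"
                href="https://www.example.co.uk/page"/>
    <xhtml:link rel="alternate" hreflang="de-DE"
                href="https://www.example.de/page"/>
  </url>
</urlset>
```

Each URL in a pair must annotate all of its alternates (including itself), or Google may ignore the annotations.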
-
Hi, country-specific search engine spiders cannot be singled out in a robots.txt file. However, you can block certain IP ranges pertaining to particular countries.
Best regards,
Devanur Rafi
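For example (Apache 2.4 syntax; the CIDR range below is a documentation placeholder, not a real country allocation - real ranges would come from a GeoIP database), a server-level block looks like:

```apache
# Apache 2.4: deny a specific IP range while allowing everyone else.
# 203.0.113.0/24 is a reserved documentation range; substitute the
# country's actual ranges from a GeoIP database, and keep them updated.
<RequireAll>
    Require all granted
    Require not ip 203.0.113.0/24
</RequireAll>
```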
-
Hi Gareth,
I don't think this is going to work, as every Google crawler uses the same user agent: Googlebot. What you could do (though I really wouldn't recommend it) is generate the robots.txt dynamically: check which country the requesting IP address belongs to and serve a Disallow accordingly. Even that probably won't work, because a crawler based in, say, Germany can also gather pages for the UK index.
Also, Google collects a page's data first and only afterwards decides which country's users to serve it to; content is not fetched separately for each country.
Hope this helps!
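To make that (not recommended) idea concrete, a hypothetical per-country robots.txt generator would look something like the sketch below - the comment notes why it fails in practice:

```python
# Hypothetical sketch of per-country robots.txt generation, as described
# above. The flaw: Google crawls with the same "Googlebot" user agent from
# data centres that feed many country indexes, so blocking by requester
# location does not map cleanly onto blocking one country's search results.
BLOCKED_COUNTRIES = {"GB"}  # countries whose crawl requests we'd refuse

def robots_txt_for(requester_country: str) -> str:
    """Return a robots.txt body based on where the request appears to come from."""
    if requester_country.upper() in BLOCKED_COUNTRIES:
        return "User-agent: *\nDisallow: /\n"
    return "User-agent: *\nDisallow:\n"

print(robots_txt_for("GB"))  # serves a full Disallow to UK-geolocated requests
```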