Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Blocking country-specific Google domains in robots.txt
-
Hey,
I want to block Google.co.uk from crawling a site but allow Google.de to crawl it.
I know how to configure robots.txt to block Google and other engines - is there a way to block only certain country-specific crawlers?
Any ideas?
Thanks
B
-
Thanks, guys, for all of the help.
I think we will just implement cross-domain GeoIP redirects to ensure users get the right location and currency.
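For anyone reading later, that kind of cross-domain GeoIP redirect can be sketched roughly as below. This is a minimal illustration, not the poster's actual implementation: the `country_for_ip` lookup, the domain mapping, and the sample IPs are all hypothetical stand-ins (a real site would query a GeoIP database).

```python
# Minimal sketch of a cross-domain GeoIP redirect.
# country_for_ip() stands in for a real GeoIP lookup; the domains and
# IP addresses below are illustrative placeholders only.

COUNTRY_DOMAINS = {
    "GB": "https://www.example.co.uk",
    "DE": "https://www.example.de",
}
DEFAULT_DOMAIN = "https://www.example.com"

def country_for_ip(ip):
    # Placeholder lookup table; a real site would consult a GeoIP database.
    sample = {"81.2.69.142": "GB", "88.198.0.1": "DE"}
    return sample.get(ip)

def redirect_target(ip, path):
    """Return the URL a visitor from this IP should be redirected to."""
    country = country_for_ip(ip)
    domain = COUNTRY_DOMAINS.get(country, DEFAULT_DOMAIN)
    return domain + path
```

One caveat worth noting alongside this approach: Googlebot crawls mostly from US IP addresses, so unconditional GeoIP redirects can hide country versions from the crawler; sites typically pair redirects like this with hreflang annotations.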
Cheers
-
Are you having the issue of your .de pages ranking in .co.uk results instead of your .co.uk pages?
If that's the case, then I'd look toward using hreflang both on-page and in the XML sitemaps. That is going to provide Googlebot with a better view of the country-language targeting for the site.
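As a rough illustration of that setup (the URLs are placeholders), the on-page annotations go in the `<head>` of each page:

```html
<!-- In the <head> of the UK page; URLs are placeholders -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

and the same relationships can be declared in the XML sitemap:

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/uk/</loc>
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/"/>
    <xhtml:link rel="alternate" hreflang="de-de" href="https://www.example.com/de/"/>
  </url>
</urlset>
```

Each language version must list all alternates, including itself, for the annotations to be honored.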
-
Hi, country-specific search engine spiders cannot be targeted individually in the robots.txt file, since Google uses the same Googlebot crawler for every country version. However, you can block certain IP ranges pertaining to certain countries at the server level.
Best regards,
Devanur Rafi
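For illustration, server-level IP blocking might look like the nginx fragment below. The CIDR range here is a documentation placeholder (TEST-NET-3), not a real country allocation; a real deployment would use a maintained per-country GeoIP list, and note that this cannot distinguish one Google country index from another.

```nginx
# Illustrative only - the range below is a placeholder, not a real
# country allocation; use a maintained GeoIP list in practice.
location / {
    deny 203.0.113.0/24;   # stands in for a country's IP range
    allow all;
}
```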
-
Hi Gareth,
I don't think this is going to work, as every Google crawler uses the same user agent: Googlebot. What you could do - though I really wouldn't recommend it - is generate the robots.txt dynamically: check which country the requesting IP address is in and serve a Disallow accordingly. It probably won't work either, because a crawler located in, say, Germany can also crawl on behalf of the UK index.
Also, Google collects a country-specific search engine's data first and then decides which users to serve it to; it doesn't acquire content separately for each country.
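To make concrete why this approach is fragile, here is a minimal hypothetical sketch of per-country robots.txt generation (all names are illustrative). Because Googlebot uses one user agent and crawls largely from US IPs regardless of which country index it feeds, the country check cannot actually separate Google.co.uk from Google.de:

```python
# Hypothetical sketch of serving a different robots.txt per visitor country.
# NOT recommended: Googlebot uses one user agent and mostly US IP addresses
# for all country indexes, so this cannot tell Google.co.uk from Google.de.

BLOCKED_COUNTRIES = {"GB"}  # countries whose crawls we'd like to block

def robots_txt_for(country_code):
    """Return a robots.txt body based on the visitor's GeoIP-derived country."""
    if country_code in BLOCKED_COUNTRIES:
        return "User-agent: Googlebot\nDisallow: /\n"
    return "User-agent: *\nDisallow:\n"
```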
Hope this helps!