Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
What countries does Google crawl from? Is it only US or do they crawl from Europe and Asia, etc.?
-
Where does Google crawl the web from? Is it in the US only, or do they do it from a European base too? The reason for asking is for GeoIP redirection. For example, if a website is using GeoIP redirection to redirect all US traffic to a .com site and all EU traffic to a .co.uk site, will Google ever see the .co.uk site?
-
Hi Keith,
In my experience Google mainly crawls from the US.
You're quite right that GeoIP redirection can cause major issues with indexation: if you're redirecting everything from the US to the .com, Googlebot can't see the .co.uk site.
As such, I'm not a fan. Rather than implementing a hard redirect, I prefer Amazon's approach: if you visit amazon.com from a UK IP, you get a JavaScript overlay that invites you to visit the .co.uk version of the site instead - they let the user decide which site to view rather than actually redirecting them.
This is a nice solution, as it ensures that the search bots can crawl both versions of the site and rankings aren't endangered (a rough sketch of the overlay pattern follows below).
I hope this helps,
Hannah
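A minimal sketch of that overlay pattern in TypeScript, assuming a hypothetical /api/geo endpoint and an example .co.uk mapping rather than Amazon's actual implementation:

```typescript
// Hypothetical country-suggestion overlay: never redirects, only suggests.
// Assumes a server endpoint /api/geo returning { country: "GB" } - swap in your own geo lookup.
const LOCAL_SITES: Record<string, { url: string; label: string }> = {
  GB: { url: "https://www.example.co.uk/", label: "Visit our UK site" },
};

async function suggestLocalSite(): Promise<void> {
  try {
    const res = await fetch("/api/geo");
    const { country } = (await res.json()) as { country: string };
    const suggestion = LOCAL_SITES[country];
    if (!suggestion) return; // no local variant for this country: do nothing

    // Build a dismissible banner instead of redirecting, so crawlers and
    // users who ignore it still see the current version of the site.
    const banner = document.createElement("div");
    banner.className = "geo-suggestion";
    banner.innerHTML =
      `<a href="${suggestion.url}">${suggestion.label}</a> ` +
      `<button type="button">Stay here</button>`;
    banner.querySelector("button")?.addEventListener("click", () => banner.remove());
    document.body.prepend(banner);
  } catch {
    // If the geo lookup fails, show the page as-is.
  }
}

suggestLocalSite();
```

Because nothing is redirected, a crawler fetching from a US IP still receives the same .co.uk HTML as everyone else.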
-
Keith, I am having the same issue and I agree with you. The fact that Google has data centers in Europe does not necessarily mean it is crawling from there. I also want to set up Europe and US GeoIP redirection. It would be great to get Mozzers' opinions on this. Hopefully this post gets freshly reviewed.
-
Interesting question - I'd quite like to know what happens here too.
Matt Cutts recently posted a video on cloaking (http://youtu.be/QHtnfOgp65Q) saying that as long as you don't do anything 'special' for Googlebot, you're OK. But presumably, if you are redirecting IPs based on location and you don't want to prevent Googlebot from accessing your site, then effectively you have to do something 'special' for Googlebot (i.e. you're doing one thing for everyone else and a different thing for Googlebot).
-
Hi,
The data center locations are interesting, but that isn't what I was looking for. I need to know whether Google crawls the web from any IPs other than US IPs.
To clear up the second question, let me be more specific:
Let's say Google is crawling a .co.uk site from a US IP address. The site is using GeoIP redirection to redirect all US traffic to the .com site. Therefore, when Google attempts to crawl the .co.uk site from a US IP address, it will be redirected to the .com site and never see the .co.uk site (a sketch of this redirect logic follows below). Can anyone confirm that this is what happens?
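For illustration, here is a minimal sketch of the kind of hard GeoIP redirect being described, written as a TypeScript/Express handler; the lookupCountry stub and hostnames are placeholders, not any specific site's code:

```typescript
import express from "express";

// Hypothetical GeoIP lookup - stubbed here; in practice this would use a GeoIP database or service.
function lookupCountry(ip: string): string {
  return ip.startsWith("203.") ? "GB" : "US"; // placeholder logic for the sketch
}

const app = express();

// Hard GeoIP redirect on the .co.uk site: every US visitor is sent to the .com.
// A crawler fetching from a US IP hits this branch too, so it never sees the .co.uk pages.
app.use((req, res, next) => {
  const country = lookupCountry(req.ip ?? "");
  if (country === "US") {
    return res.redirect(301, `https://www.example.com${req.originalUrl}`);
  }
  next(); // non-US visitors (and crawlers on non-US IPs) see the .co.uk content
});

app.listen(3000);
```

Since the handler only checks the IP and never the user-agent, it isn't 'special' treatment for Googlebot - but it does mean a crawler on a US IP only ever reaches the .com.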
-
I found an article from 2008 that shows Google data centre locations: http://bit.ly/mONhf9
Your other question is a bit confusing. Why wouldn't Google see the UK site?
Related Questions
-
Advice on the right way to block country-specific users without blocking Googlebot - and without being seen to be cloaking. Help please!
Hi, I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in countries where the games and content are legally allowed.
Example: the games are not allowed in the USA, but they are allowed in Canada.
Present situation: when a user from the USA visits the site, they are directed to a restricted-location page with the following message: "RESTRICTED LOCATION - Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!" Because USA visitors are blocked, Google - which primarily (but not always) crawls from the USA - is also blocked, so the company's webpages are not being crawled and indexed.
Objective / what we want to achieve: the website will have multiple region and language versions. Some of these will exist as standalone websites and others as folders on the domain. Examples:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]
If a user from the USA or another restricted location tries to access our site, they should not have access but should see a restricted-access message. However, we still want Google to be able to access, crawl and index our pages. Can anyone suggest how we do this without getting done for cloaking etc.? Would this approach be OK? (Please see below.)
We continue doing what we do in the present situation, showing visitors from the USA a restricted message. However, rather than redirecting these visitors to a restricted-location page, we just black out the page and show them a floating message as if it were a modal window, while Googlebot would be allowed to visit and crawl the website. I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and it's a restricted paid page (a sketch of that markup follows below). All public pages are accessible, but only if the visitor is from a location that is not restricted. Any feedback and direction would be greatly appreciated, as I am new to this angle of SEO. Sincere thanks.
International SEO | MarkCanning
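A minimal sketch of the paywalled-content structured data mentioned above, generated in TypeScript; the .restricted-content selector, page type and headline are assumptions for illustration:

```typescript
// Sketch: emit schema.org markup flagging part of the page as not freely accessible,
// following the paywalled-content pattern (isAccessibleForFree + hasPart/cssSelector).
// The selector and headline below are placeholders, not a specific site's values.
const restrictedPageSchema = {
  "@context": "https://schema.org",
  "@type": "WebPage",
  "headline": "Casino games - Canada",
  "isAccessibleForFree": false,
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": false,
    "cssSelector": ".restricted-content", // the blacked-out / modal-covered section
  },
};

// Serialize into the <script type="application/ld+json"> block each page would include.
function renderJsonLd(schema: object): string {
  return `<script type="application/ld+json">${JSON.stringify(schema)}</script>`;
}

console.log(renderJsonLd(restrictedPageSchema));
```

This only shows the shape of the markup the poster describes; whether it is appropriate for geo-restricted pages is exactly the question being asked.
-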
Worldwide and Europe hreflang implementation.
Hi Moz! We're having quite a discussion here and I'd like some input. Let me explain the situation and what we plan to do so far. One of our clients has two separate markets: World and Europe. Both page versions will be mostly the same, except that they will have their own products. So basically, we'd want to show only the European EN version to Europe and the standard EN version to the rest of the world; the same goes for FR and ES. As for IT, DE, CS and SK, they will only be present within the European version. Since we cannot target all of Europe with a single hreflang tag, we might have to do it for every single European country. Regarding this subject, SMX Munich recently had quite an interesting session on the topic, with confirmation from John Mueller that we can target a single URL more than once with different hreflang tags. You can read more here: http://www.rebelytics.com/multiple-hreflang-tags-one-url/
So with all this in mind, here's the implementation we plan to do (an illustrative sketch of the tags follows below):
www.example.com/en/ - self canonical
www.example.com/fr/ - hreflang = fr
www.example.com/es/ - hreflang = es
www.example.eu/it/ - hreflang = it
www.example.eu/de/ - hreflang = de
www.example.eu/cs/ - hreflang = cs
www.example.eu/sk/ - hreflang = sk
www.example.eu/fr/ - hreflang = be-fr
www.example.eu/fr/ - hreflang = ch-fr
www.example.eu/fr/ - hreflang = cz-fr
www.example.eu/fr/ - hreflang = de-fr
www.example.eu/fr/ - hreflang = es-fr
www.example.eu/fr/ - hreflang = fr-fr
www.example.eu/fr/ - hreflang = uk-fr
www.example.eu/fr/ - hreflang = gr-fr
www.example.eu/fr/ - hreflang = hr-fr
etc. This will be done for all European countries (FR, EN and ES).
www.example.com/en/ - x-default
Let me know what you guys think. Thanks!
International SEO | Netleaf.ca
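To visualise how those annotations might be emitted, here is a small TypeScript sketch that renders hreflang link elements for one page; note it uses the language-region order (e.g. fr-be) that hreflang values expect, and the URL mapping is only an assumption based on the plan above:

```typescript
// Sketch: render hreflang <link> tags for one page. hreflang values are the
// language code first, then the region (e.g. "fr-be"), plus an x-default fallback.
// The URLs mirror the proposed .com / .eu split and are illustrative only.
const alternates: Record<string, string> = {
  "en": "https://www.example.com/en/",
  "fr": "https://www.example.com/fr/",
  "es": "https://www.example.com/es/",
  "it": "https://www.example.eu/it/",
  "de": "https://www.example.eu/de/",
  "fr-be": "https://www.example.eu/fr/",
  "fr-ch": "https://www.example.eu/fr/",
  "fr-fr": "https://www.example.eu/fr/",
  "x-default": "https://www.example.com/en/",
};

function renderHreflangLinks(map: Record<string, string>): string {
  return Object.entries(map)
    .map(([lang, url]) => `<link rel="alternate" hreflang="${lang}" href="${url}" />`)
    .join("\n");
}

console.log(renderHreflangLinks(alternates));
```

Each key is a complete hreflang value on its own, so the same URL can legitimately appear under several keys - the point John Mueller confirmed.
-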
Targeting Countries in the Middle East
Hi guys, I have a client based in the Middle East using a generic top-level domain (.com), and they want to target multiple countries in the GCC (UAE, Saudi Arabia, Kuwait, Qatar, etc.). I'm thinking that using the hreflang tag would be the best solution here; however, the pages will mostly have the exact same content. There will only be slight changes on some pages in terms of using localised title tags - [client service] followed by [targeted country] - H1s and meta descriptions (a rough sketch follows below). Is this the correct approach? And if so, should it be implemented site-wide, or can it be implemented on selected pages only? The site will be in English only.
International SEO | Jbeetle
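As a rough illustration of that setup, here is a TypeScript sketch that derives a localised title and an hreflang annotation for each GCC country; the /ae-style paths and the title pattern are assumptions, not the client's actual structure:

```typescript
// Sketch: one English page per GCC country, differing only in localised metadata.
// Paths and the title pattern are placeholders for illustration.
const gccCountries = [
  { code: "ae", name: "UAE" },
  { code: "sa", name: "Saudi Arabia" },
  { code: "kw", name: "Kuwait" },
  { code: "qa", name: "Qatar" },
];

const service = "Logistics Services"; // placeholder for [client service]

for (const { code, name } of gccCountries) {
  const url = `https://www.example.com/${code}/`;
  const title = `${service} in ${name}`; // [client service] followed by [targeted country]
  console.log(`<title>${title}</title>`);
  console.log(`<link rel="alternate" hreflang="en-${code}" href="${url}" />`);
}

// Fallback for users outside the targeted countries.
console.log(`<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />`);
```

Each country page keeps the same English body; only the metadata and the hreflang value change.
-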
Problems with the Google cache version of different domains.
We have problems with the Google cache version of different domains: for the ".nl" domain we have a ".be" cache. Enter "cache:www.dmlights.nl" in your browser to see this result.
The following points have already been addressed:
- The sitemap contains hreflang tags
- The sitemap has been moved to www.dmlights.nl/sitemap.xml
- We checked the DNS configuration
- Changed the Content-Language response header to: Content-Language: nl-NL (a sketch of this setup follows below)
- Removed the cache with Webmaster Tools
- Resolved server request errors
Can anyone provide a solution to fix this problem? Thanks, Pieter
International SEO | Humix
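For reference, a minimal TypeScript/Express sketch of setting that Content-Language header on every response; the framework choice and page content are assumptions, not how dmlights.nl is actually built:

```typescript
import express from "express";

const app = express();

// Declare the content language of every response from the .nl domain as Dutch (Netherlands),
// mirroring the "Content-Language: nl-NL" change described above.
app.use((_req, res, next) => {
  res.setHeader("Content-Language", "nl-NL");
  next();
});

app.get("/", (_req, res) => {
  res.send('<html lang="nl"><body>dmlights.nl voorbeeldpagina</body></html>');
});

app.listen(3000);
```
-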
How To Rank A UK Website On Google.com (US)
Hi, I've done some research on this but couldn't find any definitive answer I can trust! We have a client who resides in the UK. They have a '.com' domain, hosted on a UK server, using UK spelling. Their business objective for this year is to expand in the USA, including opening a warehouse over there. They want us to rank their website on both Google.co.uk and Google.com (North America); besides changing the geolocation settings in GWT and building links from .com websites, is there anything else we can do to increase their visibility on Google.com? Many thanks in advance, appreciated!
Lee.
International SEO | Webpresence
-
Blocking domestic Googles in robots.txt
Hey, I want to block Google.co.uk from crawling a site but want Google.de to crawl it. I know how to configure the robots.txt to block Google and other engines - is there a fix to block certain domestic crawlers? Any ideas? Thanks, B
International SEO | Bush_JSM
-
Poor Google.co.uk ranking for a UK-based .net, but great Google.com
I run an extremely popular news & community website at http://www.onedirection.net, but we're having a few ranking issues in Google.co.uk. The site gets most of its traffic from the USA, which isn't a bad thing - but for our key term "one direction", we currently don't rank at all on Google.co.uk. The site is located on a server based in Manchester, UK, and we used to rank very well earlier this year - fluttering about in position 5-7 most of the time. However, around July we started to fall down to page 2 or 3, and at the start of this month we stopped ranking at all for "one direction" on Google.co.uk. On Google.com, however, we're very strong, always on page one. We're definitely indexed on .co.uk, just not for our main search term - which I find a bit frustrating.
All the content on our site is unique, and we write 2-4 stories every day. We have an active forum too, so a lot of our content is user-generated. We've never had any "unnatural link building" messages in Webmaster Tools, and our link profile looks fine to me. Do we just need more .co.uk links, or are we being penalised for something? (I can't imagine what, though.) It certainly seems that way. Another site, www.onedirection.co.uk, which is never updated and has a blatant ad for something completely unrelated on its homepage, ranks above us at the moment - which I find quite frankly appalling, as our site is pretty much regarded as the world's most popular One Direction news and fan site.
We've spent the last few months improving the page-load times of our site, and we've reduced any unnecessary internal linking. Approximately two months ago we launched a new forum on the site, 301'ing all the old forum links to the new one, so that could have had an impact on rankings - but we'd expect to see an impact on Google.com as well if this were an issue. We definitely feel that we should be ranking higher on Google.co.uk. Does anyone have any ideas what the problems could be? Cheers, Chris.
International SEO | PixelKicks
-
Google Webmaster Tools - International SEO: Geo-Targeting a Site with Worldwide Rankings
I have a client who already has rankings in the US & internationally. The site is broken down like this:
url.com (main site with USA & international rankings)
url.com/de
url.com/de-english
url.com/ng
url.com/au
url.com/ch
url.com/ch-french
url.com/etc
Each folder has its own sitemap & relevant content for its respective country. I am reading in Google Webmaster Tools > Site config > Settings the option under 'Learn More': "If you don't want your site associated with any location, select Unlisted." If I want to keep my client's international rankings the way they currently are on url.com, should I NOT geo-target the United States? So I select Unlisted, right? And would I use geo-targeting on url.com/de, url.com/de-english, url.com/ng, url.com/au and so on?
International SEO | Francisco_Meza