Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Blocking certain countries via IP address location
-
We are a US-based company that ships only to the US and Canada. Two issues arising recently from foreign countries (Russia, namely) led us to block access to our store for anyone attempting to interact with it from outside of the US and Canada.
1. The first issue was fraudulent orders originating from Russia (using stolen card data) and shipping to a US-based international shipping aggregator.
2. The second issue was a consistent flow of Russian-based "new customer" registrations.
My question to the Moz community is this: are there any unintended consequences, from an SEO perspective, to blocking the viewing of our store from certain countries?
-
Both answers above are correct and well put.
From a strategic point of view, formally blocking Russian IPs has no SEO effect in your case, because, as a business, you don't even need an SEO strategy for the Russian market.
-
Fully agree with Peter: IP blocking is very easy to bypass these days. There are some sophisticated systems that can still detect it, but they are mostly out of reach for us mere mortals!
If you block a particular country from crawling your website, it is pretty certain you will not rank in that country (which I guess isn't a problem anyway), but I suspect this would have only a very limited impact, if any, on your rankings in other countries.
We have had a similar issue; here are a couple of ideas.
1. When someone places an order, use a secondary method of validation.
2. For new customer entries/registrations, make sure you have a good captcha; most of this sort of thing tends to come from bots, and a captcha will often fix the problem.
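To make the first idea concrete, here is a minimal sketch of what a secondary order-validation pass might look like. The field names, the US/CA whitelist, and the risk rules are illustrative assumptions for this thread's scenario (stolen cards, shipping aggregators), not the API of any particular e-commerce platform:

```python
# Hypothetical secondary validation pass for incoming orders.
# Field names and rules are illustrative, not from any real platform.

def order_risk_flags(order: dict) -> list[str]:
    """Return a list of reasons an order should be held for manual review."""
    flags = []
    # The store only ships to the US and Canada, so a foreign billing
    # address is already suspicious.
    if order.get("billing_country") not in {"US", "CA"}:
        flags.append("billing country outside US/CA")
    # IP geolocation disagreeing with the billing address is a classic
    # stolen-card signal.
    if order.get("ip_country") != order.get("billing_country"):
        flags.append("IP country does not match billing country")
    # Shipping to a known package-forwarding / aggregator address was the
    # exact pattern described in the question.
    if order.get("ship_to_forwarder", False):
        flags.append("ships to a package-forwarding address")
    return flags

suspicious = {
    "billing_country": "RU",
    "ip_country": "RU",
    "ship_to_forwarder": True,
}
print(order_risk_flags(suspicious))
# → ['billing country outside US/CA', 'ships to a package-forwarding address']
```

Flagged orders would then go through whatever secondary check you prefer (a phone call, 3-D Secure, a hold period) rather than being auto-rejected.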
-
Blocking IPs based on geolocation can be dangerous. But you can use the MaxMind GeoIP database:
https://github.com/maxmind/geoip-api-php
Alternatively, you can implement a GeoIP check on "add to cart" or "new user" as an additional safeguard: when a user is outside the US/CA, require them to fill in a captcha, or just ignore their requests. From a bot's point of view, a visit with a US IP and a visit with a UK IP (for example) will see the same pages; the UK visitor just can't create a new user or add to cart. The HTML code will be 100% the same.
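A small sketch of that gating idea: every visitor gets the same HTML, but the sensitive actions check the visitor's country. Here `country_for_ip` and its tiny lookup table are stand-ins for a real GeoIP lookup (such as the MaxMind database linked above), and the handler name is hypothetical:

```python
# Sketch: identical pages for everyone, but "add to cart" / "new user"
# actions are gated by IP country. country_for_ip is a stand-in for a
# real GeoIP lookup (e.g. MaxMind); the table below is illustrative only.

ALLOWED_COUNTRIES = {"US", "CA"}

_FAKE_GEO_TABLE = {
    "198.51.100.7": "US",
    "203.0.113.9": "GB",
}

def country_for_ip(ip: str) -> str:
    """Stand-in for a real GeoIP lookup; returns an ISO country code."""
    return _FAKE_GEO_TABLE.get(ip, "??")

def handle_add_to_cart(ip: str, captcha_passed: bool = False) -> str:
    """Allow the action for US/CA IPs; require a captcha from everyone else."""
    if country_for_ip(ip) in ALLOWED_COUNTRIES:
        return "ok"
    return "ok" if captcha_passed else "captcha_required"

print(handle_add_to_cart("198.51.100.7"))  # US visitor → ok
print(handle_add_to_cart("203.0.113.9"))   # UK visitor → captcha_required
```

Because the page rendering itself never branches on country, crawlers from any location see byte-identical HTML; only the POST handlers behave differently.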
PS: I forgot... VPNs and proxies are cheap these days. I have a few EC2 instances set up just for my own needs; the bad guys can use them too, so think twice about how much "protection" this really offers. Note the quotes.
Related Questions
-
Two websites, one company, one physical address - how to make the best of it in terms of local visibility?
Hello! I have one company which will be operating in two markets: printing and website design/development. I'm planning on building two websites, one for each market. But I'm a bit confused about how to optimize these websites locally. My thought is to use my physical address for one website (build citations, get listed in directories, etc.) and a PO Box for the other. Do you think there is a better idea?
Technical SEO | Sep 12, 2020, 9:10 PM | VELV1 -
Do I need to block my cart page in robots.txt?
I have a site with woocommerce. Do I need to block the cart page?
Technical SEO | Mar 6, 2015, 5:36 PM | EcommerceSite0 -
Google indexing despite robots.txt block
Hi, this subdomain has about 4,000 URLs indexed in Google, although it's blocked via robots.txt: https://www.google.com/search?safe=off&q=site%3Awww1.swisscom.ch&oq=site%3Awww1.swisscom.ch This has been the case for almost a year now, and it does not look like Google tends to respect the blocking in http://www1.swisscom.ch/robots.txt Any clues why this is or what I could do to resolve it? Thanks!
Technical SEO | May 7, 2014, 2:14 PM | zeepartner0 -
How to block text on a page to be indexed?
I would like to block the spider from indexing a block of text inside a page; however, I do not want to block the whole page with, for example, a noindex tag. I have already tried a tag like this: "chocolate pudding" (the surrounding tag markup was not preserved in the archive). However, this is not working for my case, a travel-related website. Thanks in advance for your support. Best regards, Gianluca
Technical SEO | Feb 16, 2014, 11:25 PM | CharmingGuy0 -
How to do ip canonicalization ?
Hi, my website also opens via its IP address. I think that's duplicate content for Google. Only the home page opens via the IP, no other pages. How can I fix it? I might be able to do it using .htaccess, but I don't know the proper code for it. This website is on the WordPress platform. Thanks, Ramesh
Technical SEO | Jul 21, 2019, 3:30 PM | unibiz0 -
Should H1 tags include location?
I have an IT services company that is based out of Denver. In the past I always used Denver in the H1 tag, like this: "Denver IT Support & Managed Services" or "Denver Data Center Solutions". I know that H tags are not that important any more, but I still want to put them on each page. My question is, in a post-Panda world, do those look too spammy? Should I not include Denver on each page? I have about 25 service pages that I was going to do this for. Each page will be different because of the service, but I was going to include Denver on each page. On that same note, I normally put Denver in the title of each page. Should I rethink this also? Obviously I want to rank on Denver and the service. Any help on this would be great. Thanks
Technical SEO | Jul 31, 2012, 3:22 AM | ZiaTG0 -
Use webmaster tools "change of address" when doing rel=canonical
We are doing a "soft migration" of a website. (Actually, it is a merger of two websites.) We are doing cross-site rel=canonical tags instead of 301s for the first 60-90 days. These have been done on a page-by-page basis for an entire site. Google states that a "change of address" should be done in Webmaster Tools for a site migration with 301s. Should this also be done when we are doing this soft move?
Technical SEO | May 5, 2012, 11:42 PM | EugeneF0 -
How do I check if my IP is blocked?
We changed servers and where our sites once ranked very highly (page 1 for all sites), they now are nowhere to be seen. Someone suggested that our IP might be blocked. Someone else suggested SEOMoz was the place to go to get it checked. Any help would be GREATLY appreciated. With thanks. Bryan
Technical SEO | Jan 27, 2012, 2:58 AM | FortressLearning0