Block access to site from everywhere but North America
-
I have a site that is being attacked very hard by bots, malware, etc. Most of it seems to be originating from Asia and Eastern Europe, so I want to block off access to the site to everybody but people in North America. We do not ship out of the country anyway, so it really does not need to be seen by people around the world.
How can I set this up?
-
I do not see that function in the free version of Cloudflare. I am adding the "challenge" rule to hopefully cut back. My client does not have the money for the paid plans.
My next step is to go disavow those links.
-
That is exactly the problem we had. Cloudflare helped with that, as did blocking the IPs with our old web host.
Also, make sure that you disavow the links from those domains in webmaster tools as well.
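For reference, the disavow file Google accepts is plain text with one entry per line; a `domain:` prefix covers the whole domain, and lines starting with `#` are comments. The domains below are placeholders, not real spam sites:

```text
# Spam domains copying our site -- replace with your actual list
domain:spam-copy-site1.example
domain:spam-copy-site2.example
# Individual URLs can also be listed directly:
http://spam-copy-site3.example/fake-page.html
```

Upload the file through the Disavow Links tool in Search Console (formerly Webmaster Tools).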
Good luck
Ken
-
It is people who are duplicating our website, hacking other servers, then uploading our modified, malware-filled site to those servers.
I do not care about fake referral traffic. I just need to totally 100% block traffic from Russia. I do not want them to even see the site.
-
Hi-
We had a similar situation, which got even worse when someone initiated a DDoS attack on our site from out of the country. Since then we have used cloudflare.com and things have been a lot better.
Good Luck
Ken
-
Then you can follow the last part of my previous answer. Which spammers are you seeing? If it's ghost spam like 4webmasters or free-social-buttons, then the only way to stop them is with filters in Google Analytics.
-
It is drastic, but it's needed. We do not ship anywhere outside of the continent, so there really is no need for traffic from Russia.
I want an easy solution that I can put in the robots.txt file or something.
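Worth noting: robots.txt can't do this. It is only a set of suggestions for well-behaved crawlers and has no concept of countries, so blocking has to happen at the server or CDN level. A minimal sketch for Apache, assuming the legacy mod_geoip module and its country database are installed (directive names differ if you use mod_maxminddb/GeoIP2):

```apache
# Requires mod_geoip with a GeoIP country database configured.
GeoIPEnable On

# Tag requests from Russia (RU); add more SetEnvIf lines per country.
SetEnvIf GEOIP_COUNTRY_CODE RU BlockCountry

# Apache 2.4 syntax: deny tagged requests, allow everything else.
<RequireAll>
    Require all granted
    Require not env BlockCountry
</RequireAll>
```

Keep in mind GeoIP lookups are approximate, and determined attackers can route around them with proxies, so treat this as a noise reducer rather than a guarantee.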
-
Hi Noah,
Excluding the whole world would be a drastic solution. I know the spam is a bad issue, but you could be missing real traffic even if you are a local website, and even if you don't, you will still get spam coming from the USA, and there is quite a lot of it. I recommend you try another solution: a filter based on your hostnames.
This solution requires a little more time to set up, but it has 3 huge advantages, and you won't have to exclude the whole world except the USA.
- You will stop the spam before it hits you. Adding a filter for a referrer after you see it will stop it, but by the time you apply it you will already have received hits from that spam.
- You will need only ONE filter to stop all ghost spam, instead of creating various sets of filters.
- Lately, some of the spammers (e.g. free-social-buttons) have been hitting GA accounts with fake direct visits along with the referral, and a filter for the referral won't stop the direct visits. The valid hostname filter, on the other hand, will stop ALL ghost spam in any form, whether it shows as a referral, keyword, or direct visit.
This is what I've been using on my accounts for the last months, and I haven't received a single hit of ghost spam. You can find more information on how this filter works and a detailed guide to set it up in this article.
http://www.ohow.co/what-is-referrer-spam-how-stop-it-guide/
If you are not convinced and still want to exclude all other countries, you can follow the same guide for the valid hostname filter in the article and just change the filter field to Country and put United States in the filter pattern.
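To illustrate why this works, here is a quick sketch of the include-filter logic in Python's `re` module. The pattern uses `example.com` as a placeholder for your real hostnames; in GA you would paste the same pattern into a custom "Include only" filter on the Hostname field:

```python
import re

# Placeholder pattern -- replace with your own valid hostnames
# (your domain, plus any legitimate services that render your pages,
# e.g. Google Translate or a payment gateway).
VALID_HOSTNAMES = r"example\.com|translate\.googleusercontent\.com"

def keep_hit(hostname):
    """Mimic a GA 'Include only' filter on Hostname: keep the hit
    only when the reported hostname matches the valid pattern."""
    return re.search(VALID_HOSTNAMES, hostname) is not None

# Real traffic reports your own hostname; ghost spam reports a fake
# or empty one, so it never matches and gets filtered out.
print(keep_hit("www.example.com"))  # real visit -> True (kept)
print(keep_hit("4webmasters.org"))  # ghost spam -> False (dropped)
print(keep_hit(""))                 # (not set)  -> False (dropped)
```

Because ghost spam never actually visits your server, the hostname it reports is fake (or empty), so a single include filter on valid hostnames catches all of it regardless of which referrer it fakes next.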
Hope it helps,