Site under attack from Android SEO bots - expert help needed
-
For the last 25 days, we have been facing a weird attack on our site.
We are getting 10x our normal mobile traffic - all from Android devices, all searching specifically for our name. We are sure this is not authentic traffic, as it arrives via organic search and immediately bounces. Initially we thought this was a DDoS attack, but that does not seem to be the case.
It looks like someone is trying to damage our Google reputation by performing large numbers of searches and bouncing off.
Has anyone else faced a similar issue before? What can be done to mitigate the impact on the site?
(FYI - we get ~2M visits month on month, 80% from Google organic search.) Any help would be highly appreciated.
-
Just as EGOL describes it.
If you're on Amazon AWS, you can use their CloudFront as a CDN. You should also examine the source of the traffic: it could be coming from one country, one IP range, or one user-agent. There is usually some kind of pattern, and you should investigate it.
Then you just need to make a rule to block that traffic, or redirect it to a static "hello world" page.
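A quick way to surface that pattern from your raw access logs - a minimal sketch that counts requests per /24 subnet and per user-agent. The sample lines are made up (combined log format assumed); point it at your real log file instead:

```python
from collections import Counter

# Hypothetical combined-log-format lines; in practice, read these from your access log.
sample_lines = [
    '203.0.113.7 - - [01/Jan/2024:10:00:01] "GET / HTTP/1.1" 200 "-" "Mozilla/5.0 (Linux; Android 9)"',
    '203.0.113.42 - - [01/Jan/2024:10:00:02] "GET / HTTP/1.1" 200 "-" "Mozilla/5.0 (Linux; Android 9)"',
    '198.51.100.5 - - [01/Jan/2024:10:00:03] "GET /about HTTP/1.1" 200 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

def top_patterns(lines):
    """Count requests per /24 subnet and per user-agent string."""
    subnets, agents = Counter(), Counter()
    for line in lines:
        ip = line.split(" ", 1)[0]
        subnets[".".join(ip.split(".")[:3]) + ".0/24"] += 1
        # The user-agent is the last quoted field in combined log format.
        agents[line.rsplit('"', 2)[-2]] += 1
    return subnets.most_common(3), agents.most_common(3)

subnets, agents = top_patterns(sample_lines)
print(subnets)  # a /24 or user-agent that dominates is a candidate for a block rule
print(agents)
```

If one subnet or one user-agent accounts for most of the spike, that is the pattern to feed into your blocking rule.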
I was also a victim of traffic like this, but mine came from humans trying to deplete an AdWords daily budget. Once the budget was spent, the ads stopped showing; after a few hours the clicks were recalculated, some funds were returned, the ads showed again, they clicked them, the budget ran out... and so on.
-
By pointing your DNS at CF, your server is no longer hit directly. All traffic is routed through one of CF's data centers, and there are over 100 of them distributed throughout the world.
Also, in the CF settings, you want to "challenge" the visitors from problem countries. This gives them a captcha to complete. Once they complete that captcha one time, you can give them long-term access without the challenge. CF will progressively get better at filtering the bots and letting more trusted visitors in without a challenge.
-
Thanks for your help - this works to a large degree.
We have hit a new challenge though: our AWS servers are in one of the countries sending this traffic, and we have multiple servers talking to each other to enable login and other actions on the site.
While I have blocked all the other countries, blocking the country with our AWS servers breaks login. Trying to figure this out!
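(A sketch of the precedence logic that resolves this: allowlist your own server IPs before any country rule fires. In Cloudflare terms that is an IP Access Rule set to Allow, which should take precedence over a country challenge. The IP ranges and country codes below are made up.)

```python
import ipaddress

# Hypothetical values: replace with your own AWS server ranges and blocked countries.
TRUSTED_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/16"),       # internal VPC range
    ipaddress.ip_network("198.51.100.0/24"),   # your AWS elastic IPs
]
BLOCKED_COUNTRIES = {"XX", "YY"}

def allow_request(ip: str, country: str) -> bool:
    """Allowlist rules run first, so server-to-server calls survive the country block."""
    addr = ipaddress.ip_address(ip)
    if any(addr in net for net in TRUSTED_NETWORKS):
        return True                      # own servers always get through
    return country not in BLOCKED_COUNTRIES

print(allow_request("198.51.100.17", "XX"))  # trusted server inside a blocked country
print(allow_request("203.0.113.8", "XX"))    # untrusted visitor from a blocked country
```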
-
If you don't use the Firewall, Cloudflare in your situation will have almost no effect.
We used our analytics to determine the countries the traffic was coming from, then went into the CF Firewall.
Click the blue Help link for each tool to decide upon the settings that you want to try.
Here is what we used:
Security Level... Medium
Challenge Passage... one day
Access Rules... country name, challenge, this website
Impact of the above: many bots already recognized by CF will be blocked, and the access rules will present each visitor from those countries with a captcha-like form. They must pass the captcha to get in.
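If you would rather script those access rules than click through the dashboard, Cloudflare's v4 API takes one rule per country. A minimal sketch that only builds the request bodies (the zone ID, token, and country codes are placeholders; verify the exact endpoint against Cloudflare's API documentation before relying on it):

```python
import json

# Sketch only: builds the JSON bodies for Cloudflare's v4 API
# (POST /zones/{zone_id}/firewall/access_rules/rules). Zone ID, token,
# and country codes are placeholders, not real values.
API_URL = "https://api.cloudflare.com/client/v4/zones/{zone_id}/firewall/access_rules/rules"

def challenge_rule(country_code: str) -> dict:
    """One access rule: challenge (captcha) every visitor from this country."""
    return {
        "mode": "challenge",
        "configuration": {"target": "country", "value": country_code},
        "notes": "Suspected bot traffic - challenge before entry",
    }

payloads = [challenge_rule(cc) for cc in ("XX", "YY")]
for p in payloads:
    print(json.dumps(p))
# Each payload would be POSTed with an "Authorization: Bearer <api_token>" header.
```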
After you turn this on, watch your short term stats. You should see an increase in blocking.
We ran the above for a few weeks without any obvious SEO impact. Then we switched our DNS back to normal, moving away from CF... but kept the $20/month account and our settings in place, since CF was time-consuming to set up.
-
This looks very similar to what we are seeing. We went with CloudFlare as well, but stayed on the free account with "Under Attack" mode, which should force visits to verify.
Would it be possible for you to share your CloudFlare settings? Did you use their Firewall as well? And did you see any SEO impact, by any chance?
-
One morning a few months ago, we saw lots of mobile phone traffic building. All of it was hitting our homepage, which is very resource-intensive. Each visit generated a single page view. All of the traffic was coming from a few countries in Asia and Africa, with no referrer. It looked like a DDoS attack.
We went to Cloudflare, got a $20/month account, switched our DNS to CF, and forced untrusted visits from those countries to verify before allowing entry. This squeezed the traffic down to almost nothing within a few hours. We left CF running for a few weeks and the rogue traffic disappeared.
Now we have CF ready to go with all settings in place. We can turn it on in two minutes and have the shield in place as DNS propagates.