Negative SEO Click Bot Lowering My CTR?
-
I am questioning whether one of our competitors is using a click bot to do negative SEO on our CTR for our industry's main term.
Is there any way to detect this activity?
Background:
We've previously been hit by DoS attacks from this competitor, so I'm sure their ethics/morals wouldn't prevent them from doing negative SEO.
We sell an insurance product that is only offered through broker networks (insurance agents), not directly by the insurance carriers themselves. However, our suspect competitor (another agency) and insurance carriers are the only ones who rank on the 1st page for our biggest term. I wouldn't expect the carrier sites to do very well, since they don't even sell the product directly (they only have informational pages).
Our site and one other agency site pop onto the bottom of page one periodically, only to be bumped back to page 2. My theory is that they are using a click bot that continuously bounces us out of page 1: we do well relative to the other pages on page 2 and naturally earn our way back to page 1, only to be pushed back to page 2 by the negative click SEO.
Is there anything I can do to research whether my theory is right or if I'm just being paranoid?
-
Thanks! I figured as much, but I wanted to hear it from another guru rather than assuming on my own.
I appreciate it
-
Google does not use Analytics data in a page's ranking (see here: https://www.youtube.com/watch?v=CgBw9tbAQhU). And since the referral spam never actually visits your site, but only fires your tracking code, it shouldn't hurt you.
Besides, I don't think I've seen a site out there that hasn't been hit by referral spam. Some haven't been hit as hard, but still see it to some degree. Just set up some filters & segments so you can analyse the real data. I found it very helpful to read this and this.
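Beyond analytics filters, a quick sanity check is to scan your raw server logs: "ghost" referral spam only fires the tracking code and never appears in server logs, while crawler spam does. Here is a minimal Python sketch of flagging log lines by referrer; the spam-domain list is illustrative only, so build yours from what actually shows up in your own referral reports.

```python
import re

# Hypothetical spam-domain list; replace with the domains you actually
# see in your analytics referral reports.
SPAM_REFERRERS = {"semalt.com", "buttons-for-website.com", "ilovevitaly.com"}

def referrer_host(log_line):
    """Pull the referrer host out of an Apache/Nginx combined-format line."""
    m = re.search(r'"[^"]*" \d{3} \S+ "(?:https?://)?([^/"]+)', log_line)
    return m.group(1).lower() if m else ""

def is_spam_hit(log_line):
    """True if the referrer matches (or is a subdomain of) a spam domain."""
    host = referrer_host(log_line)
    return any(host == d or host.endswith("." + d) for d in SPAM_REFERRERS)
```

If a spam domain shows up in analytics but `is_spam_hit` never fires in your logs, it's ghost spam and a simple analytics filter is enough; if it does fire, the crawler is really hitting your server.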
-
Thanks! I agree with everything you said. We'll be sure to remove and disavow any negative links.
As you can see from the images I posted above, we are getting quite a bit of referral spam that we need to address.
Any idea whether the referral spam can negatively impact Google's measurement of user experience / dwell time? I don't want to simply block it in analytics if it is reflecting poorly on our site.
-
Hi there
From an organic standpoint, I am really not sure - maybe someone can take a look for you there.
You could (and really should) check your referral spam (here's another resource) - from there you can remove and disavow possible links that are hurting your profile.
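For the remove-and-disavow step, the disavow file itself is just a plain-text file of `domain:` lines plus `#` comments, per Google's documented format. A minimal sketch of generating one from an audited domain list (the domain names here are hypothetical):

```python
# Minimal sketch: build a Google disavow file from a list of spammy
# referring domains. "domain:" lines and "#" comments are Google's
# documented disavow-file syntax; the input list is hypothetical.
def build_disavow(domains, note="Spam links identified in link audit"):
    lines = ["# " + note]
    lines += ["domain:" + d.strip().lower() for d in sorted(set(domains))]
    return "\n".join(lines) + "\n"
```

Only disavow after you've tried removal, and keep the comment lines as your own audit trail, since Google ignores them.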
But again, from an organic standpoint, I am not entirely sure, and even then, I am not sure how you can prove it's malicious without hard proof.
I would focus on the links above and see what moves the needle for you. That's really the most you can do from my vantage point at the moment.
Hope this helps! Good luck!
-
Great links Patrick!
These are all things we're constantly working on. The info on dwell time was of particular interest. Thanks!
We might be wearing tin foil hats a bit, but the past actions of the competition have unfortunately led us down that path of thinking.
Is there any way for us to try to confirm whether click bots are being used to artificially boost the rankings of sites for a particular keyword? I can't think of any way to detect that activity.
-
Here are the sites it appears to be coming from.
-
Great tip Tim! You were right on the money. Thank you so much
We did find Russian bots with an awful user experience hitting our site starting in April. My webmaster is looking into how to handle this. Any suggestions to get him started?
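As a starting point for the webmaster: before writing firewall rules, a crude per-IP request count over the access log will surface the heaviest hitters worth investigating. A rough sketch (the threshold is an arbitrary assumption, not a rule, and flagged IPs should be checked by hand before blocking):

```python
from collections import Counter
import re

def top_hitters(log_lines, threshold=100):
    """Count requests per client IP (first field of a combined-format
    log line) and return IPs at or above the threshold -- a crude first
    pass at spotting bot traffic to review before blocking at the
    server or firewall level."""
    ips = Counter()
    for line in log_lines:
        m = re.match(r"(\S+)\s", line)
        if m:
            ips[m.group(1)] += 1
    return {ip: n for ip, n in ips.items() if n >= threshold}
```

From there the usual options are server-level denies for confirmed bad IPs or user agents, or a web application firewall if the bots rotate addresses.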
-
I don't believe they are clicking on my site and providing Google data that I have a bad user experience.
Instead, I think they may be providing an artificially good experience for their site and others that don't directly sell our product. As a result, my user experience is lowered relative to the others.
Once again, it's just a theory based on their prior behavior and what is actually showing in the SERPs, but something I am concerned about. If I'm right, I don't believe I'll be able to detect it, nor do anything about it, short of having my own click bot, which I won't do.
-
I know this may not be the case, but have you checked your site analytics to see if it is being hit by a number of these Russian bot / crawler style entities that have been doing the rounds of late? If so, this could be another reason you are seeing bounce numbers grow.
I found I had a few, and over the last few weeks I have made an effort to remove a lot of this negative referral traffic and these crawlers.
-
Hi LSlversen
I like your thinking here - to me it sounds like what your users are searching for and what they are getting aren't meeting expectations. I would look into that and make sure you are on the right path with the kind of content / expectations you are setting.
Here's a bit of reference to what LSlversen is chatting about. Check it out. I would take a step back, ask some hard questions, and prioritize how you want to attack these issues. You'll be better off - I highly doubt click bots are involved.
Hope this helps a bit more!
-
I guess that's the problem... there are thousands of reasons why the rankings are what they are, and no way to determine if a click bot is being used. Is there?
Even if one is being used, I suppose the only thing I could do, is use one myself to counteract it, which I don't like the sound of.
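You can't prove a click bot from the outside, but you can at least watch for the statistical fingerprint the theory predicts: sudden CTR dips for the keyword while impressions hold steady. A rough sketch over a daily Search Console export (the window length and z-score cutoff are arbitrary assumptions to tune against your own data):

```python
def ctr_anomalies(daily_ctr, window=14, z_cut=2.5):
    """Flag days whose CTR falls far below the trailing-window mean.
    daily_ctr: list of (date_string, ctr_float) pairs, e.g. from a
    Search Console performance export for one query. Returns dates that
    dip more than z_cut standard deviations below the trailing mean --
    a rough way to spot the sudden CTR drops a click bot might cause."""
    flagged = []
    for i in range(window, len(daily_ctr)):
        past = [c for _, c in daily_ctr[i - window:i]]
        mean = sum(past) / window
        std = (sum((c - mean) ** 2 for c in past) / window) ** 0.5
        date, ctr = daily_ctr[i]
        if std > 0 and (mean - ctr) / std > z_cut:
            flagged.append(date)
    return flagged
```

A flagged day isn't proof of anything by itself (seasonality and SERP layout changes cause dips too), but a recurring pattern that lines up with the page 1 / page 2 bouncing would at least support the theory.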
-
Without knowing all the details of the case, maybe it has something to do with searcher intent? Seeing as it's only websites that don't sell the product, maybe that's exactly what the searchers are looking for?
If people who search it are clicking and spending a long time on the top results that are full of info, then it might signal to Google that they found a good result and more of that kind (information pages) will come up higher.
Again, I don't know if this is the case, but it might be a possible explanation for it.
-
Thanks for the reply Patrick
If we had 100% proof, we would have reported it to Google and the FBI. Unfortunately, like most DoS attacks, it could not be traced back to anyone.
We rank well for every other industry term. The fact that sites that don't even directly sell the product rank so well makes me question whether something weird is going on here. My dream scenario (as is theirs, I'm sure) would be to be on the first page as the lone website that actually sells the product directly. The other sites don't rank well for any other keyword in our industry... just the biggest-volume keyword.
I sent you a private message with our URL and the keyword.
-
Hi there
Can you provide a URL for your company that we can take a look into? Also, do you have proof (whether current or previous, as you stated) that this other company has been attacking your site? If so, you should be reporting that to Google. I wouldn't do this though unless you are ABSOLUTELY 100% SURE this company or SEO team is doing this, and you can PROVE it. If you can't, then don't.
If you could though, please provide your URL. As you said you could be being paranoid and it could be a number of things such as your:
On-site SEO
Content
Backlinks
Industry & keyword difficulty
etc. ...any of which could be hurting your rankings. It'd be easier to gauge with a URL.
Hope this helps! Good luck!