Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Help Blocking Crawlers. Huge Spike in "Direct Visits" with 96% Bounce Rate & Low Pages/Visit.
-
Hello,
I'm hoping one of you search geniuses can help me.
We have a successful client who started seeing a HUGE spike in direct visits as reported by Google Analytics. This traffic now represents approximately 70% of all website traffic. These "direct visits" have a bounce rate of 96%+ and only 1-2 pages/visit. This is skewing our analytics in a big way and rendering them pretty much useless. I suspect this is some sort of crawler activity but we have no access to the server log files to verify this or identify the culprit. The client's site is on a GoDaddy Managed WordPress hosting account.
The way I see it, there are a couple of possibilities.
1.) Our client's competitors are scraping the site on a regular basis to stay on top of site modifications, keyword emphasis, etc. It seems like whenever we make meaningful changes to the site, one of their competitors publishes a knock-off a few days later. Hmmm.
2.) Our client's competitors have this crawler hitting the site thousands of times a day to raise bounce rates and decrease the average time on site, which could well have a negative impact on SEO. Correct me if I'm wrong, but I don't believe Google is going to reward sites with 90% bounce rates, 1-2 pages/visit and an 18-second average time on site.
The bottom line is that we need to identify these bogus "direct visits" and find a way to block them. I've seen several WordPress plugins that claim to help with this but I certainly don't want to block valid crawlers, especially Google, from accessing the site.
If someone out there could please weigh in on this and help us resolve the issue, I'd really appreciate it. Heck, I'll even name my third-born after you.
Thanks for your help.
Eric
-
Hi SirMax,
Thanks for your input. I appreciate it. We'll add Wordfence to our WordPress toolbox and see if that addresses the issue.
In response to previous posts, thanks to everyone for your input. We were able to apply some filters to remove the bogus bot traffic from the analytics and normalize the data. However, this did not actually resolve the issue, and in my eyes it's more of a Band-Aid fix. The evil crawlers are still there; we just can't see them.
Thanks again for all of your input.
Eric
-
Hostname filtering no longer works reliably. Unfortunately, most spammers have adapted and are sending your own website's domain as the hostname.
For WordPress I use the Wordfence plugin (the paid version - I'm not affiliated with them in any shape or form beyond paying for their services). In the advanced blocking settings you can set limits on how fast and how many pages crawlers can request. You can also block by country or IP range. It can also show you live traffic with a lot of detail (a lot more than Google Analytics - more like a server log). It might not be the complete remedy, but it can help.
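For anyone curious what "limits on how fast and how many pages crawlers can request" means in practice, here's a minimal sketch of the underlying idea - a sliding-window rate limiter keyed by client IP. This is purely illustrative; it is not how Wordfence itself is implemented, and the limits (3 requests per 10 seconds) are made-up numbers:

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Allow at most max_requests per IP within a rolling time window."""

    def __init__(self, max_requests, window_seconds):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        # Discard hits that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_requests:
            return False  # too many requests in the window: block
        q.append(now)
        return True
```

A crawler hammering the site trips the limit quickly, while a human clicking through a few pages never does.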
-
I wish I had an answer for how to stop the bots from hitting your site at all - I don't think a good one exists, as any solutions that wouldn't also block real human traffic to your site are going to be easy for spam bots to get around. I think your best bet is just to do everything you can to keep your data as clean as possible.
-
Hi Ruth,
Thanks a bunch for taking the time to respond to my post. Great advice. This is reassuring on a number of levels, however, it doesn't address the underlying issue of how to stop these spam bots in the first place.
We've already started the process of filtering out some of this bogus data. We'll also be integrating some WordPress plugins to see if that helps. That said, if the spam bots are hitting Analytics directly, as opposed to the actual website, WP plugins won't do anything.
Anyway, I appreciate your input and advice. Thanks so much.
Eric
-
Hi Eric,
A few things to reassure you off the bat:
- For what it's worth, there is a huge, HUGE amount of crawler spam happening on the web today. Every site I work on is being hit hard with false referrals and direct visits. I know Google Analytics is working on a solution to better filter these visits out. So I wouldn't be too concerned that it is something a competitor is doing to your site specifically - it's more likely that it's been caught up in the general wave of spam crawlers.
- It's important to note that when we talk about Google looking at bounce rate and dwell time as part of ranking your site, those numbers come specifically from click-throughs from search - data Google can get without using your private web analytics, which they've said repeatedly they don't and won't use as a ranking factor. So a bunch of direct visits with high bounce rates will NOT affect your rankings.
So, it's not dangerous, just annoying. On to how to get that data out of your reports:
- Make sure you're not filtering out spam referrers at a View level - this can cause those visits to incorrectly appear as direct traffic.
- You could set up an Advanced Segment in Google Analytics to filter out direct visits with visit times of, say, under 5 seconds. Some real traffic may get caught in that, but it will get the noise levels down.
- The best way to filter out spam bot traffic, in my opinion, is to set up hostname filtering. Here's a post on Megalytic on how to do that: https://megalytic.com/blog/how-to-filter-out-fake-referrals-and-other-google-analytics-spam. Make sure you've also got an "Unfiltered Data" View so you'll still have historic raw data if you need it.
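To make the hostname-filtering idea concrete: a valid hit should report one of your own hostnames, while ghost spam typically reports either a fake domain or none at all. A rough sketch of the logic, using a hypothetical `example.com` allowlist (substitute the client's actual domains, including any subdomains or payment pages):

```python
import re

# Allowlist of hostnames that can legitimately serve your pages.
# "example.com" is a placeholder - use your real domain(s).
VALID_HOSTNAME = re.compile(r"^(www\.)?example\.com$", re.IGNORECASE)

def is_valid_hit(hostname):
    """True if the hit's reported hostname belongs to our site."""
    return bool(VALID_HOSTNAME.match(hostname))

hits = [
    {"hostname": "www.example.com", "page": "/"},
    {"hostname": "example.com", "page": "/about"},
    {"hostname": "spam-crawler.xyz", "page": "/"},  # ghost spam hit
]
clean = [h for h in hits if is_valid_hit(h["hostname"])]
```

In Google Analytics itself this is configured as a View filter rather than code, but the matching logic is the same: include only hits whose hostname matches your pattern.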
Hope that helps! Good luck.
-
Check the web server log files, or log visits yourself (IP address, user agent, __utma, __utmz, possibly browser fingerprint, etc.).
Analyzing those, you can easily determine whether the traffic is coming from a scraping bot or from humans.
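If you can get log access, the analysis can be as simple as the sketch below: parse standard Apache/nginx "combined"-format lines, count requests per IP, and flag hits with bot-like or missing user agents. The sample log lines and thresholds are illustrative only:

```python
import re
from collections import Counter

# Matches the standard "combined" log format:
# ip ident user [time] "request" status bytes "referrer" "user-agent"
LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def summarize(lines):
    """Return (requests-per-IP counter, count of bot-like hits)."""
    per_ip = Counter()
    bot_hits = 0
    for line in lines:
        m = LINE.match(line)
        if not m:
            continue  # skip lines in other formats
        per_ip[m.group("ip")] += 1
        agent = m.group("agent").lower()
        if "bot" in agent or agent in ("-", ""):
            bot_hits += 1
    return per_ip, bot_hits
```

An IP that shows up thousands of times a day, or a large share of empty/bot user agents, is a strong sign of the scraping traffic described in this thread.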