Regular Expressions for Filtering BOT Traffic?
-
I've set up a filter to remove bot traffic from Analytics. I relied on regular expressions posted in an article that eliminate what appears to be most of them.
However, there are other bots I would like to filter but I'm having a hard time determining the regular expressions for them.
How do I determine what the regular expression is for additional bots so I can apply them to the filter?
I read an Analytics "how to" but it's over my head, and I'm hoping for some "dumbed down" guidance.
-
No problem, feel free to reach out if you have any other RegEx-related questions.
Regards,
Chris
-
I will definitely do that for Rackspace bots, Chris.
Thank you for taking the time to walk me through this and tweak my filter.
I'll give the site you posted a visit.
-
If you copy and paste my RegEx, it will filter out the rackspace bots. If you want to learn more about Regular Expressions, here is a site that explains them very well, though it may not be quite kindergarten speak.
-
Crap.
Well, I guess the vernacular is what I need to know.
Knowing what to put where is the trick, isn't it? Is there a dummies guide somewhere that spells this out in kindergarten speak?
I could really see myself botching this filtering business.
-
Not unless there's a . after the word servers in the name. The \ is escaping the . at the end of stumbleupon inc.
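To see what escaping actually changes, here's a tiny Python sketch (the test strings are invented; an unescaped . matches any single character, while \. matches only a literal period):

```python
import re

# Unescaped: "." matches ANY single character, so "inc." would
# also match a name ending in "incx".
loose = re.compile(r"^stumbleupon inc.$")

# Escaped with \: the dot only matches a literal period.
strict = re.compile(r"^stumbleupon inc\.$")

print(bool(loose.match("stumbleupon incx")))   # True - too loose
print(bool(strict.match("stumbleupon incx")))  # False
print(bool(strict.match("stumbleupon inc.")))  # True
```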
-
Does it need the . before the )?
-
Ok, try this:
^(microsoft corp|inktomi corporation|yahoo! inc\.|google inc\.|stumbleupon inc\.|rackspace cloud servers)$|gomez
Just added rackspace as another match; it should work if the name is exactly right.
Hope this helps,
Chris
-
Agreed! That's why I suggest using it in combination with the variables you mentioned above.
-
rackspace cloud servers
Maybe my problem is I'm not looking in the right place.
I'm in audience>technology>network and the column shows "service provider."
-
How is it titled in the ISP report exactly?
-
For example,
Since I implemented the filter four days ago, rackspace cloud servers have visited my site 848 times, visited 1 page each time, spent 0 seconds on the page, and bounced 100% of the time.
What is the regular expression for rackspace?
-
Time on page can be a tricky one because sometimes actual visits can record 00:00:00 due to the way it is measured. I'd recommend using other factors like the ones I mentioned above.
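If you export the ISP report to work through it offline, combining those signals might look something like this (the rows and field names here are invented for illustration):

```python
# Flag likely bots in an exported ISP report: rows where average time on
# page is zero AND bounce rate is 100%. Using both signals together avoids
# flagging real visits that happen to record 00:00:00.
rows = [
    {"isp": "rackspace cloud servers", "avg_time": 0, "bounce": 1.00},
    {"isp": "comcast cable", "avg_time": 95, "bounce": 0.42},
    {"isp": "verizon fios", "avg_time": 0, "bounce": 1.00},
]

suspected_bots = [
    r["isp"] for r in rows
    if r["avg_time"] == 0 and r["bounce"] == 1.00
]
print(suspected_bots)  # ['rackspace cloud servers', 'verizon fios']
```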
-
"...a combination of operating system, location, and some other factors can do the trick."
Yep, combined with those, look for "Avg. Time on Page = 00:00:00"
-
Ok, can you provide some information on the bots that are getting through this that you want to sort out? If they can be filtered by ISP organization like the ones in your current RegEx, you can simply add them to the list: (microsoft corp| ... ... |stumbleupon inc.|ispnamefromyourbots|ispname2|etc.)$|gomez
Otherwise, you might need to get creative and find another way to isolate them (a combination of operating system, location, and some other factors can do the trick). When adding to the list, make sure to escape special characters like . or / by using a \ before them, or else your RegEx will fail.
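One way to avoid escaping mistakes when extending the list is to build the pattern from the raw names and let the library escape them for you. A Python sketch (the ISP names are placeholders, not real entries from your report):

```python
import re

# Raw ISP names exactly as they appear in the Analytics report.
# "ispnamefromyourbots" is a placeholder for one of your own entries.
isp_names = [
    "microsoft corp",
    "stumbleupon inc.",
    "ispnamefromyourbots",
]

# re.escape() puts a \ before . / and other special characters,
# so each name is matched literally.
escaped = "|".join(re.escape(name) for name in isp_names)
pattern = re.compile(rf"^({escaped})$|gomez")

print(bool(pattern.search("stumbleupon inc.")))  # True
print(bool(pattern.search("stumbleupon incx")))  # False - dot is escaped
```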
-
Sure. Here's the post for filtering the bots.
Here's the RegEx posted: ^(microsoft corp|inktomi corporation|yahoo! inc\.|google inc\.|stumbleupon inc\.)$|gomez
-
If you give me an idea of how you are isolating the bots I might be able to help come up with a RegEx for you. What is the RegEx you have in place to sort out the other bots?
Regards,
Chris