Our organic homepage traffic just spiked from a typical under 20 per weekend to about 820 -- what could be causing this?
-
Website: http://www.myinjuryattorney.com
Our homepage typically receives under 20 organic visitors per weekend, but when I checked traffic this morning it was at a whopping 821 for Saturday and Sunday alone. It's already at 212 today.
I strongly suspect this is fake traffic, as there were about 818 drop-offs after visiting the homepage, an 84.41% bounce rate, and an average session duration of 5 seconds. Our typical metrics -- last weekend, for example -- were 13 visitors to the homepage, a 38% bounce rate, and an average session duration of 1 minute 26 seconds.
Does anyone know who or what could be causing this? Could it be a competitor using negative SEO of some sort? Thank you in advance.
-
Hi Rick, sorry for the hiatus. I have a couple of other questions for you.
1. Have you set up conversion tracking? Has there been an increase in conversions?
2. Do you have any campaigns running? Print, broadcast, radio, etc.? Many offline campaigns cause a boost in organic searches for my clients.
-
Hi Brett - I was able to go into this filter and I didn't see anything out of the ordinary.
-
Hi Rick,
Since I haven't seen a response yet, I'm assuming I wasn't clear enough in my explanation, so I went into an unfiltered view for one of my clients, found some ghost spam, and took a Skitch screenshot so you could see how to get there and examine it yourself on your website. Hope this helps!
-
Not just yet. Click on the secondary dimension drop-down and type in hostname, or find it under the Behavior group. You can also look at just Google traffic by clicking on Google first and then setting hostname as the secondary dimension. It should become apparent at that point if you have a lot of bots spoofing your traffic with a fake source.
-
Hi Brett - thank you! Do I have this set up right? I'm just seeing normal sources from what I can tell. https://www.screencast.com/t/t9VW5tSz
-
Yes, because this filter is based on the hostname. If a bot is spoofing the source but doesn't have a valid hostname (and most won't), it will be filtered out by the include filter. Go into your GA data, go down to the Source/Medium report under Acquisition, and set the secondary dimension to hostname.
If you're seeing something like (not set) next to google/organic traffic in the source, that's spam. I've got some in my unfiltered views as well. From the article I sent you:
"On the other hand, valid traffic will always use a real hostname. In most cases, this will be the domain. But it can also result from paid services, translation services, or any other place where you've inserted GA tracking code."
So just make sure you compile a list of all the valid hostnames for your website and you should be fine.
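To illustrate the logic, here's a rough sketch in Python of what the hostname include filter effectively does (the hostname list and hit data are hypothetical -- you'd substitute your own valid hostnames): hits whose hostname doesn't match the pattern get dropped, which catches spoofed hits reporting (not set) or junk hostnames.

```python
import re

# Hypothetical valid hostnames: your own domain plus anywhere else
# you've installed the GA tracking code (e.g. a translation service).
VALID_HOSTNAME_PATTERN = re.compile(
    r"^(www\.)?(myinjuryattorney\.com|translate\.googleusercontent\.com)$"
)

def is_ghost_spam(hit):
    """A hit is ghost spam if its hostname isn't one of ours --
    spoofed hits rarely bother to fake a valid hostname."""
    hostname = hit.get("hostname", "(not set)")
    return VALID_HOSTNAME_PATTERN.match(hostname) is None

hits = [
    {"source": "google / organic", "hostname": "www.myinjuryattorney.com"},
    {"source": "google / organic", "hostname": "(not set)"},        # spoofed
    {"source": "google / organic", "hostname": "spam-domain.xyz"},  # spoofed
]

# Keep only hits with a valid hostname, as the include filter would
clean = [h for h in hits if not is_ghost_spam(h)]
print(len(clean))  # 1
```

This is just the filter's behavior modeled outside GA; in practice you'd set it up as a view filter with a regex like the one above.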
-
Hi Brett,
Thank you for the info. Would all of this still apply if the traffic is considered organic and not referral?
-
Hi Rick,
Try checking your traffic against the secondary dimension "hostname". If a large number appear to be invalid hostnames, then you've got yourself an answer. Spam referral traffic, also known as ghost spam, can be removed with an include filter. Moz wrote a great guide on how to do this here: https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
If you're at all concerned that the traffic could be ghost spam and you don't have this filter in place, an easy way to check is to implement the filter on a test view and see how it impacts your data. Just make sure you create a new view to test it on first; I had a client who accidentally excluded all of his valid hostnames and lost every last bit of actionable data.
Hope this helps!
-
Have you checked the landing pages that relate to those keywords? That way you could hopefully see what kinds of pages are trending at the moment and increasing your traffic. A big increase in traffic might have an influence, but in the end, 800 more visits a day isn't that much.
-
I noticed a few months ago that this type of traffic was showing up not just under referral but also under organic in GA. As far as I'm concerned, it's just another problem plaguing GA/GWMT.
Matt
-
Hi Martijn,
I'm checking now, and for some reason it's not reflecting the high number of visitors. All of the queries also seem normal, and it shows that none have been repeated more than 5 times. There are, however, a ton of different but pretty normal ones appearing. Any additional insight given that info? Thanks!!
-
Hi Matt, thanks for the quick answer! All of this traffic is actually showing up under organic rather than referral.
-
Sounds like you are experiencing "Referral Spam". Have you checked the sources in Google Analytics? It is essentially a spammy way of advertising domains and services.
Here are a few links to help you understand and fix the issue:
- https://moz.com/blog/how-to-stop-spam-bots-from-ruining-your-analytics-referral-data
- If you have GA: https://support.google.com/analytics/answer/1034842?hl=en
Good Luck,
Matt
-
Hi Rick,
If you've connected Google Search Console to your site, you should be able to see in the Search Analytics data which keywords triggered the traffic. It could always be fake traffic, but sometimes you just get lucky with certain keywords that you suddenly rank for.
Martijn.