100% sudden drop in traffic
-
Hi Guys
I've just had an awful fright: Google Analytics is saying my website http://www.clintonpower.com.au has dropped to nil traffic since the 14th of September.
I've checked Google webmaster tools and all is OK with no malware or Google blacklisting that I can see.
Any ideas how I can find out what's happened? This is super scary because 99% of my business comes from Google.
I really have no idea what to do next or how to investigate the problem. Is there a way I can submit a query to Google through webmaster tools?
Thanks in advance!
Clinton
-
Thank you Gerd for your wonderful advice in this thread. This thread taught me a great deal and it's one of the reasons I love SEOMoz and the Q&A forum so much.
-
It's been a pleasure Clinton.
It will now be interesting to see whether the non-working GA was just a false signal in the traffic data or whether you still have other issues. Let us know - you should see changes within the next 1-2 days.
-
Thank you Gerd.
I uninstalled my GA plugins and reinstalled them.
GA is now saying I'm receiving data and there has been a slight increase in GA stats in the last day.
So you were correct that the 100% drop in traffic was due to broken GA code on the site.
I really appreciate your time in responding to me and my problem.
-
The "Tracking Status" should say "Receiving data". Have a look at the attached image (and also this URL http://support.google.com/analytics/bin/answer.py?hl=en&answer=1008083).
Your plugin is using the traditional (old) GA snippet. The async snippet is the preferred option (it's been around for more than 2 years now) and will track more accurately.
Doing a code comparison, your plugin does not load ga.js (notice the missing document.write line in the attached image?).
My suggestion would be to remove your Analytics plugin and replace it with another one - "All in One SEO Pack" or "Google Analytics for WordPress" come to mind.
This will sort out your tracking. Your SERPs are another (unrelated) issue. You can verify your drop in traffic by looking at your server access log: if there is absolutely no activity (or a sudden drop compared to the previous month), then you have issues. I suggest running something like Webalizer/AWStats over your logs so that you at least get an independent view of traffic.
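For reference, the ga.js async snippet that Google documents looks like the sketch below. This is a generic example, not your plugin's output - UA-XXXXX-X is a placeholder for your own web property ID, and the block belongs just before the closing </head> tag:

```html
<script type="text/javascript">
  // Async Google Analytics snippet (ga.js). UA-XXXXX-X is a placeholder.
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);
  _gaq.push(['_trackPageview']);
  (function() {
    // Load ga.js without blocking page rendering.
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol
        ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```

Any plugin that outputs something equivalent to this is fine; the traditional snippet (the one relying on document.write) is what you want to move away from.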
-
Thanks Gerd. It looks like you are correct and it is not receiving data. It just says "Tracking installed" - see the screenshot.
I did find that www.clintonpower.com.au/consulting, which is a separate WP installation, had a different tracking code, and I'm guessing it should be the same as the main domain. I have now corrected this.
Aside from analytics, SEOmoz is also showing a 100% drop in traffic, but I'm unsure whether SEOmoz takes this from GA or from its own crawl.
Thank you for clarifying the redirect and I do see the 156 pages indexed now.
-
Well, there are two problems. First, your GA code does not work: if you turn on the console in Chrome you will see the errors. The broken GA code is the reason no data is feeding into GA. Also, whatever plugin you use to generate the GA code produces code that is wrong/outdated (see my earlier response regarding the GA async script).
The second problem is the query you guys ran: site:www.clintonpower.com.au is the wrong query. It should be site:clintonpower.com.au - because:
- When you access www.clintonpower.com.au it does a 301 to clintonpower.com.au.
- Your sitemap content points to clintonpower.com.au and not www.clintonpower.com.au
I think you guys are looking at too many issues at once. Fix the GA tracking first, because right now it does not track anything. You can check this in GA: go to your GA profile, select the Tracking Code tab, and it should show "Tracking Status: Receiving Data" in the middle of the page - I would be surprised if it does.
With regards to indexing - your sitemap contains 63 URLs and Google has indexed 156 (site:clintonpower.com.au)
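To see the first problem for yourself, you can run a quick check in the Chrome DevTools console on any page of the site (a generic diagnostic, not something specific to your plugin):

```javascript
// Paste into the browser console on the affected page.
// ga.js defines a global _gat object, and the async snippet defines _gaq.
// "undefined" for _gat means ga.js never loaded - which is exactly why the
// page throws "Uncaught ReferenceError: _gat is not defined".
console.log(typeof _gat);
console.log(typeof _gaq);
```

If both come back "undefined", no hits are being recorded at all.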
-
I use the Ultimate Google Analytics plugin, so there would be no change in the code because I don't add it manually.
-
Yes I think so Dana. I am using Ultimate Google Analytics plugin, so nothing has changed there.
-
OK. You're onto something Dana. That is not right because there should be many pages indexed. I have at least 30+ blog posts alone.
So I'm not sure what this means, which is one of the search results you've pointed out:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: http://clintonpower.com.au/sitemap.xml.gz
Ideas anyone?
[user-agent.png](http://clintonpower.com.au/consulting/wp-content/uploads/2012/09/user-agent.png)
-
The robots.txt is not the problem. The sudden drop in recorded traffic in GA is due to the script error.
I would have a very close look at GWMT - everything under Health (crawl errors, crawl stats, index stats), Traffic and Optimization. See if any errors started spiking. Also look at your sitemap's index ratio - this should typically be around 60-80% (i.e. if your sitemap contains 100 URLs, you should have at least 60 in the index).
The faulty GA code would not affect SERPs. Perhaps other changes caused temporary crawl errors (you will pick up this hint in GWMT).
I would fix the GA issue, then regenerate the sitemap and manually resubmit it via GWMT. Wait for 2-3 days (or until your content has been crawled).
With the GA being broken, you can only rely on your server access logs to determine if your organic traffic changed.
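As a quick-and-dirty alternative to Webalizer/AWStats, something like this over an Apache-style combined access log gives you hits per day (the log path is an assumption - your host may keep it elsewhere):

```shell
LOG=/var/log/apache2/access.log  # assumed path - check with your host
# Field 4 of a combined-format log line is "[18/Sep/2012:10:00:00",
# so strip the "[" and the time portion, then tally the unique dates.
awk '{print $4}' "$LOG" | cut -d: -f1 | tr -d '[' | sort | uniq -c
```

If the daily counts fall off a cliff on a given date, the traffic drop is real; if they don't, it was only the tracking that broke.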
-
Gerd, for all of us less technical folks out here, can you explain why this was affecting the number of pages Clinton had indexed in Google SERPs?
-
Trust this suggestion, Clinton. Gerd is really smart and probably way more technically smart than me.
-
Your issue is that the analytics script throws JavaScript errors, e.g. "Uncaught ReferenceError: _gat is not defined".
Go into GA, open the setup/admin function for your domain, then "Tracking Code", and replace your current (old) GA code with the proper async analytics code.
-
The problem isn't the analytics code. The problem is the number of pages indexed in Google. I can say that, definitely.
[Updated] Many thanks to Gerd for his correct analysis of the problem. I stand corrected! I was looking at the wrong URL in the site: query - I looked at site:http://www.clintonpower.com.au, which only shows three results. Without the "www" it shows 156 indexed pages.
I very much appreciate Gerd's expertise!
My apologies, Clinton for perhaps leading you to look at the wrong thing.
-
That was actually my first thought, so I went to site:http://www.clintonpower.com.au
According to the results I can see, there are only three results listed. I am assuming your site has way more than three pages?
One of the three results is a link that leads to this page:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Sitemap: http://clintonpower.com.au/sitemap.xml.gz
Hope we can all help you piece this together!
-
Thank you. That's worth checking out.
But I noticed this problem just by doing searches for my keywords. I've dropped out of the SERPs, and I was consistently on the first page for some of my keywords.
I'm aware of the duplicate page content issues for categories and tags.
For the moment, I'm unchecking nofollow and noindex for only the subpages and categories and seeing what happens.
-
Is it possible that the Analytics script was altered or removed? Wouldn't it at least be showing some direct traffic?
It's showing 156 pages indexed in Google and when I searched a couple of your page titles your site showed up at the top. I would monitor the search queries in WebmasterTools and see if anything looks different.
Be careful about indexing the category and tags archives since they can cause duplicate content... I learned the hard way
-
Yes, for starters uncheck all the boxes. Although I am unfamiliar with this specific platform, it sounds like a setting gone awry. The worst problems are often from the simplest mistakes.
After you uncheck all the boxes, spend a week monitoring your GWT reports. You'll know very quickly whether there are pages you don't want crawled or indexed, like dynamic search results pages on an e-commerce site.
It's great that you have a developer for help.
You might repost another Q & A question here regarding robots.txt for the specific platform you are using. You might very well find some specific help on which boxes to select and which boxes to leave unchecked.
I feel for you! I had a site disappear many years ago because my host sold his hosting company. I lost my entire business, so I really do understand that horrible feeling! I really hope something I've given you here helps. Cheers!
-
I've just looked in the Thesis site options and noticed that almost all the boxes under robots.txt for nofollow and noindex were turned on, except for a couple, such as subpages. Could it be related, and should I just uncheck them all so everything is indexed and followed?
I'm not sure if this is the default setting because I've never edited the robots.txt settings.
-
That sounds very possible. It has to be something as simple as that to cause a radical 100% drop in traffic.
I've got my developer looking into it and will have some answers soon, I hope. I'll update you.
-
You are very welcome. Have someone look at your robots.txt file. It looks possible to me that you are disallowing your entire sitemap. If that's not it, my guess is it was a recent back-end change. Can you uninstall Premise and undo anything else you've done in the last 10 days or so? Resubmit your sitemap and see if that fixes it. Then reinstall or redo the changes one by one, waiting a couple of days between each change.
I think something you have done is telling Googlebot that you don't want your pages indexed.
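For anyone following along: the robots.txt quoted earlier in the thread only blocks /wp-admin/ and /wp-includes/, which is harmless. A hypothetical robots.txt that really would de-index a whole site looks like this instead:

```
User-agent: *
Disallow: /
```

If Clinton's file ever resembled that (or a plugin was injecting noindex meta tags), it would explain pages dropping out of the index.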
-
Hi Dana
Not that I'm aware of, but that's at least something I can have checked out straight away.
I have installed Premise getting ready to sell a product. Not sure if that would conflict with those files.
Thank you!
-
Hi Clinton,
I'm not a super-technical SEO, but it looks like maybe you have something going on in your .htaccess file or robots.txt file that is telling Google not to include any pages from your site.
Have you changed anything in the backend settings recently?