90% traffic loss. Pandalized?
-
This has happened several times this year. After launch, as soon as we reach exactly 100K uniques, Google kills 90% of the traffic. Then comes a sudden recovery (pretty much without any action from us) after several weeks, not connected to any algo updates/refreshes.
No warnings. No malware. WMT clean as a baby. Only good old whitehat SEO. Not even close to the edge of wrongdoing:)
This time it happened again on Aug. 22nd, right after Panda 3.9.1. What is different now is that on the same exact date, Bing traffic went down as well.
Need advice:)
-
Irving, should we also submit a removal request for these pages?
Thank you
-
Already noindexed and followed. I guess it will take the next Panda data refresh to see the results. Thank you Irving.
-
Canonical tags are the least of the issues. I would focus on beefing up the category pages. In the meantime, use the noindex,follow robots tag on those pages.
-
Local - My site was severely hit by Panda 3.5 and Penguin 1.0. Bing results held steady. Eventually there was a slight decline in Bing, but I attribute it to the loss of the FB Likes that resulted from my taking a break from the "community" (think "Free Beer") and from developing IBLs, new postings, etc.
-
Already did. Think we can adjust a few things.
Weird that Bing traffic went down the same way as Google's. It's Google's Panda after all, so does that mean the rumors were true and Bing "steals" Google results to add to its own SERPs?
-
Not sure I understood this part: "You should also have canonical tags on every page."
Thank you!
Edited: Ah, I see what you mean. Unfortunately we can't. Pages may look similar, but the locations are completely different. This happens because some locations have an almost identical set of retail chain stores (obviously, the same products in the same stores...)
-
Take the quiz at http://www.mytrafficdropped.com/
-
You really need to look at all your pages, especially the /catalog pages, and see if they can be considered quality content or whether they are just a bunch of links. I would noindex,follow all those pages until you can spruce them up. Basically, your site looks like a big link farm. I also found serious SEO infractions and formatting issues with your meta tags and microformats. You should also have canonical tags on every page.
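For reference, the noindex,follow robots tag and a self-referencing canonical tag would go in the `<head>` of each catalog page; a minimal sketch (the URL here is just a placeholder, not the asker's actual site):

```html
<head>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex,follow">
  <!-- Self-referencing canonical; each page points at its own preferred URL -->
  <link rel="canonical" href="http://www.example.com/catalog/some-location/">
</head>
```

Once the category pages are beefed up, the robots meta tag comes off and the canonical stays.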
Related Questions
-
Huge drop in rankings, traffic and impressions after changing to CloudFlare
Hi there, In October, one of our customer's programmers made a change on their website to optimize its loading speed. Since then, all the SEO metrics have dropped. Apparently, the change was to move to CloudFlare and to add Gzip compression. I was talking with the programmer and he told me he had no idea why that happened. Now, 5 months later, the SEO metrics haven't come back yet. What seems so weird is that two keywords in particular had the most massive drop. Those two keywords were the top keywords (more than 1k impressions a month) and now it's like there are no impressions or clicks at all. Has anyone had the same thing happen to them? Do you have any idea what could help in this case?
Technical SEO | H.M.N.
Organic search traffic has dropped by 35% since 18 September, and we don't know why.
Organic traffic to our website has dropped 35% from 18 September 2017 to date. From 1 January to 18 September 2017, organic traffic was up by just under 1% overall (Google up by 1.32%). Paid search traffic over the same period has remained steady. There is nothing we can think of that we've done that would have caused the drop. We had an issue with the Google page speed test failing when running a test, but we resolved this issue on 20 November, and since then we've seen an even greater drop (44% in the last week). The drop is seen across the 3 main search engines, not just Google, which points toward something we've done, but as mentioned, we can't think of any significant change we made in September that would have such negative effects. There is little difference across devices. Is anyone aware of a significant event in September in the search engine world that may have influenced our organic traffic? Any help gratefully received.
Technical SEO | imaterus
Drop in traffic, spike in indexed pages
Hi, We've noticed a drop in traffic compared to the previous month and the same period last year. We've also noticed a sharp spike in indexed pages (almost doubled) as reported by Search Console. The two seem to be linked, as the drop in traffic coincides with the spike in indexed pages. The only change we made to our site during this period is that we reskinned our blog. One of these changes is that we've enabled 'normal' (not AJAX) pagination. Our blog has a lot of content, and we have about 550-odd pages of posts. My question is, would this impact the number of pages indexed by Google, and if so, could this negatively impact organic traffic? Many thanks, Jason
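Switching from AJAX to normal pagination would indeed expose hundreds of crawlable archive URLs. One common pattern for signaling that these pages belong to a paginated series (rather than letting each compete on its own) is rel="prev"/"next" annotations in the head of each archive page; a sketch with placeholder URLs:

```html
<!-- On page 2 of the blog archive (URLs are placeholders) -->
<link rel="prev" href="http://www.example.com/blog/page/1/">
<link rel="next" href="http://www.example.com/blog/page/3/">
```

Whether this is the right fix depends on whether those archive pages are actually the ones newly indexed, which Search Console's index coverage data should confirm.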
Technical SEO | Clickmetrics
Why has my search traffic suddenly tanked?
On 6 June, Google search traffic to my WordPress travel blog http://www.travelnasia.com tanked completely. There are no warnings or indicators in Webmaster Tools that suggest why this happened. Traffic from search has remained at zero since 6 June and shows no sign of recovering. Two things happened on or around 6 June. (1) I dropped my premium theme, which was proving to be not mobile friendly, and replaced it with the ColorMag theme, which is responsive. (2) I relocated off my previous hosting service, which was showing long server lag times, to a faster host. Both of these should have improved my search performance, not tanked it. There were some problems with the relocation to the new web host which resulted in a lot of "out of memory" errors on the website for 3-4 days. The allowed memory was simply not enough for the complexity of the site and the volume of traffic. After a few days of trying to resolve these problems, I moved the site to another web host which allows more PHP memory, and the site now appears reliably accessible for both desktop and mobile. But my search traffic has not recovered. I am wondering if in all of this I've done something that Google considers a cardinal sin and I can't see it. The clues I'm seeing include: Moz Pro was unable to crawl my site last Friday. It seems like every URL it tried to crawl was of the form http://www.travelnasia.com/wp-login.php?action=jetpack-sso&redirect_to=http://www.travelnasia.com/blog/bangkok-skytrain-bts-mrt-lines which resulted in a 500 status error. I don't know why this happened, but I have disabled the Jetpack login function completely, just in case it's the problem. GWT tells me that some of my resource files are not accessible by Googlebot because my robots.txt file denies access to /wp-content/plugins/. I have removed this restriction after reading the latest advice from Yoast, but I still can't get GWT to fetch and render my posts without some resource errors.
On 6 June I see in Structured Data in GWT that "items" went from 319 to 1478 and "items with errors" went from 5 to 214. There seems to be a problem with both the hatom and hcard microformats, but when I look at the source code they seem to be OK. What I can see in GWT is that each hcard has a node called "n [n]" which is empty, and Google is generating a warning about this. I see that this is because the author vcard URL class now says "url fn n", but I don't see why it says this or how to fix it. I also don't see how this would cause my search traffic to tank completely. I wonder if anyone can see something I'm missing on the site. Why would Google completely deny search traffic to my site all of a sudden without notifying any kind of penalty? Note that I have NOT changed the content of the site in any significant way. And even if I did, it's unlikely to result in a complete denial of traffic without some kind of warning.
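On the blocked-resources point: if parts of wp-content still need to stay disallowed, Google honors wildcard Allow rules, so the CSS/JS that Googlebot needs for rendering can be opened up selectively. A minimal robots.txt sketch along those lines (typical WordPress paths, to be adjusted to the actual install):

```
# Let crawlers fetch the CSS/JS needed to render pages,
# even inside otherwise-disallowed plugin directories
User-agent: *
Allow: /wp-content/plugins/*.css
Allow: /wp-content/plugins/*.js
Disallow: /wp-login.php
```

After changing it, GWT's Fetch and Render plus the robots.txt tester will show whether the remaining resource errors come from robots.txt at all or from somewhere else (e.g. the Jetpack SSO redirects).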
Technical SEO | Gavin.Atkinson
Take a good amount of existing landing pages offline because of low traffic, cannibalism and thin content
Hello Guys, I decided to take about 20% of my existing landing pages offline (about 50 of 250, which were launched about 8 months ago). Reasons are: These pages sent no organic traffic at all in these 8 months. Often really similar landing pages exist (just minor keyword targeting differences, and I would call the content "thin"). Moreover, I had some Panda issues in Oct: basically I ranked with multiple landing pages for the same keyword in the top ten, and in Oct many of these pages dropped out of the top 50. I also realized that for some keywords the landing page dropped out of the top 50 while another landing page climbed from 50 to the top 10 in the same week; the next week the new landing page dropped to 30, the week after it fell out of the top 50, and the old landing page came back to the top 20 - but not to the top ten... This all happened in October. Did anyone observe such things as well? Those are the reasons why I came to the conclusion to take these pages offline and integrate some of the good content into the other, similar pages, to target more broadly with one page instead of two. And I hope my remaining landing pages will benefit from this. I hope all agree? Now to the real question: Should I redirect all the pages I take offline? Basically they send no traffic at all, and none of them should have external links, so I will not give away any link juice. Or should I just remove the URLs in Google Webmaster Tools and then take them offline? Like I said, the pages are basically dead, and personally I see no reason for these 50 redirects. Cheers, Heiko
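If the pages genuinely have no traffic and no external links, serving a 410 Gone is one clean way to tell search engines they were removed on purpose, with a 301 reserved for any page that has a close equivalent. A sketch of what that might look like in an Apache .htaccess (all paths here are placeholders, not Heiko's real URLs):

```apache
# Retired landing pages with nothing worth preserving: return 410 Gone
Redirect gone /landing/old-page-a/
Redirect gone /landing/old-page-b/

# Where a consolidated page absorbs the content: 301 instead
Redirect 301 /landing/old-page-c/ /landing/consolidated-page/
```

The URL removal tool in Webmaster Tools only hides pages temporarily, so the status code is what makes the removal stick.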
Technical SEO | _Heiko_
Fixing a website redirect situation that resulted in drop in traffic
Hi, I'm trying to help someone fix the following situation: they had a website, www.domain.com, that was generating a steady amount of traffic for three years. They then redesigned the website a couple of months ago, and the website developer redirected the site to domain.com but did not set up analytics on domain.com. We noticed that there was a drop in traffic to www.domain.com but have no idea if domain.com is generating any traffic since analytics wasn't installed. To fix this situation, I was going to find out from the developer if there was a good reason to redirect the site. What would have prompted the developer to do this if www.domain.com had been used already for three years? Then, unless there was a good reason, I would change the redirect back to what it was before - domain.com redirecting to www.domain.com. Presumably this would allow us to regain the traffic to the site www.domain.com that was lost when the redirect was put in place. Does this sound like a reasonable course of action? Is there anything that I'm missing, or anything else that I should do in this situation? Thanks in advance! Carolina
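If it turns out there was no good reason for the change, restoring the old canonical host is a small server-side fix. A typical Apache mod_rewrite sketch forcing the www hostname (assuming Apache and using the question's placeholder domain):

```apache
RewriteEngine On
# Send bare-domain requests to the www host with a permanent redirect
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```

Reinstalling analytics on whichever host is live should come first, though, so the before/after traffic can actually be measured.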
Technical SEO | csmm
Directing traffic to subdomain
Hi everyone, For this question, please note that we will be directing traffic using a load balancer (an Amazon ELB, to be specific) rather than using a 301 redirect. The question: Will the SEO ranking of links to pages be negatively impacted by directing traffic to servers with a different hostname (or subdomain) within mycompany.com? For example, we would like to have www.mycompany.com load balanced between host1.mycompany.com and host2.mycompany.com. Many thanks for your input! Jay
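One safeguard worth noting: if the ELB fronts the backend hosts, visitors and crawlers should only ever see www.mycompany.com, but if host1/host2 are also directly reachable, a canonical tag on every page pointing at the public hostname consolidates any accidentally crawled duplicates. A sketch with a placeholder path:

```html
<!-- Emitted by every page, regardless of which backend host served it -->
<link rel="canonical" href="http://www.mycompany.com/some-page/">
```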
Technical SEO | SeoExpansion