Massive drop in Google traffic after upping page count 8-fold.
-
I run a book recommendation site -- Flashlight Worthy.
It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage".
It's been online for 4+ years.
Historically, it's been made up of:
- a single home page,
- ~50 "category" pages, and
- ~425 "book list" pages.
(Both of those numbers started out much smaller and grew over time, but the list count has hovered around 425 for the last year or so as I've focused my time elsewhere.)
On Friday, June 15 we made a pretty big change to the site -- we added a page for every Author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight.
If an Author has more than one book on the site, the page shows every book they have on the site, such as this page:
http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805
...but the vast majority of these author pages have just one book listed, such as this page:
http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116
Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries.
And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google.
(OK, it peaked at 100 and dropped down to a steady 60 or so a day within a few days, but still. And then it trailed off over the last week, dropping lower and lower every day as if Google realized it was repurposed content from elsewhere on our site...)
Here's the problem:
For the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends. This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable.
And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday and it looks like we'll have a similar amount today.
And I know this isn't just some Analytics reporting problem as Chartbeat is showing the same drop. As search is ~80% of my traffic I'm VERY eager to solve this problem...
So:
1. Do you think the drop is related to upping my page count 8-fold overnight?
2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that list only one book (which would be the vast majority)?
3. Have you ever heard of a situation like this, where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books", it's better that they land on a page of ours that pulls together the 4 books of hers we have than on a page that happens to include just one of her books among 5 or 6 others by other authors.
What else?
Thanks so much, help is very appreciated.
Peter
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime.
-
Thanks for the update on your findings. That is interesting; glad you got it sorted.
-
And now another update. About 1 week after removing all the new content, search traffic came right back to where it was. So clearly Google was mad at me. And now they're not. Sigh. Stupid Google.
-
UPDATE: I've removed all the new pages from my site in hopes that it will turn around my loss in search traffic. I'd still like an expert opinion on the matter in general.
-
Indeed, I looked at Webmaster Tools -- no duplicates.
As for canonical, while I know and love that feature, I don't think it's relevant here. These pages aren't different URLs for the same content -- they're segments of content taken from different pages, stitched together in a new and useful way.
If this is the problem, I think it's the fact that 95% of the new pages have only one item of content on them, and it's a piece of content that appears elsewhere on the site.
-
Hi Peter
I agree Matt Cutts wasn't very clear about providing a solid number, but consider what he actually said about relativity: "..if your site was 1 day .. um you know nothing, then the next day there is 4 million pages in our index" seems to me like he was hinting at a percentage rather than a hard number. In your case you increased your site roughly eight-fold with no new content.
From a usability standpoint it may be awesome; from an SEO standpoint it may not. I can't say for sure what the best way to handle it is, but if it were me I would not throw away the benefit to my users. Instead, I would look to see whether I could canonicalize any of these pages to lower the burden on Google of trying to differentiate one page from another.
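For example (the href below is a made-up list URL, just to show the pattern), a single-book author page could declare the list that book already lives on as its canonical version:

<!-- in the <head> of a one-book author page, e.g. the Barbara Kilarski one -->
<!-- the href is hypothetical; you'd point it at the actual list featuring that book -->
<link rel="canonical" href="http://www.flashlightworthybooks.com/lists/some-existing-list" />

That way the author pages stay useful for visitors, but you're telling Google which version of the content you want indexed and ranked.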
Have you looked at Google Webmaster Tools to see if it is reporting some of these pages as duplicates?
-
Don, thanks for replying. In answer to your questions:
-- Yes, we added all the pages to the sitemap.
-- As far as the content being unique, no -- not one word on any of the pages is unique. But the aggregation of the information onto those pages is unique and helpful to the end user. For example, say you had a site full of movies that won Oscars -- winners of 2010, all movies that won Best Director, all movies that won Best Music, etc. Now imagine you'd like to see all the Tom Hanks movies that have won Oscars. There are a number of Tom Hanks movies scattered across the lists, but there's no easy way to see them all at once. So generating a list of Tom Hanks movies that won Oscars is easy and useful. The only problem is, about 95% of the time when you generate such lists, you'll generate them for actors who were only in 1 Oscar-winning movie... hence a bunch of pages that are of little use. But why would that hurt traffic to all the pages that HAVE been of use for the last several years?
That Matt Cutts video was interesting... but I'm not sure there's a clear answer there. He said 100+ pages at once is fine. But 10,000... maybe not. So what about 4,500?
-
Hi Peter,
According to Matt Cutts, as long as the content is quality / good / unique, you should not have been dinged.
You can watch his answer to a very similar question on YouTube here.
Now, what is interesting is that you went from 500 pages to 4,000+ pages. That is a huge update in terms of what your site has been offering, so there may be something going on there.
Did you submit all these pages in a sitemap to Google? And by the nature of these pages, was the content unique, or snippets of the inner content?
I will add a story about how I handled a similar situation, and maybe give you something to ponder. We have an o-ring size lookup section on our site; the URLs being generated are dynamic and number in the thousands, due to the combination of sizes, materials, and hardnesses. I did not tell Google about these links in the sitemap; rather, I just put a link to the 8 main materials in the sitemap and then let Google discover the dynamic URLs on their own.
After 6 months I noticed that Google was actually treating many of the deep pages as duplicate content, so I used rel="canonical" to direct the juice to the top material pages. Our traffic and SERP rankings went up for these pages.
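On one of those deep pages, the tag looked something like this (domain and paths simplified for illustration -- not our exact URLs):

<!-- in the <head> of the nitrile as568-001 size page -->
<link rel="canonical" href="https://www.example.com/o-rings/nitrile-as568/" />

The dynamic size pages still exist for users, but Google is told to treat the main material page as the one to index and rank.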
I tell that to illustrate what I learned: having more pages isn't always good. In my case, a nitrile AS568-001 o-ring page isn't that different from a nitrile AS568-002 o-ring page, and while they are certainly different sizes, you can find information on either one from the nitrile AS568 page. The smart thing I did was not flooding Google with thousands of new pages; the dumb thing I did was not canonicalizing the deep pages to begin with.
I will be interested in what others have to say on this subject, and I hope this helps.