Massive drop in Google traffic after upping pagecount 8-fold.
-
I run a book recommendation site -- Flashlight Worthy.
It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage".
It's been online for 4+ years.
Historically, it's been made up of:
- a single home page,
- ~50 "category" pages, and
- ~425 "book list" pages.
(Both the 50 and the 425 started out much smaller and grew over time, but the list count has held steady around 425 for the last year or so as I've focused my time elsewhere.)
On Friday, June 15 we made a pretty big change to the site -- we added a page for every Author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight.
If an Author has more than one book on the site, the page shows every book they have on the site, such as this page:
http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805
..but the vast majority of these author pages have just one book listed, such as this page:
http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116
Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries.
And it was working... 5 days after we launched the pages, they had ~100 new searches coming in from Google.
(Ok, it peaked at 100 and dropped down to a steady 60 or so a day within a few days, but still. And then it trailed off for the last week, dropping lower every day -- as if Google realized it was repurposed content from elsewhere on our site...)
Here's the problem:
For the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends. This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable.
And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday and it looks like we'll have a similar amount today.
And I know this isn't just some Analytics reporting problem as Chartbeat is showing the same drop. As search is ~80% of my traffic I'm VERY eager to solve this problem...
So:
1. Do you think the drop is related to my upping my pagecount 8-fold overnight?
2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just the pages that list only one book (which would be the vast majority)?
3. Have you ever heard of a situation like this? Where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books" it's better they land on a page of ours that pulls together the 4 books we have by her than on a page that happens to include just one of her books among 5 or 6 by other authors.
What else?
Thanks so much, help is very appreciated.
Peter
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime.
-
Thanks for the update on your findings. That is interesting -- glad you got it sorted.
-
And now another update. About 1 week after removing all the new content, search traffic came right back to where it was. So clearly Google was mad at me. And now they're not. Sigh. Stupid Google.
-
UPDATE: I've removed all the new pages from my site in hopes that it will turn around my loss in search traffic. I'd still like an expert opinion on the matter in general.
-
Indeed, I looked at Webmaster Tools -- no duplicates.
As far as Canonical, while I know and love that feature, I don't think it's relevant here. These pages aren't different URLs for the same content -- they're segments of content taken from different pages, stitched together in a new and useful way.
I think, if this is the problem, that it's the fact that 95% of the new pages only have 1 item of content on them and it's a piece of content that appears elsewhere on the site.
-
Hi Peter
I agree Matt Cutts wasn't very clear about giving a solid number, but consider what he said about relative size. "..if your site was 1 day .. um you know nothing, then the next day there is 4 million pages in our index" sounds to me like he was hinting at a percentage rather than a hard number. In your case you increased your site roughly 8-fold overnight with no new content.
From a usability standpoint it may be awesome; from an SEO standpoint it may not be. I can't say for sure what the best way to handle it is, but if it were me I would not throw away the benefit to my users. Instead, I would look to see whether I could canonicalize any of these pages to lower the burden on Google of trying to differentiate one page from another.
Have you looked at your Google Webmaster Tools to see if they are flagging some pages as duplicates?
-
Don, thanks for replying. In answer to your questions:
-- Yes we added all the pages to the sitemap.
-- As far as the content being unique, no -- not one word on any of the pages is unique. But the aggregation of the information onto those pages is unique and helpful to the end user. For example, say you had a site full of movies that won Oscars -- winners of 2010, all movies that won Best Director, all movies that won Best Music, etc. Now imagine you'd like to see all the Tom Hanks movies that have won Oscars. There are a number of Tom Hanks movies scattered across the lists, but there's no easy way to see them all at once. So generating a list of Tom Hanks movies that won Oscars is easy and useful. Only problem is, about 95% of the time when you generate such lists, you'll generate them for actors who were only in 1 Oscar-winning movie... hence a bunch of pages that are of little use. But why would that hurt traffic to all the pages that HAVE been of use for the last several years?
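To make that Tom Hanks analogy concrete, here's a rough sketch of the kind of logic involved -- the data and the threshold are hypothetical, not our actual code. The idea would be to generate an aggregation page only when it pulls together more than one item, since a single-item page adds nothing beyond the list page the item already appears on:

```python
from collections import defaultdict

# Hypothetical data: (book_title, author) pairs pulled from the existing lists.
books = [
    ("Charlie and the Chocolate Factory", "Roald Dahl"),
    ("Matilda", "Roald Dahl"),
    ("The BFG", "Roald Dahl"),
    ("Heads Up", "Barbara Kilarski"),
]

# Group books by author.
by_author = defaultdict(list)
for title, author in books:
    by_author[author].append(title)

# Only generate an author page when it aggregates 2+ books; a single-book
# author page would just duplicate content already on a list page.
author_pages = {a: titles for a, titles in by_author.items() if len(titles) >= 2}

print(author_pages)  # only the multi-book author qualifies
```

Under that rule, the ~95% of author pages with a single book would simply never be created in the first place.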
That Matt Cutts video was interesting... but I'm not sure there's a clear answer there. He said 100+ pages at once is fine. But 10,000... maybe not. So what about 4,500?
-
Hi Peter,
According to Matt Cutts, as long as the content is quality / good / unique you should not have been dinged.
You can watch his answer to a very similar question on YouTube here.
Now what is interesting is that you went from 500 pages to 4,000 pages. That is a huge update in terms of what your site has been offering, so there may be something going on there.
Did you submit all these pages in a sitemap to Google? And by the nature of these pages, was the content unique, or snippets of the inner content?
I will add a story about how I handled a similar situation; maybe it will give you something to ponder. We have an o-ring size lookup section on our site; the URLs being generated are dynamic and number in the thousands, due to the combinations of sizes, materials, and hardness. I did not tell Google about these links in the sitemap -- I just put links to the 8 main materials in the sitemap and then let Google discover the dynamic URLs on their own.
After 6 months I noticed that Google was actually treating many of the deep pages as duplicate content, so I used rel="canonical" to direct the juice to the top material pages. Our traffic and SERP ratings went up for these pages.
I tell that story to illustrate what I learned: having more pages isn't always good. In my case a nitrile AS568-001 o-ring page isn't that different from a nitrile AS568-002 o-ring page, and while they are certainly different sizes, you can find information on either one from the nitrile AS568 page. The smart thing I did was not flooding Google with thousands of new pages; the dumb thing I did was not canonicalizing the deep pages to begin with.
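As a rough illustration of that fix -- the domain and URL pattern here are made up, not our actual site -- each deep size-variant page ends up emitting a canonical link that points search engines back at its material's top-level page:

```python
def canonical_tag(material: str) -> str:
    """Build the <link rel="canonical"> tag a deep size-variant page would emit,
    pointing search engines at the material's top-level page (hypothetical URLs)."""
    return '<link rel="canonical" href="https://example.com/o-rings/%s/" />' % material

# Every nitrile size page (AS568-001, AS568-002, ...) canonicalizes
# to the single nitrile page that covers them all.
for size in ("as568-001", "as568-002"):
    print(size, "->", canonical_tag("nitrile"))
```

The point is that thousands of near-duplicate pages all consolidate their signals onto a handful of pages Google can confidently rank.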
I will be interested in what others have to say on this subject, and I hope this helps.