Massive drop in Google traffic after upping page count 8-fold.
-
I run a book recommendation site -- Flashlight Worthy.
It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies" or "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania" or "5 Books That Helped Me Discover and Love My Italian Heritage".
It's been online for 4+ years.
Historically, it's been made up of:
- a single home page,
- ~50 "category" pages, and
- ~425 "book list" pages.
(Both of those numbers started out much smaller and grew over time, but the list count has hovered around 425 for the last year or so as I've focused my time elsewhere.)
On Friday, June 15 we made a pretty big change to the site -- we added a page for every Author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight.
If an author has more than one book on the site, their page shows all of them, such as this page:
http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805
...but the vast majority of these author pages have just one book listed, such as this page:
http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116
Obviously we did this as an SEO play -- we figured that our content was getting ~1,000 search entries a day for such a wide variety of queries that we may as well create pages that would make natural landing pages for a broader array of queries.
And it was working... 5 days after we launched the pages, they were bringing in ~100 new search visits a day from Google.
(Ok, it peaked at 100 and dropped to a steady 60 or so a day within a few days, but still. And then it trailed off over the last week, dropping lower and lower every day, as if Google had realized it was repurposed content from elsewhere on our site...)
Here's the problem:
For the last several years the site received ~30,000 search entries a month... a little more than 1,000 a day on weekdays, a little lighter on weekends. This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, as the GoodReads behemoth stole some traffic... but by and large, traffic was VERY stable.
And then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we've had ~300 on Saturday and Sunday and it looks like we'll have a similar amount today.
And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. As search is ~80% of my traffic, I'm VERY eager to solve this problem...
So:
1. Do you think the drop is related to my upping my page count 8-fold overnight?
2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just the pages that list only one book (which would be the vast majority)?
3. Have you ever heard of a situation like this, where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content -- and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books," it's better that they land on a page of ours that pulls together the 4 books of hers we have than on a page that happens to have just one of her books among 5 or 6 others by other authors.
What else?
Thanks so much, help is very appreciated.
Peter
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime.
-
Thanks for updating us on your findings. That is interesting, but I'm glad you got it sorted.
-
And now another update. About 1 week after removing all the new content, search traffic came right back to where it was. So clearly Google was mad at me. And now they're not. Sigh. Stupid Google.
-
UPDATE: I've removed all the new pages from my site in hopes that it will turn around my loss in search traffic. I'd still like an expert opinion on the matter in general.
-
Indeed, I looked at Webmaster Tools -- no duplicates.
As for canonical, while I know and love that feature, I don't think it's relevant here. These pages aren't different URLs for the same content -- they're segments of content taken from different pages, stitched together in a new and useful way.
If this is the problem, I think it's the fact that 95% of the new pages have only one item of content on them -- and that item appears elsewhere on the site.
-
Hi Peter
I agree Matt Cutts wasn't very clear about providing a solid number, but consider what he said about relative size. "..if your site was 1 day .. um you know nothing, then the next day there is 4 million pages in our index" sounds to me like he was hinting at a percentage rather than a hard number. In your case, you increased your page count roughly eight-fold overnight with no new content.
From a usability standpoint it may be awesome; from an SEO standpoint it may not be. I can't say for sure what the best way to handle it is, but if it were me, I would not throw away the benefit to my users. Instead, I would look at whether I could canonicalize any of these pages to lower the burden on Google of trying to differentiate one page from another.
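To make that concrete, here is a minimal sketch of what such a canonical tag might look like on one of your thin, single-book author pages. The author-page URL is the one you posted above; the href is a made-up placeholder for whichever existing list page already features that book:

```html
<!-- In the <head> of the thin author page:
     http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116 -->
<!-- The href below is hypothetical: it should be the list page that
     already features this author's one book. -->
<link rel="canonical" href="http://www.flashlightworthybooks.com/best-books/example-list-featuring-this-book" />
```

This would only make sense for the single-book author pages; the multi-book pages (like the Roald Dahl one) genuinely aggregate content and arguably deserve to stand on their own.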
Have you looked in Google Webmaster Tools to see whether Google is treating some of these pages as duplicates?
-
Don, thanks for replying. In answer to your questions:
-- Yes, we added all the pages to the sitemap.
-- As far as the content being unique: no -- not one word on any of the pages is unique. But the aggregation of the information onto those pages is unique and helpful to the end user. For example, say you had a site full of movies that won Oscars -- winners of 2010, all movies that won Best Director, all movies that won Best Music, etc. Now imagine you'd like to see all the Tom Hanks movies that have won Oscars. There are a number of Tom Hanks movies scattered across the lists, but there's no easy way to see them all at once. So generating a list of Tom Hanks movies that won Oscars is easy and useful. The only problem is, about 95% of the time when you generate such lists, you'll generate them for actors who were only in 1 Oscar-winning movie... hence a bunch of pages that are of little use. But why would that hurt traffic to all the pages that HAVE been of use for the last several years?
That Matt Cutts video was interesting... but I'm not sure there's a clear answer there. He said 100+ pages at once is fine. But 10,000... maybe not. So what about 4,500?
-
Hi Peter,
According to Matt Cutts, as long as the content is quality / good / unique, you should not have been dinged.
You can watch his answer to a very similar question on YouTube here.
Now, what is interesting is that you went from 500 pages to 4,000 pages. That is a huge update in terms of what your site has been offering, so there may be something going on there.
Did you submit all these pages in a sitemap to Google? And by the nature of these pages, was the content unique or just snippets of the inner content?
I will add a story about how I handled a similar situation; maybe it will give you something to ponder. We have an O-ring size lookup section on our site; the URLs it generates are dynamic and number in the thousands, due to the combinations of sizes, materials, and hardnesses. I did not tell Google about these links in the sitemap -- I just put links to the 8 main material pages in the sitemap and then let Google discover the dynamic URLs on their own, roughly as sketched below.
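For illustration, here is a stripped-down sketch of that sitemap approach (the URLs are placeholders, not our real ones): only the handful of main material pages are listed, and none of the thousands of dynamic size URLs are.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap: list only the main material pages and let
     Googlebot discover the dynamic size pages by crawling from them. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/o-rings/nitrile</loc></url>
  <url><loc>https://www.example.com/o-rings/silicone</loc></url>
  <url><loc>https://www.example.com/o-rings/viton</loc></url>
  <!-- ...and the remaining material pages; no individual size URLs -->
</urlset>
```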
After 6 months I noticed that Google was actually treating many of the deep pages as duplicate content, so I used rel="canonical" to direct the juice to the top material pages. Our traffic and SERP rankings went up for these pages.
I tell that story to illustrate what I learned: having more pages isn't always good. In my case, a nitrile AS568-001 O-ring page isn't that different from a nitrile AS568-002 O-ring page, and while they are certainly different sizes, you can find information on either one from the nitrile AS568 page. The smart thing I did was not flooding Google with thousands of new pages; the dumb thing I did was not canonicalizing the deep pages to begin with.
I will be interested in what others have to say on this subject, and I hope this helps.