eCommerce sites being "filtered" by the last Panda update: ideas and discussion
-
Hello, fellow internet-goers!
Just as a disclaimer, I have been following a number of discussions, articles, posts, etc. trying to find a solution to this problem, but have yet to get anything conclusive. So I am reaching out to the community for help.
Before I get into the questions I would like to provide some background:
I help a team manage and improve a number of medium-to-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ visits per day, depending on the site. Back in March, one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third.
We have around ten niche sites in total; about seven of them share an identical code base (roughly an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us. Many meetings later, we attributed the "filter" to duplicate content stemming from our product database and written content (shared across all of our sites). We decided to use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (as if it had never been "filtered"); however, the other two sites remain "under the thumb" of Google.
Now for some questions:
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Is it a coincidence that it was an exact 30 day "filter"?
Why has only one site recovered?
-
Thanks for your responses.
@EGOL - I would agree that merging the sites would be ideal given that they share such a large database. Unfortunately, this isn't an option for our company at this point in time. Acquiring new content for our product pages has been tossed around, but it would be a HUGE undertaking, so it's on the "back burner" for the moment.
@Ben Fox - We came to the conclusion that it was content because it was the only clear "offender" on the list of potential problems. However, the fact that only 3 of our sites were penalized perplexes me as well. It would have made more sense had all of our sites suffered a penalty (luckily only 3 did). One response I got from another forum was: since Google removed enough duplicate content (3 sites, in our case), it deemed that the remaining sites were "original".
We didn't point canonicals to any one site (like 9 going to 1). We only added rel="canonical" to our manufacturer category pages (a small percentage of pages). Since some of our domains sell products that aren't "niche specific," we told those pages to send preference to their proper niche domain (I hope that makes sense).
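To illustrate what I mean (with hypothetical domains, not our actual sites), a cross-domain canonical on one of those manufacturer category pages looks something like this:

```html
<!-- On the duplicate manufacturer category page at
     https://niche-site-b.example/brand-x/ -->
<head>
  <!-- Tells Google the preferred version of this page lives
       on the niche domain that "owns" this product line -->
  <link rel="canonical" href="https://niche-site-a.example/brand-x/" />
</head>
```

Worth noting: Google treats cross-domain canonicals as a hint rather than a directive, so even when it honors them it can take a recrawl cycle or two before the duplicate URLs drop out of the index.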
For discussion purposes, here is a response I got from another forum:
> Why has only one site recovered?
I suspect/assume the other sites will bounce back the same way after their own 30-day penalties expire.
> Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Maybe removing the first site allowed the scoring penalty applied to the other sites to shrink in size. As each site was removed, the penalty applied to the others correspondingly shrunk.
> Is it a coincidence that it was an exact 30 day "filter"?
No. 30 days is a common penalty.
Does anyone agree with these? I've heard of the 30-day penalty before. If this is the case, then a warning from Google would be nice.
-
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Google can be slow to detect duplicate content and sometimes tolerates it.
Is it a coincidence that it was an exact 30 day "filter"?
Only Google knows.
Why has only one site recovered?
Only Google knows.
Google sees a lot of sites with the same content, and you say that these are "med-large" sites. If I were Google, I would say: "these are duplicate content; we aren't going to index all of them, because our searchers don't want to see ten sites with the same stuff."
If these were my sites I would merge all of them into one single site. If the content on that site was unique to me I would probably then put all of my efforts into promotion and informative content for the product lines.
If the content was on other sites that I don't own then my efforts would go mainly into making unique product content and informative content for the product lines.
Google has been squashing duplicate content for years. If you have it, and you place links between the sites, it is very likely that at least one of your sites will be demoted in Google or filtered (probably filtered). They don't want to spend their resources indexing ten duplicate sites. They would rather display unique sites to their searchers.
-
How did you decide that it was content causing the issue if only 3/10 of your sites were affected?
Also, when you added the rel=canonical, did 9 of your sites point to a primary site, and was this the site that recovered?
Related Questions
-
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat and the geek when you truly need them? So SEL (SearchEngineLand) said recently that there's no such thing as "duplicate content" penalties. http://searchengineland.com/myth-duplicate-content-penalty-259657 By the way, I'd love to get Rand or Eric or other Mozzers (aka TAGFEE'ers) to weigh in here on this if possible. The reason for this question is to double-check a possible "duplicate content" type penalty (possibly by another name?) that might accrue in the following situation. 1 - Assume a domain has a 30 Domain Authority (per OSE). 2 - The site on the current domain has about 100 pages, all hand-coded. Things do very well in SEO because we designed it to do so. The site is about 6 years in the current incarnation, with a very simple e-commerce cart (again, basically hand-coded). I will not name the site for obvious reasons. 3 - Business is good. We're upgrading to a new CMS. (Hooray!) In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assuming we have 500 products and 100 categories, that yields at least 50,000 pages; with other aspects of the faceted search, it could easily create 10X that many pages. 4 - In ScreamingFrog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. ScreamingFrog has also been known to crash while spidering, and we've discovered thousands of URLs of live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, and we can see that both on our DEV site as well as out in the wild (in Google's Supplemental Index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation, like burning up a theoretical "crawl budget," having the bots miss pages, or other negative consequences? 6 - Is it also possible that bumping a site that ranks well for 100 pages up to 10,000 pages or more might well incur a linkjuice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound linkjuice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal page linkjuice, but what are the actual big-dog issues here? So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
Algorithm Updates | | seo_plus0 -
The evolution of Google's 'Quality' filters - Do thin product pages still need noindex?
I'm hoping that Mozzers can weigh in with any recent experiences with eCommerce SEO... I like to assume (perhaps incorrectly) that Google's 'Quality' filters (formerly known as Panda) have evolved with some intelligence since Panda first launched and started penalising eCommerce sites for having thin product pages. On this basis I'd expect that the filters are now less heavy-handed and know that product pages with little or no product description on them can still be a quality user experience for people who want to buy that product. Therefore my question is this:
Do thin product pages still need noindex, given that more often than not they are a quality search result for those using a product-specific search query? Has anyone experienced a penalty recently (last 12 months) on an eCommerce site because of a high number of thin product pages?
Algorithm Updates | | QubaSEO0 -
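For readers following along, the noindex being discussed is a per-page robots meta tag; a generic illustration (not specific to any site in this thread) on a thin product page would be:

```html
<!-- In the <head> of a thin product page: keep it out of Google's
     index, but still let crawlers follow the links on the page -->
<meta name="robots" content="noindex, follow" />
```

One caveat: the page must remain crawlable (i.e., not blocked in robots.txt), otherwise Googlebot never fetches the page and never sees the directive.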
Recent Google algorithm update?
Two of our clients have experienced a huge dip in organic rankings during the past week or so and we haven't done anything that would cause this. Have there been any major Google changes reported lately? I'm not seeing anything reported here: https://moz.com/google-algorithm-change. Thanks for your input. Eric
Algorithm Updates | | EricFish0 -
Did anyone else notice all their keyword rankings go down after the last Panda refresh on January 17th 2013?
Even before January 17th I noticed my keyword rankings slowly going from the top 3 to around 8, 9 and 10. Then between January 15th and January 30th (SEOmoz is not showing the exact date) they all went down to the second page and worse. The rankings dropped for an e-commerce website, petsspark.com. They sell a tear stain removal product, which is a pretty competitive market. After January I started to notice that Google was ranking blogs, forums, general product review websites and, of course, Amazon better than me and my competitors. Was anyone else affected by the Panda refresh, or does anyone have any idea what may have gone wrong? Please help
Algorithm Updates | | DTOSI1 -
How could Google define "low quality experience merchants"?
Matt Cutts mentioned at SXSW that Google wants to take into consideration the quality of the experience eCommerce merchants provide and work this into how they rank in SERPs. Here's what he said if you missed it: "We have a potential launch later this year, maybe a little bit sooner, looking at the quality of merchants and whether we can do a better job on that, because we don't want low quality experience merchants to be ranking in the search results." My question: how exactly could Google decide whether a merchant provides a low or high quality experience? I would imagine it would be very easy for Google to decide this for merchants in their Trusted Store program. I wonder what other data sets Google could realistically rely upon to make such a judgment. Any ideas or thoughts are appreciated.
Algorithm Updates | | BrianSaxon0 -
Any PR Loss? Google Made an Update! Heavy Traffic, Followed SEOmoz Tips - Dropped to PR4?
I followed the rules to minimize the links on the page. I'm getting the same traffic to my blog, and it has only increased. But my PR5 dropped to PR4; why? I even got my 404 errors down to 5 or 6, which I have now fixed, and I haven't accepted any text link ads for the past 6 months either!
Algorithm Updates | | Esaky0 -
Panda / Penguin Behavior ? Recovery?
Our site took a major fall on March 23rd, i.e. Panda 3.4, and then another, smaller one on April 24th, i.e. Penguin. I have posted a few times in here trying to get help on which items to focus on. I've been doing this for 13 years, white hat, never chased algos, but of course learned as I went. As soon as the fall hit, one expert said it was links, which I kind of doubted because we never went after them; we have some, but only a handful in comparison to really good authoritative links. I concentrated on cleaning up duplicate content due to tags in a blog that only had 7 posts (an add-on section to the site), then focused efforts on just going through and making content better. After 6 weeks there was no movement back up. Another expert here said yes, he saw some bad links, so I should check it out. So, back to focusing on links: I ran a report and discovered questionable links, and successfully got about 25 removed. Low numbers, but we have only about 50 that were questionable. There's no contact info on the other directories, so I guess we are stuck. Here is where I just go in circles... When our site fell on March 23rd, we had 13 of our main pages still ranking at number 1 and 2 for their keyword phrases. Penguin hit and they fell about 10 spots. EXCEPT one... This one keyword phrase and page stayed on top and ranked at #1 through the storm (it finally fell to #4, but still remains up there). The whole site is down 90%; we only have 3 keyword phrases really ranking well out of 250. The mystery is that the keyword phrase that kept ranking was the one that supposedly had way over the recommended percentage of anchor text; 7% of our links go to that page. The other pages that fell on Penguin had no pages linking back. I have been adding blog posts to our site; I post one and in a few days it gets indexed. One of those is ranking at #2 for its keyword, moved up from #4 a week after posting it in the blog.
(A Google search shows 80K results.) It just seems like the site should bounce back if new content is able to rank, so why not the old? Did other people hit by Panda and Penguin see a sitewide fall, or are they still ranking for some terms? I would love to see some discussion of success stories of bouncing back after Panda and Penguin. I see the WP success story, but that was pretty sudden after it was brought to Google's attention. I'm looking for that small business that fixed something and saw improvement. Give me some hope here, please.
Algorithm Updates | | Force70 -
Data on Google Vs Bing, et al and changes to sites.
I am curious to know if anyone has any data that correlates site/page changes, like content or title tag, H1, etc., with subsequent movement in rankings on Google, Bing and Yahoo. The scenario is, for example: ABCSite.com/home-page/ makes a change to the H1 and H2, and one paragraph of content is changed. Over the next 6 to 12 weeks, changes in the page's rankings on the 3 engines are tracked to see where it started and where it "stopped." Obviously, there are more factors than the individual algorithms in play here. An example of that would be that a significant number of sites will be indexed in Google by a dev and not in the others. We see this regularly. So, at least from a timing standpoint, different sites are entering/leaving the fray at different rates. We are going to begin to track this, but I would love to see any data already around, or to speak with anyone involved in such a study about what they found. Thanks
Algorithm Updates | | RobertFisher0