eCommerce site being "filtered" by the last Panda update - ideas and discussion
-
Hello fellow internet goers!
Just as a disclaimer: I have been following a number of discussions, articles, and posts trying to find a solution to this problem, but have yet to find anything conclusive, so I am reaching out to the community for help.
Before I get into the questions, I would like to provide some background:
I help a team manage and improve a number of medium-to-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ visits per day, depending on the site. Back in March, one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site.
We have around ten niche sites in total, and about seven of them share an identical code base (about an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us. Many meetings later, we attributed the "filter" to duplicate content stemming from our product database and written content (shared across all of our sites). We decided we would use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (like it was never "filtered"); however, the other two sites remain "under the thumb" of Google.
Now for some questions:
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Is it a coincidence that it was an exact 30 day "filter"?
Why has only one site recovered?
-
Thanks for your responses.
@EGOL - I would agree that merging the sites would be ideal given that they share such a large database. Unfortunately, this isn't an option for our company at this point in time. Acquiring new content for our product pages has been tossed around, but it would be a HUGE undertaking, so it's on the "back burner" for the moment.
@Ben Fox - We came to the conclusion that it was content because it was the only clear "offender" on the list of potential problems. However, the fact that only 3 of our sites got penalized perplexes me as well. It would have made more sense had all of our sites suffered a penalty (luckily only 3 did). One response I got from another forum was: since Google removed enough duplicate content (3 sites in our case), it deemed that the others were "original".
We didn't point canonicals to any one site (like 9 going to 1). We only added rel="canonical" to our manufacturer category pages (a small percentage of pages). Since some of our domains sell products that aren't "niche specific", we told these pages to send preference to their proper niche domain (hope that made sense).
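To make the setup concrete, here is a minimal sketch of the kind of cross-domain canonical tag described above (the domain and path are hypothetical, just for illustration, not our actual sites):

<link rel="canonical" href="https://www.example-niche-store.com/manufacturer/acme-widgets/" />

Placed in the head of the matching manufacturer category page on one of the other domains, this tells Google which version of the page to treat as the preferred copy for indexing.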
For discussion purposes, here is a response I got from another forum:
> Why has only one site recovered?
I suspect/assume the other sites will bounce back the same way after their own 30-day penalties expire.
> Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Maybe removing the first site allowed the scoring penalty applied to the other sites to shrink; as each site was removed from the results, the penalty applied to the others correspondingly shrank.
> Is it a coincidence that it was an exact 30 day "filter"?
No. 30 days is a common penalty length.
Does anyone agree with these? I've heard of the 30-day penalty before. If this is the case, then a warning from Google would be nice.
-
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Google can be slow to detect duplicate content and sometimes tolerates it.
Is it a coincidence that it was an exact 30 day "filter"?
Only Google knows.
Why has only one site recovered?
Only Google knows.
Google sees a lot of sites with the same content, and you say that these are "med-large" sites. If I were Google, I would say, "These are duplicate content; we aren't going to index all of them. Our searchers don't want to see ten sites with the same stuff."
If these were my sites I would merge all of them into one single site. If the content on that site was unique to me I would probably then put all of my efforts into promotion and informative content for the product lines.
If the content was on other sites that I don't own then my efforts would go mainly into making unique product content and informative content for the product lines.
Google has been squashing duplicate content for years. If you have it, and you place links between the sites, it is very likely that at least one of your sites will be demoted in Google or filtered - probably filtered. They don't want to spend their resources indexing ten duplicate sites. They would rather display unique sites to their searchers.
-
How did you decide that it was content causing the issue if only 3/10 of your sites were affected?
Also, when you added the rel=canonical, did 9 of your sites point to a primary site, and was this the site that recovered?
Related Questions
-
Do we take an SEO hit for having multiple URLs on an infinite scroll page vs. a site with many pages/URLs? If we do take a hit, how large a hit would we suffer?
We are redesigning a preschool website which has over 100 pages. We are looking at 2 options and want to make sure we meet the best user experience and SEO. Option 1 is to condense the site into perhaps 10 pages and window-shade the content. For instance, on the curriculum page there would be an overview, and each age group program would open via window shade. Option 2 is to have an overview and then have each age program link to its own page. Do we lose out on SEO if there are not unique URLs? Or is there a way using metatags or other programming to have the same effect?
Algorithm Updates | jgodwin
-
Penguin 3.0 Site Dropped after Update
Hi, we were hit by the Penguin update a long time ago, and we lost a lot of traffic/positions because of this. For a long time we worked really hard to identify all of our links that may have caused us to receive this penalty. After months of work we submitted the disavow file and reconsideration request, and in June 2014 we received confirmation from Google in Webmaster Tools that the manual spam action had been revoked. Over time we then started to receive more traffic and better positions in the SERPs; however, since Penguin 3.0 we have dropped again for a range of keywords, many going from page 1 to 2, or page 2 to 3/4. Any ideas what we should do here? Any help will be really appreciated, as I'm totally confused. We haven't done any link building at all since the penalty/recovery.
Algorithm Updates | AMG100
-
Authorship Photo not showing for the last 6 months now
Hi - Early last year we activated the authorship photo, which remained in the SERPs till Nov '12. However, since then the authorship photo is not showing, and neither are the schema rating tags showing in the search engines. We have tried many changes and fixes, still to no avail. We have written to Google twice about it (they had a link through which an author whose photo is not showing, even though everything is set up correctly, can send site links), but to no avail.
The rich snippet testing tool shows the authorship photo and schema rating. Even Google Custom Search on our own site's pages shows it. However, this does not translate into any of these being shown in actual Google search results. Our site is: http://www.mycarhelpline.com/
Sample links of rich snippets:
http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.mycarhelpline.com%2Findex.php%3Foption%3Dcom_easyblog%26view%3Dentry%26id%3D94%26Itemid%3D91&html=
http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.mycarhelpline.com%2Findex.php%3Foption%3Dcom_latestnews%26view%3Ddetail%26n_id%3D467%26Itemid%3D10&html=
Even Google Custom Search shows the schema rating tags for searched keywords like Ford Ecosport and Tata Nano Diesel. However, on an actual search the schema tags are not shown. Can anyone suggest what I am missing? I'm actually lost on this. Worse, our SERPs are somehow also slowly coming down for some of the main keywords too.
Algorithm Updates | Modi
-
Search Engines Traffic for New Site?
Hi, can anyone please tell me when a new website starts receiving traffic from the search engines? Regards
Algorithm Updates | kywebsol
-
Considering the Panda algorithm updates, would you recommend reducing high amounts of inbound links from a single website?
My website has a significant number of inbound links (1,000+) from a single website, due to a sponsorship-level contribution. Both my website and the other are authorities in the industry and in search results (PR of 5). Since even ethical websites can suffer a penalty from each iteration of Panda, I'm considering significantly reducing the number of links from this website. Do you think that measurable change would be seen favorably by Google, or would the drop in links be detrimental?
Algorithm Updates | steelintheair
-
Our Developer Site randomly drops 10+ places in Google searches for our Company Name. Why?
Hey everyone, At Betable, we have a player-facing site and a developer-facing site. We also have a developer-facing blog. We have this issue where our developer-facing site will randomly drop 10+ places in Google's Search results for the keyword "betable". This problem can be reproduced by others and in incognito mode, so it's not just one person's results. Furthermore, the developer-facing blog and our social media accounts all suddenly rank higher than the developer site. Even stranger, this problem randomly fixes itself after a few days. This has happened twice so far, and on each occasion there were no changes to the website that would have prompted a drop in rank. After the first drop, we did our best to neutralize any SEOMoz "red alerts" but to no avail, the drop happened again last week. Can someone help us understand what's going on? Are there ways to avoid this? Thanks, Tyler
Algorithm Updates | Betable
-
Did we get hit by Panda? What do we do?
Hello, here's our site: nlpca(dot)com. We had a big drop in rankings, going from about 19th to about 43rd for our main keyword, with significant drops in other keywords. This happened roughly 6 weeks ago. We thought it was being caused by either: placing keywords in titles before we had them in the content, or trying to rank for Utah keywords - we're the NLP Institute of California and we are in both places now, but the site talks mainly about California. We changed both of these things, and we're still at the low rankings. Will we move back up? What do we do? Would a backlink campaign be effective at this point?
Algorithm Updates | BobGW
-
Panda Update: Need your expertise...
Hi all, After the Panda update our website lost about 45% of its traffic from Google. It wasn't an instant drop; mostly it happened gradually over the last 5 months. Our keywords (all of them except the domain name) started to lose positions from the top 10 to now 40+, and all the recovery attempts we have made so far haven't really helped. At this moment it would be great to get some advice from top experts like you here. What we have done so far: we have gone through all the pages and removed the duplicate/redundant ones, we have refreshed the content on the main pages, and all pages now have canonical tags. Our website is www.PrintCountry.com. Thank you very much in advance for your time.
Algorithm Updates | gbssinc