Panda recovery. Is it possible?
-
Dear all,
To begin, English is not my native language, so I'm sorry in advance for any mistakes.
On 23 March 2012, Panda hit my website, Avec-Reduction (dot com for the URL), a coupon website. On that date I lost more than 70% of my traffic.
The structure of the website was like an e-commerce site: categories -> merchant page -> coupon page. The content was too thin for Google; I agree with that.
So, in May, I built a new version. Here are the most important modifications:
- A smaller header (100px less in height).
- A 2-column layout (the old website had 3 columns).
- I removed the category menu that listed every category, as well as the alphabetical menu.
- Fewer ads on the website (a few days ago I also removed the 2 AdSense blocks).
- Coupons used to be promoted with merchant thumbnails in the home page listing; now I have a few text-only top lists.
- I deleted all the category pages (one page per merchant category, listing every merchant in that category). Now there is only one page for this, and the same goes for the alphabetical pages. All the deleted pages are 301 redirected, and the 2 new pages (category page and alphabetical page) are set to noindex (see the first sketch after this list).
- I deleted all the individual promo code pages; all the coupons now live on the merchant page (301 redirects used).
- I created an anti-spam system for the coupon review forms (I was getting a lot of spam on these forms, even though I cleaned them every day or two). Now I have no spam at all.
- Visitors can now leave a rating and a review for each merchant. This functionality is new, so there are not many reviews yet.
- All merchant pages without promo codes have a noindex in the robots meta tag.
- Since July, I can edit the "title" of each promo code, so I can personalize it. At the same time, to add more content, I can now publish sales and special deals for each merchant, not only promo codes.
- Affiliate links are generated in JavaScript and open in a new window via an internal redirect page that is set to noindex (see the second sketch after this list).
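To make the redirect/noindex changes more concrete, here is a minimal sketch of the logic. It is only an illustration: my site does not actually run on Node/Express, and the paths and helper names below are invented for the example, not the real URLs or code.

```typescript
import express from "express";

const app = express();

// Deleted per-category pages each 301-redirect to the single consolidated page.
// (Example paths only; the real URLs are different.)
const removedCategoryPages = ["/categorie/mode", "/categorie/voyage"];
for (const oldPath of removedCategoryPages) {
  app.get(oldPath, (_req, res) => {
    res.redirect(301, "/categories");
  });
}

// The two new consolidated listing pages are kept out of the index with an
// X-Robots-Tag header (equivalent to <meta name="robots" content="noindex, follow">).
for (const path of ["/categories", "/alphabetique"]) {
  app.get(path, (_req, res) => {
    res.set("X-Robots-Tag", "noindex, follow");
    res.send(renderListingPage(path)); // placeholder for the real template rendering
  });
}

// Placeholder renderer so the sketch is self-contained.
function renderListingPage(path: string): string {
  return `<html><body>Consolidated listing for ${path}</body></html>`;
}

app.listen(3000);
```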
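The affiliate link handling works roughly like the following browser-side sketch (again simplified and hypothetical: the data attribute name and the /go/ URL pattern are invented for illustration, and the real markup differs).

```typescript
// The clickable element carries no crawlable href to the affiliate network.
// Clicking it opens the internal /go/ redirect page, which is served with
// noindex and then forwards the visitor to the affiliate URL.
document.querySelectorAll<HTMLElement>("[data-merchant-id]").forEach((el) => {
  el.addEventListener("click", () => {
    const merchantId = el.dataset.merchantId;
    if (merchantId) {
      window.open(`/go/${merchantId}`, "_blank", "noopener");
    }
  });
});
```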
Those are the most important changes to my website. Page speed is also better (load time cut in half since July) because I optimized my images, CSS, JS, and so on.
At the end of July I had health problems and the website was not updated until the first days of October. Now the website is updated every day, but between July and October there was no Panda recovery.
I have no duplicate content and I try to add as much content as I can, so I don't understand why Google Panda keeps hitting me. Some of my competitors use heavy keyword stuffing (4, 5, 6, even 10 lines of it on each merchant page). Some of them only list affiliate merchants, use automatic scripts to push coupons onto their sites, or run several near-identical websites...
Less than 30% of my merchants are affiliates, I validate every coupon and promo manually, and I personalize all my coupons... So I don't understand what to do.
I would appreciate any help. If you see problems on my website, or if you know tips for a Panda recovery, I would be very happy to hear them.
Many thanks to everyone.
Sincerely,
Florent
-
Dear Edward,
You are right. I have seen some US/UK coupon websites with information about the company or the website (payment methods, shipping methods, etc.). I think it's a good idea for improving the content. In France, not many coupon websites offer this type of information. The market in our country is significant, but smaller than in countries like the USA. As a result, a lot of websites are built automatically; in a few days some webmasters spin up several coupon websites, and they make money because they have a powerful network behind them.
I think I will go this way in the near future. With 2,600 merchants it will take a long time to add all this information, but if it makes a Panda recovery possible, it's not hard work, just necessary.
Thanks for your help.
Sincerely,
F.
-
I've worked with a few coupon/promo code sites since the launch of Penguin, with some success and some failure. The biggest issues I've found across coupon sites are a lack of truly original content and very thin content, with pages frequently saying the same thing as other pages but slightly reworded. Duplicate content issues are usually common as well.
Ex: "Check out our coupon codes for [company/website]...[more filler text here]."
One strategy that seems to be fairly effective (and logical) for such sites is filling the retailer coupon pages with information relevant to the coupons (which obviously vary) as well as the company. Ex: company history, background, etc. -- content that's truly unique from page to page.
-
Dear eyepaq,
Many thanks for your reply, that's great.
Like you say, I'm sure that my changes are good for the future, even if not necessarily for the Panda filter (sorry for having used the word "penalty"; in France we use the word "filtre", and it's difficult to write in another language :p).
You don't have to be good to stay safe or recover from Panda; you need to be better than the rest of the sites covering the same niche.
I'm OK with that; it's the only point where I have no idea how to do better. I have looked at all the biggest US coupon websites for inspiration, but they are too big for me. Technically they are better, with better design, and I think they have a lot of people working on the website every day. In France there are fewer competitors: 5 big ones, and all the others are very simple websites like mine.
Panda is only about the content (site speed, affiliate format, links, etc. are not taken into account; those are covered by other filters and penalties).
I know, but better speed is good for visitors, and I think it's good to show Google that page speed matters to us.
Spot the ones that are still up and try to improve your content to be better than theirs, or take a different approach to the same niche/area in order to win one of the diversity spots. That way you will get back.
I will work on this :).
First, assess the situation and see if you can be better, in terms of content, than those that are still on top in visibility and are targeting more or less the same content as you. If you can't beat them, change your content strategy: approach the same content with a different format and flow, so that you make it into the top results as one of the sites that is part of the diversity results.
OK, but it's very difficult. There are 2 websites in my niche that get better traffic than me. Why these 2? Only because they have a simple design and not many options on the merchant pages. These 2 websites have problems I don't have, yet no Panda filter and better traffic. The reason? They are older than me and they have a lot of links (one has more than 1 million). So it's not very clean, but they rank well.
One last question: do you think it would be better for me to "encrypt" (obfuscate) the coupon codes in the HTML? Why? Because Google can see that we all publish the same codes. If I obfuscate the codes, perhaps it would look more like unique content? Do you think that's a good idea?
Once again, many thanks for your post. You are very clear, and you have given me another way of looking at my problem :).
Sincerely,
Florent
-
Hi Florent,
All the changes you made are very good and will certainly help your site, but not necessarily with the Panda issue.
When talking about Panda you need to consider a few things:
- Panda is not a penalty; it's a filter (a very important difference).
- You don't have to be good to stay safe or recover from Panda; you need to be better than the rest of the sites covering the same niche.
- Panda is only about the content (site speed, affiliate format, links, etc. are not taken into account; those are covered by other filters and penalties).
So, if you are 100% sure that Panda is to blame for your drop in rankings, you need to compare yourself with the competition first and see how you can be better than them.
Just put yourself in Google's shoes: if you have 10 sites in the same niche with more or less the same content, you want to keep 1 or 2, populate the rest of the results with diverse results, and push everything else down 50 positions or whatever.
If you are not in that set of 1 or 2, then you are one of the ones that just got moved back, way back (down).
Spot the ones that are still up and try to improve your content to be better than theirs, or take a different approach to the same niche/area in order to win one of the diversity spots. That way you will get back.
First, assess the situation and see if you can be better, in terms of content, than those that are still on top in visibility and are targeting more or less the same content as you. If you can't beat them, change your content strategy: approach the same content with a different format and flow, so that you make it into the top results as one of the sites that is part of the diversity results.
Hope it helps. Is that clear, or am I beating around the bush?
Cheers!