Panda Prevention Plan (PPP)
-
Hi SEOMOzers,
I'm planning to prepare Panda deployment, by creating a check-list from thinks to do in SEO to prevent mass trafic pert.
I would like to spread these ideas with SEOMoz community and SEOMoz staff in order to build help ressources for other marketers.
Here are some ideas for content website :
- the main one is to block duplicate content (robots.txt, noindex tag, according to the different canonical case)
- same issue on very low quality content (questions / answers, forums), by inserting canonical redirect or noindex on threads with few answers
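As a rough sketch of those two checklist items (the paths here are hypothetical, not from any real site), robots.txt can keep crawlers out of duplicate URL variants:

```text
# robots.txt, hypothetical paths: block crawling of duplicate
# URL variants such as print pages, session IDs and site search
User-agent: *
Disallow: /print/
Disallow: /*?sessionid=
Disallow: /search/
```

Note that robots.txt blocks crawling, not indexing: a thin forum thread that should drop out of the index needs `<meta name="robots" content="noindex,follow">` in its head, and must stay crawlable so Google can actually see that tag; blocking it in robots.txt would hide the noindex.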
-
Thanks for your help.
Big content websites, official as they are, are always prey to the next Panda tweak. Google sends this Panda message to warn websites that rely heavily on UGC and forums, so it can tell their official high-quality content apart from SEO content that is on its way to becoming official.
That is the reason I need to build a Panda Prevention Plan.
-
Prepare original, high quality content and claim it with rel="author".
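For reference, a minimal sketch of the rel="author" markup as Google supported it at the time (the profile URL is a placeholder; the scheme relied on linking to a Google+ profile that linked back to the site):

```html
<!-- In the article body: link the post to the author's profile -->
<a href="https://plus.google.com/110000000000000000000" rel="author">
  About the Author
</a>
```

The same association can also be declared with a `<link rel="author" href="...">` element in the page head.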
-
Personally, I don't think Panda is something that is ever going to happen again. There are always going to be algo tweaks, but this was something different.
I think it's always wise to block duplicate content and improve the quality of your content, but then this is just good practice all around!
-
Related Questions
-
Phasing in new website on www2 domain - 301 plan
Hi, I work for a large company and we're planning to phase in a new website. The idea is to develop key journeys on the new site and serve them on a www2 domain, removing them from the old website which is served on the www domain.

The reason for this is because the old website is over 2,000 pages, and the management want to see new, improved journeys sooner rather than later. So, rather than launching all new pages and journeys at the same time, which will take a long time to design and develop, key journeys will move across to the new site / design sooner and be made available to visitors. Whilst the overall journey might be a bit disjointed in parts (i.e. sending people from old to new site, and vice versa) I can't see a better way of doing it...

Once all new content is complete, 301s will be implemented from old content on www. to new content on www2. Once the phasing is complete, and all new content is in place on www2, 301s will be implemented to point everything back to www.

Does anybody see any problems with this approach? Or any ideas on how to better handle this situation? Thanks Mozzers!
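In Apache mod_rewrite terms, the phasing described above might look something like this sketch (the domain and the /quote/ journey path are placeholders, not from the question). The main thing to watch is that the final redirects go straight to the end URL rather than chaining through www2:

```apache
# Hypothetical config on www.example.com
RewriteEngine On

# Phase 1: a migrated journey 301s from www to www2
RewriteRule ^quote/(.*)$ http://www2.example.com/quote/$1 [R=301,L]

# Final phase (on www2, once everything is rebuilt): point it all
# back to www. Map each old URL to its FINAL www URL in one hop,
# otherwise the interim rules create chains (www -> www2 -> www).
# RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

This is only a sketch of the mechanics; the real risk in the plan is accumulating two hops per URL, so each phase's rules should be retired as the next phase lands.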
Intermediate & Advanced SEO | | RWesley0 -
Ecombuffet.com are offering a Rescue Review focused on Panda and Penguin and on identifying issues. Has anyone used this service, or is anyone aware of the organisation in general?
http://www.ecombuffet.com/rescue-review.htm. I have 2 sites that have definitely been hit by Penguin and are getting worse, so I am thinking of paying for this service as nothing I do seems to stop the slide (it's more like a plummet). Any comments welcome.
Intermediate & Advanced SEO | | Shaann1 -
Sub domain will not index - Next plan of action?
I'm not sure exactly what option I should take next, but I'll run you through a few points:
- The page is optimized to a rank "A"
- The page has 350 backlinks
- A strong social presence
- Interlinking pages
- High domain authority, an OK page authority
- The domain ranks highly
- Every other subdomain ranks highly

When I make a search, the first page that ranks for this domain is a product page within the exact subdomain I'm trying to rank for, followed by some external blogs I've written and then the rest of the product pages. I've submitted the URL to Webmaster Tools twice and yet it still will not rank for that keyword. The only time I see the page indexed is if I copy the exact URL into Google. Any help on this would be greatly appreciated. Thanks
Intermediate & Advanced SEO | | Martin_Harris0 -
What About Google Panda Update 22?
Maybe I haven't found the threads or whatever, but I haven't seen many posts about the latest Google Panda update from November 21-22 on SEOmoz. Panda 22 is not even listed here: http://www.seomoz.org/google-algorithm-change

Until November 21st, Google killed 3 of the 5 websites I own through their Panda updates (I never got hit by Penguin updates as I have only original content), accounting for about 25% of my income. Fortunately, the 2 remaining websites gained more traffic throughout the summer of 2012, so my income got back to almost 100% even though I got the "Unnatural Links" warning in Google Webmaster Tools in July. Since then, I did a huge link cleanup and, according to the Link Detox Tool (from another SEO service), the number of "toxic links" went from about 350 to 50. The backlink report is as follows: 8% (52) toxic links; 57% (382) suspicious links; 35% (235) healthy links. Out of the 382 suspicious links, most are coming from the same domain and they are all directories to which my website was submitted automatically (not using any specific keyword anchor). On the opposite side, the healthy links are coming from different domains, so I like to think they have a stronger impact than the suspicious links.

That said, my two remaining websites were still doing well until November 21, when they got hit by the Panda. Now traffic has dropped by 55% and income has dropped by 75% (yes, I'll have to look for a job within a year if I don't fix this). (I want to add that none of my websites are "thin websites". One has over 1,500 pages of content and the other has about 500 pages. All websites have content added 3 to 5 times a week.)

What I don't get is that all my "money keywords" are still ranked in the top 10 results on Google according to the multiple tools / services I use, yet impressions dropped by 50% to 75% for those keywords?!? I have a feeling that this time it's not only a drop in ranking; there's a drop in impressions caused by something else. Is it caused by an emphasis on local search? Are they showing more ads and fewer organic results?

But here's the "funny part": for the last 5 years, I was never able to advertise my website on Google AdWords. Each time, I got a quality score of about 4/10, only to see it drop to 1/10 within a few hours of launching the campaign. On November 22nd, I built new PPC campaigns based on the exact same PPC campaigns I had in the past (same keywords, same ads, same landing pages). Guess what? Now the quality score is between 7/10 and 10/10 (most of them have 10/10) for the exact same PPC campaign! What a "coincidence", huh?
Intermediate & Advanced SEO | | sbrault740 -
Recovery Steps For Panda 3.5 (Rel. Apr. 19, 2012)?
I'm asking people who have recovered from Panda to share what criteria they used, especially on sites that are not large-scale ecommerce sites.

The blog was hit by Panda 3.5. It has approximately 250 posts. Some of the posts are the most thorough on the subject and regained traffic despite a Penguin mauling a few days after the Panda attack. (The site has probably regained 80% of the traffic it lost since Penguin hit, without any link removal or link building, and with minimal new content.) Bounce rate is 80% and average time on page is 2:00 min. (Even my most productive pages tend to have very high bounce rates, BUT those pages maintain time on page in the 4 to 12 minute range.)

The Panda discussions I've read on these boards seem to focus on ecommerce sites with extremely thin content. I assume that Google views much of my content as "thin" too. But my site seems to need a pruning, instead of just combining the blue model, white model, and red model all on one page like most of the ecommerce sites we've discussed.

So, I'm asking people who have recovered from Panda to share what criteria they used to decide whether to combine a page, prune a page, etc. After I combine any series articles into one long post (driving the time on page to nice levels), I plan to prune the remaining pages that have poor time on page and/or bounce rates. Regardless of the analytics, I plan to keep the "thin" pages that are essential for readers to understand the subject matter of the blog. (I'll work on fleshing out the content or producing videos for those pages.)

How deep should I prune on the first cut? 5%? 10%? Even more? Should I focus on the pages with the worst bounce rates, the worst time on page, or try some of both? If I post unique and informative video content (hosted on site using Wistia), what should I expect for a range of the decrease in bounce rate?

Thanks for reading this long post.
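To make the pruning criteria concrete, here is a minimal sketch (the function name, sample URLs, and numbers are all hypothetical) that takes analytics rows as (URL, bounce rate, average time on page in seconds), whitelists pages essential to the subject, ranks the rest worst-first, and returns the worst slice for the first pruning cut:

```python
# Sketch: pick prune candidates from exported analytics data.
# Essential "thin" pages go in the keep set and are never pruned.

def prune_candidates(pages, keep, fraction=0.05):
    """pages: list of (url, bounce_rate, avg_time_on_page_seconds).
    keep: set of URLs that must survive regardless of stats.
    Returns the worst `fraction` of the prunable pages, ranked by
    highest bounce rate, breaking ties by lowest time on page."""
    prunable = [p for p in pages if p[0] not in keep]
    # Sort worst-first: negate bounce rate so high bounce sorts first.
    ranked = sorted(prunable, key=lambda p: (-p[1], p[2]))
    cutoff = max(1, int(len(prunable) * fraction))
    return [url for url, _, _ in ranked[:cutoff]]

pages = [
    ("/guide-overview", 0.80, 240),  # high bounce but long reads: keep
    ("/thin-post-a",    0.92, 15),
    ("/thin-post-b",    0.88, 20),
    ("/series-part-1",  0.75, 130),
]
print(prune_candidates(pages, keep={"/guide-overview"}, fraction=0.5))
# → ['/thin-post-a']
```

The 5% vs. 10% question from the post is just the `fraction` argument here; starting small and re-measuring before the next cut keeps the experiment reversible.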
Intermediate & Advanced SEO | | JustDucky0 -
Were small sites hit by Panda?
It seems that primarily large sites were hit by Panda, but does anyone know of / own a small site that was hit by Panda?
Intermediate & Advanced SEO | | nicole.healthline0 -
How to compete with duplicate content in a post-Panda world?
I want to fix the duplicate content issues on my eCommerce website. I have read a very valuable blog post on SEOmoz regarding duplicate content in the post-Panda world and applied every strategy to my website. I want to give one example to explain it: http://www.vistastores.com/outdoor-umbrellas

Non-WWW version: http://vistastores.com/outdoor-umbrellas redirects to the home page.

HTTPS pages: https://www.vistastores.com/outdoor-umbrellas
I have created a robots.txt file for all HTTPS pages, as follows: https://www.vistastores.com/robots.txt
And I have set rel=canonical to the HTTP page, as follows: http://www.vistastores.com/outdoor-umbrellas

Narrow-by search: my website has narrow-by search and contains pages with the same meta info, as follows:
http://www.vistastores.com/outdoor-umbrellas?cat=7
http://www.vistastores.com/outdoor-umbrellas?manufacturer=Bond+MFG
http://www.vistastores.com/outdoor-umbrellas?finish_search=Aluminum
I have restricted all the dynamic pages generated by narrow-by search with robots.txt (http://www.vistastores.com/robots.txt), and I have set rel=canonical to the base URL on each dynamic page.

Order-by pages: http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name
I have restricted all these pages with robots.txt and set rel=canonical to the base URL.

Pagination pages: http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name&p=2
I have restricted all these pages with robots.txt and set rel=next & rel=prev on all paginated pages. I have also set rel=canonical to the base URL.

I have done and applied all these SEO suggestions to my website, but Google is crawling and indexing 21K+ pages. My website has only 9K product pages. Google search result: https://www.google.com/search?num=100&hl=en&safe=off&pws=0&gl=US&q=site:www.vistastores.com&biw=1366&bih=520

In the last 7 days, my website's impressions & CTR have dropped by 75%. I want to recover and perform as well as before. I have explained my question at length because I want to recover my traffic as soon as possible.
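One thing worth double-checking in the setup above: a URL that is disallowed in robots.txt is never fetched, so Google cannot see the rel=canonical (or rel=prev/next) placed on it, and blocked URLs can stay indexed from links alone; that would be consistent with the 21K+ indexed pages. This is only a sketch of one possible diagnosis, not a confirmed cause, but with crawling left open the tags would look like:

```html
<!-- On a faceted page such as /outdoor-umbrellas?cat=7 -->
<!-- only effective if the URL is NOT disallowed in robots.txt -->
<link rel="canonical" href="http://www.vistastores.com/outdoor-umbrellas" />

<!-- On a paginated page such as ...?p=2: declare the sequence. -->
<!-- Combining prev/next with a canonical to page 1 sends mixed signals. -->
<link rel="prev" href="http://www.vistastores.com/outdoor-umbrellas" />
<link rel="next" href="http://www.vistastores.com/outdoor-umbrellas?p=3" />
```

The `?p=3` URL is an assumed continuation of the pattern in the question, for illustration only.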
Intermediate & Advanced SEO | | CommercePundit0 -
Blog - keep it on the domain or place it on a separate site, now that Panda ranks for bounce, time on page, depth of visit?
Over 10 years ago, we decided to run our blog external to our main website. Contrary to conventional wisdom then, we thought we'd have more control/opportunities for generating external anchor text links, plus the benefit of working in a bona fide blog software environment (WP). As we had hoped, the blog generated a lot of strong inbound links, captured inbound links of its own from other sites and, I think, helped improve our SERPs and traffic. Once the blog was established, and with the redesign of the website, we capitulated and finally moved the blog onto the main domain.

After reading a number of pieces on Panda and the new reality of SEO, it sounds like bounce rate (in particular), time on page, and other GA measures may now have a more profound influence on Google rankings. Given that blogs are notorious for high bounce rates (ours is high), low time on site, and shallow depth of visit, it seems logical that the blog adversely affects our site averages for the main domain. Is it time to reconsider pulling our blog off the main domain to reassert the 'true' GA measures of the main domain?

I guess it still gets down to the question... is the advantage of all the inbound links to the blog on the main domain of greater value than moving the blog off-site and reasserting better 'site stats' for Google's Panda algo? Thanks.
Intermediate & Advanced SEO | | ahw0