Panda Prevention Plan (PPP)
-
Hi SEOmozzers,
I'm planning to prepare for Panda deployment by creating a checklist of things to do in SEO to prevent massive traffic loss.
I would like to share these ideas with the SEOmoz community and SEOmoz staff in order to build help resources for other marketers.
Here are some ideas for content websites:
- the main one is to block duplicate content (robots.txt, noindex tag, or rel=canonical, depending on the case)
- the same applies to very low-quality content (questions/answers, forums): add a canonical link or a noindex tag on threads with few answers
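As a sketch of the blocking options listed above (all URLs and paths are placeholders):

```html
<!-- In the <head> of a thin Q&A thread: keep it out of the index,
     but let crawlers still follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Or, when a thread duplicates a stronger version of the same content,
     point at the canonical URL instead -->
<link rel="canonical" href="https://www.example.com/forum/thread-123/">
```

One caveat worth noting: robots.txt only blocks crawling, and a disallowed URL can still be indexed from external links, so for pages that must stay out of the index the noindex tag (on a crawlable page) is the safer tool.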
-
Thanks for your help.
Big content websites, however established they are, are always prey to the next Panda tweak. Google is sending this Panda message to prepare websites that rely heavily on UGC and forums, so it can tell their official high-quality content apart from SEO content that is on its way to becoming official.
That is the reason I need to build a Panda Prevention Plan.
-
Prepare original, high quality content and claim it with rel="author".
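For reference, the authorship markup that answer refers to looked like this at the time (the profile URL and author name are placeholders):

```html
<!-- On the article page: link the byline to the author's profile -->
<a href="https://plus.google.com/112345678901234567890" rel="author">By Jane Doe</a>

<!-- Or site-wide, in the <head>: -->
<link rel="author" href="https://plus.google.com/112345678901234567890">
```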
-
Personally, I don't think Panda is something that is ever going to happen again. There are always going to be algo tweaks, but this was something different.
I think it's always wise to block duplicate content and improve the quality of your content, but then this is just good practice all around!
Related Questions
-
Is the robots meta tag more reliable than robots.txt at preventing indexing by Google?
What's your experience of using the robots meta tag vs. robots.txt as a stand-alone solution to prevent Google indexing? I'm pretty sure the robots meta tag is more reliable - going on my own experience, I have never had any problems with robots meta tags, but plenty with robots.txt as a stand-alone solution. Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart1
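The key difference between the two, sketched with placeholder paths: robots.txt prevents crawling, but a disallowed URL can still appear in Google's index (e.g. when other sites link to it), whereas a meta noindex reliably removes the page once Google crawls it - the catch being that the page must not also be blocked in robots.txt, or the tag will never be seen.

```text
# robots.txt - blocks crawling; does NOT guarantee the URL stays out of the index
User-agent: *
Disallow: /private/
```

```html
<!-- meta robots - the page must remain crawlable so Google can see this tag -->
<meta name="robots" content="noindex">
```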
Website still not recovered from Panda #20 (Sep 2012 update)
Hi everyone, my website was hit by Panda around the 27th of September 2012 (Panda #20 or EMD); since then, it's no longer in Google search results for a particular keyword [wallpapers], resulting in a massive, sudden traffic drop (-90%) (see the screenshot below). Despite my best efforts auditing my links, identifying unnatural backlinks, disavowing bad links, enhancing my website content, improving user experience... (I even ended up with a completely revamped website: new design, new structure and new content), I didn't see any improvement! Can you please look at it and advise me? I am ready to give up; I am in deep despair. What are my competitors doing better than me? (Competitor #1, Competitor #2) Thank you in advance - I appreciate your time. My website: http://goo.gl/maaxaz
Intermediate & Advanced SEO | Spinodza0
With the new Panda update supposedly only weeks away, is it wise to noindex products whose descriptions I have not had time to rewrite?
Hi Mozzers, I read on SEJ yesterday that the Panda update is apparently due in the next 2 to 4 weeks. I still have a large number of products for which I have not got around to writing unique descriptions. I know these product descriptions are duplicated on other affiliate sites, so in light of the coming Panda update, would it be wise to put a noindex meta tag on these product pages until I get around to rewriting the descriptions? That way, I may not get hit by Panda, and it will buy me a bit more time. Just an idea, but I thought I'd run it by you. Thanks, Pete
Intermediate & Advanced SEO | PeteC120
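One way to implement this without touching every page by hand is a template-level switch. A minimal sketch in Python, where the `has_unique_description` field is a hypothetical flag on each product record:

```python
# Emit a per-product robots meta tag: noindex the stock affiliate copy,
# but keep "follow" so link equity still flows through those pages.
def robots_meta(product):
    if product.get("has_unique_description"):
        return '<meta name="robots" content="index, follow">'
    return '<meta name="robots" content="noindex, follow">'

print(robots_meta({"has_unique_description": False}))
# -> <meta name="robots" content="noindex, follow">
```

Flipping the flag as each description gets rewritten lets the pages re-enter the index one at a time, with no further template changes.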
How do I know whether my website was hit by Panda or Penguin?
My website's traffic and keyword rankings are dropping day by day. How can I tell whether the site was hit by Panda or by Penguin? The website is 24hourpassportandvisas. com
Intermediate & Advanced SEO | bondhoward0
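A rough first check is to line the traffic drop up against the dates of known algorithm updates. A small illustrative sketch - the date list here is just a sample of well-documented rollouts, so consult a maintained algorithm-change history for the full timeline:

```python
from datetime import date

# A few well-documented update dates (illustrative, not exhaustive).
UPDATES = {
    "Panda #20": date(2012, 9, 27),
    "EMD update": date(2012, 9, 28),
    "Penguin 1.0": date(2012, 4, 24),
}

def likely_updates(drop_date, window_days=3):
    """Return the updates whose rollout falls within window_days of the drop."""
    return [name for name, day in UPDATES.items()
            if abs((drop_date - day).days) <= window_days]

print(likely_updates(date(2012, 9, 28)))
# -> ['Panda #20', 'EMD update']
```

If the drop lines up with a Panda date, look at content quality; if it lines up with a Penguin date, look at the link profile.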
How do I prevent Google and Moz from counting pages as duplicates?
I have 130,000 profiles on my site. When you're not connected to them, they have very few differences: a bot (not logged in, etc.) will see a login form and "Connect to Profilename". Moz and Google call the pages the same, even though they're unique, such as example.com/id/328/name-of-this-group and example.com/id/87323/name-of-a-different-group. So how do I differentiate them? Can I use schema or something to help identify that these are profile pages, or that the content on them should be ignored since it's help text, etc.? Take Facebook - each Facebook profile for a name renders simple results: https://www.facebook.com/public/John-Smith https://www.facebook.com/family/Smith/ Would that be duplicate data if Facebook had a "Why to join" article on all of those pages?
Intermediate & Advanced SEO | inmn0
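Since these profiles are genuinely distinct pages (so pointing them all at one canonical would be wrong), one option is to give each a unique title and a self-referencing canonical, plus schema.org ProfilePage markup so crawlers can tell the page's real subject apart from the boilerplate login form. A sketch using one of the example URLs from the question:

```html
<!-- Unique per-profile title and self-referencing canonical -->
<title>Name of This Group - Profiles</title>
<link rel="canonical" href="https://example.com/id/328/name-of-this-group">

<!-- schema.org ProfilePage markup identifying the page's real subject -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ProfilePage",
  "mainEntity": {
    "@type": "Organization",
    "name": "Name of This Group"
  }
}
</script>
```

Structured data won't make crawlers ignore the repeated help text, but unique titles and canonicals at least give them something concrete to differentiate on.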
HELP! How does one prevent regional pages from being counted as "duplicate content," "duplicate meta descriptions," et cetera?
The organization I am working with has multiple versions of its website geared towards different regions. US - http://www.orionhealth.com/ CA - http://www.orionhealth.com/ca/ DE - http://www.orionhealth.com/de/ UK - http://www.orionhealth.com/uk/ AU - http://www.orionhealth.com/au/ NZ - http://www.orionhealth.com/nz/ Some of these sites have very similar pages, which are registering as duplicate content, meta descriptions and titles. Two examples are: http://www.orionhealth.com/terms-and-conditions http://www.orionhealth.com/uk/terms-and-conditions Now even though the content is the same, the navigation is different, since each region has different product options / services, so a redirect won't work: the navigation on the main US site is different from the navigation for the UK site. A rel=canonical seems like a viable option, but (correct me if I'm wrong) it tells search engines to only index the main page - in this case, the US version - but I still want the UK site to appear to search engines. So what is the proper way of treating similar pages across different regional directories? Any insight would be GREATLY appreciated! Thank you!
Intermediate & Advanced SEO | Scratch_MM0
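For intentional regional variants like these, hreflang annotations are usually a better fit than rel=canonical, since canonicalizing everything to the US page would indeed drop the UK page from the index. A sketch using the two terms-and-conditions URLs from the question:

```html
<!-- Placed on BOTH the US and UK pages: each variant lists all alternates,
     so search engines treat them as deliberate regional versions -->
<link rel="alternate" hreflang="en-us" href="http://www.orionhealth.com/terms-and-conditions" />
<link rel="alternate" hreflang="en-gb" href="http://www.orionhealth.com/uk/terms-and-conditions" />
<link rel="alternate" hreflang="x-default" href="http://www.orionhealth.com/terms-and-conditions" />
```

Both pages can then stay indexed, each served to its own regional audience.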
SEO timeline / action plans
Hi, my first question here, so thanks in advance for any responses. I have built websites and done basic SEO for a while, but I am now making a real concerted effort to implement everything I learn from SEOmoz. I am wondering if there is any advice or a general timeline of items to implement on existing websites or brand-new websites. For example, xxx action in week 1, xxx action in week 2, etc. A sort of crib sheet or tick list, which I think would help reduce errors and help schedule items for future planning, especially in larger groups of people. If there is anything or anywhere that might be of help, that would be great - or if anyone has a rough plan they stick to, please share.
Intermediate & Advanced SEO | wtfi
Why specify 'robots' instead of 'googlebot' for a Panda-affected site?
Daniweb is the poster child for sites that have recovered from Panda. I know one strategy she mentioned was de-indexing all of her tagged content, for example: http://www.daniweb.com/tags/database. Why do you think more Panda-affected sites aren't specifying 'googlebot' rather than 'robots', to keep capturing traffic from Bing & Yahoo?
Intermediate & Advanced SEO | nicole.healthline0
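The mechanics behind the question, as a sketch: a crawler-specific meta tag only binds the named bot, so a googlebot noindex removes the page from Google while Bingbot and other crawlers see no restriction and can keep indexing it - exactly the trade-off a Panda-hit site might want.

```html
<!-- Only Google's crawler obeys this; Bingbot and others ignore it -->
<meta name="googlebot" content="noindex">

<!-- The generic form applies to all compliant crawlers -->
<meta name="robots" content="noindex">
```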