I think Panda was a conspiracy.
-
It's just a theory, but I think that Panda was not really an algorithm update but rather a conspiracy.
Google went out of their way to announce that a new algorithm was being rolled out. The word on the street was that content farms would be affected. Low quality sites would be affected. Scrapers would be affected. So, everyone with decent sites sat back and said, "Ah...this will be good...my rankings will increase."
And then, the word started coming in that some really good sites took a massive hit. We've got a lot of theories on what could be causing the hit, but there doesn't seem to be an obvious fix.
Many of the key factors that have been suggested as causes of a site looking bad in Panda's eyes are present on one of my sites, yet this site actually increased in rankings after Panda.
So, this is my theory: I think that Google made some random changes that made no sense. They made changes that would cause some scraper sites to go down but they also knew that many decent sites would decline as well.
Why would they do this? The result is fantastic in Google's eyes. They have the whole world of web design doing all they can to create the BEST quality site possible. People are removing duplicate content, reducing ad clutter and generally polishing their sites. And this is the goal of Larry Page and Sergey Brin...to make it so that Google gives the user the BEST possible sites to match their query.
I think that a month or so from now there will be a sudden shift in the algo again and many of those decent sites will have their good rankings back again. The site owners will think it's because they put hard work into creating good quality, so they will be happy. And Google will be happy because the web is a better place.
What do you think?
-
hahahaha. I agree with you. First Google manipulates results to see if Bing is copying them. Then soon after, this update comes along telling everyone to take care to create their own content, optimize their sites for the robots, and never even think of duplicate content. hehehehe
-
Actually I really liked the "content registry" idea.
A library of content where you could register what you have created and, optionally, a link to where you want to be considered the main source.
At least it would be 10x more useful than the Google Knol idea..
-
I would pay a fee to protect my best content in the Google SERPs.
-
Actually you have a point there... if JCPenney was indeed the catalyst (which I could easily imagine it being), then the time between that and the update would surely mean it had to have been rushed. I never considered that before.
-
ha ha... I think they did rush this out.... they were quickly trying to pull up their pants after getting embarrassed by the JCPenney problem... they needed to bust a few heads quickly...
-
Ha ha, maybe
I think it's something infinitely less planned out, and that they simply rushed this change out the door without fully understanding what it would do to the SERPs.
Although I do think you're right that in a few months (in what will be claimed to be a second Panda sweep) things will go back and only the very worst offenders will stay penalised.
-
Yes I like the content registry idea! It would probably be necessary to pay for it as a service though, and to cover dupes that are okay, maybe they could just allow duplicates as long as they reference back to the source in the registry (for news, quotations, etc., where dupes can't be avoided).
-
Interesting ideas. Thanks for sharing them.
I think that Google is talking a lot about this as a "quality website update"... and that is getting them attention in the media but it is also kicking a lot of webmasters in the butt to clean up their websites.
I think that Google should make a "content registry" where I can submit my content and say "this is mine", and then copies or spins of that content will not get traction in the SERPs.
And, I think that they should take a closer look at websites in the AdSense program, because the ability to monetize crap and theft is driving a lot of bad odor in the SERPs.
-
Haha I like it!!
Well, if it's not what happened, they'll wish they thought of it anyway lol
My view on why other sites got hit is just that they had at least some links coming from sites that got hit... i.e. you've got 100 backlinks, 10 are from articles on article sites, the article sites get hit... you lose 10 backlinks (or at least lose some of the value from those backlinks)... hence, a good site takes a hit too
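The collateral-damage theory above can be sketched as a toy model: a good site's link value drops not because it was targeted, but because some of its backlinks came from pages that were hit. All the numbers and domain names here are made up for illustration, and real link valuation is obviously far more complex than a flat per-link value.

```python
# Toy model of the collateral-damage theory: sum up backlink value
# after links from devalued (hit) domains lose some or all of their value.

def remaining_link_value(links, devalued_sources, devaluation=1.0):
    """Sum link value after sources hit by the update lose value.

    links: list of (source_domain, value) pairs
    devalued_sources: set of domains assumed hit by the update
    devaluation: fraction of value lost by links from hit domains
    """
    total = 0.0
    for source, value in links:
        if source in devalued_sources:
            total += value * (1.0 - devaluation)
        else:
            total += value
    return total

# 100 backlinks of equal value, 10 of them from article sites.
links = [("article-site.com", 1.0)] * 10 + [("other.com", 1.0)] * 90
hit = {"article-site.com"}

print(remaining_link_value(links, hit))       # article links fully devalued -> 90.0
print(remaining_link_value(links, hit, 0.5))  # only half their value lost -> 95.0
```

Even a partial devaluation of a tenth of a site's links shaves off measurable value, which matches the "good site takes a hit too" observation.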
Related Questions
-
What is your hypothesis for why Panda/Penguin recoveries happen over months after an algorithm update rather than overnight?
We have experienced many scenarios where ranking recoveries from clear Panda and Penguin penalties on our sites don't necessarily happen with the launch of a Panda/Penguin update, but instead trickle back in over weeks and months after a confirmed algo update. A good example is shown in the image, which shows a Panda recovery for a high-volume keyword. What is your theory on why these ranking recoveries happen over weeks vs. instantly?
-
Panda 4.0 Suggestions
My site was hit pretty negatively by Panda 4.0 and I am at a loss for the best way to address it. I have read just about every article I can find, and I know there are some duplicate manufacturer product descriptions, but I don't hear many other ecoms complaining about Panda so I figure it must be something else. Also, the pages that seem most negatively affected are category and product list pages. Any help or suggestions would be much appreciated. Thanks! http://bit.ly/1plgOzM
-
Post Penguin & Panda update: what would be good SEO strategies for brand new sites?
Hi there. I have the luxury of launching a few sites after the Penguin and Panda updates, so I can start from scratch and hopefully do it right. I will get SEO companies to help me with this, so I just want to ask for advice on what would be a good strategy for a brand new site. My understanding of the new updates is this: content and user experience are important, like how long visitors spend on the site, how many pages they view, etc. Social media is important; we intend to engage FB and Twitter a lot. In New Zealand not too many people use Google+, so we will probably just concentrate on the first two. We will try to get people to share our website via social media, as apparently that is important. We should only concentrate on high quality backlinks with a good diverse set of anchor texts, but concentrate on branding rather than keywords. Am I correct to say that so far? If that is the principle, what would be the strategy to implement these goals? Links to any articles would also be great please. Love learning. I just want to do this right and hopefully future-proof the sites against updates as much as possible. I guess quality content and links will most likely be safe. Thank you for your help.
-
I think my inbound link anchor text looks unnatural to Google - how to fix?
Hi all, For a bit of background see this question I posted recently: http://www.seomoz.org/q/lost-over-65-of-organic-visits-since-sept-please-help From the responses there and from looking into my backlinks and my competitors', I can see an issue with the anchor text on my inbound links... nearly all keywords and very, very few brand names etc... From what I can gather (using Open Site Explorer) the page in question has: 1,100 inbound links from 900 domains. These use 90 different anchor texts. 106 of these links use my brand / website name in the anchor text. These 106 links are spread over 18 domains (73 from 1 directory). About 5-10% of the links are from directories; the rest are from what I would describe as "proper websites". From my very limited knowledge of this, the issue is that my brand / website should have a far higher ratio of links using it as the anchor text than any keyword... which, as you can see from the above, is not the case... If it wasn't for that 1 directory, there would only be 33 links with my brand from over 1,000... I need to start fixing this, but was wondering how to start... Below is a list of options I could try. I have no idea if these would help or hinder, so any advice you could give on the potential effects of the below options would be very helpful. Options (the below are hypothetical, I have no idea if I will be able to get it done - just thinking out loud here): Get as many as possible of the "directory" links removed. Remove keywords from 50-60% of links and replace with branding. Or try to add branding to 50-60% of the anchor texts, something like [Brand] + [keyword]. Forget about what's been done previously / changing it will not help in any way / and focus on branding in anchor text for any future link building? Thanks James
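The kind of anchor-text audit described above can be scripted against a backlink export rather than counted by hand. This is only a sketch: the row format, example anchors, and brand terms below are assumptions, so you'd adapt it to whatever columns your Open Site Explorer (or other tool's) export actually has.

```python
# Summarize an anchor-text profile: how many links/domains use branded
# anchors vs keyword anchors, given (anchor_text, source_domain) rows.
from collections import Counter

def anchor_profile(rows, brand_terms):
    """Count branded vs non-branded anchors from (anchor_text, domain) rows."""
    anchors = Counter()
    branded_links = 0
    branded_domains = set()
    for anchor, domain in rows:
        text = anchor.lower()
        anchors[text] += 1
        if any(term in text for term in brand_terms):
            branded_links += 1
            branded_domains.add(domain)
    total = sum(anchors.values())
    return {
        "total_links": total,
        "distinct_anchors": len(anchors),
        "branded_links": branded_links,
        "branded_domains": len(branded_domains),
        "branded_ratio": branded_links / total if total else 0.0,
    }

# Hypothetical rows standing in for a real backlink export:
rows = [
    ("cheap widgets", "dir1.com"),
    ("cheap widgets", "dir2.com"),
    ("ExampleBrand", "blog.com"),
    ("examplebrand widgets", "news.com"),
]
profile = anchor_profile(rows, brand_terms={"examplebrand"})
print(profile["branded_ratio"])  # 0.5 in this toy data
```

Running this over the full export would give you the branded ratio (roughly 106/1,100 in the question above) and let you re-check it after any cleanup, e.g. with that one directory's domain excluded.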
-
Considering the Panda algorithm updates, would you recommend reducing high amounts of inbound links from a single website?
My website has a significant number of inbound links (1,000+) from a single website, due to a sponsorship-level contribution. Both my website and the other are authorities in the industry and in search results (PR of 5). Since even ethical websites can suffer a penalty from each iteration of Panda, I'm considering significantly reducing the number of links from this website. Do you think that measurable change would be seen favorably by Google, or would the drop in links be detrimental?
-
What do you think Google analyzes for SERP ranking?
I've been doing some research trying to figure out how the Google algorithm works. The one thing that is constant is that nothing is constant. This makes me believe that Google takes a variable that all sites have and divides the site's score by that number. One example would be dividing the total number of points the website scored by its load time in ms. This would give all of the websites a random-looking appearance, since that variable would throw off all the other constants. I'm going to continue doing research, but I was wondering what you guys think matters in the Google algorithm. -Shane
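Taken literally, the hypothesis above says the visible "score" is a quality total divided by a per-site variable like load time, which would make rankings look noisy even when the underlying quality factors stay constant. This is purely an illustration of that commenter's guess, not how Google actually ranks pages; both numbers are invented.

```python
# A literal reading of the hypothesis: score = quality points / load time.
# Two sites with identical quality points but different load times would
# still land in different places, making the results look inconsistent.

def hypothetical_score(quality_points, load_time_ms):
    return quality_points / load_time_ms

print(hypothetical_score(1000, 200))  # faster site -> 5.0
print(hypothetical_score(1000, 250))  # slower site -> 4.0
```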
-
Panda Update 2.5
All right mozers... What do you think? Apparently Google has just released the next wave of "Panda".... I'd love to hear your experiences with the new Panda update. Have you experienced any decline in organic traffic?
-
Another Panda update?
Our high-quality (IMHO!) editorial technology site lost around 20% of its Google search referrals when Panda went live in the UK on the 12th. But in the last couple of days it's returned to "normal". Is that the effect of our frantic scrabbling around trying things, or have Google tweaked the algorithm again to remove the editorial sites which got caught accidentally?