I think Panda was a conspiracy.
-
It's just a theory, but I think that Panda was not really an algorithm update but rather a conspiracy.
Google went out of their way to announce that a new algorithm was being rolled out. The word on the street was that content farms would be affected. Low quality sites would be affected. Scrapers would be affected. So, everyone with decent sites sat back and said, "Ah...this will be good...my rankings will increase."
And then, the word started coming in that some really good sites took a massive hit. We've got a lot of theories on what could be causing the hit, but there doesn't seem to be an obvious fix.
Many of the key factors that have been suggested as causes of a site looking bad in Panda's eyes are present on one of my sites, yet that site actually increased in rankings after Panda.
So, this is my theory: I think that Google made some random changes that made no sense. They made changes that would cause some scraper sites to go down but they also knew that many decent sites would decline as well.
Why would they do this? The result is fantastic in Google's eyes. They have the whole world of web design doing all they can to create the BEST quality sites possible. People are removing duplicate content, reducing ad clutter and generally polishing their sites. And this is the goal of Larry Page and Sergey Brin...to make it so that Google gives the user the BEST possible sites to match their query.
I think that a month or so from now there will be a sudden shift in the algo again and many of those decent sites will have their good rankings back. The site owners will think it's because they put hard work into creating good-quality content, so they will be happy. And Google will be happy because the web is a better place.
What do you think?
-
Hahahaha. I agree with you. First Google manipulates results to see if Bing is copying them. Then, soon after, this update comes along, all while Google is carefully building its own content sites optimized for the robots, and we're not even allowed to think about duplicate content. Hehehehe
-
Actually I really liked the "content registry" idea.
A library of content where you could register what you have created and, optionally, add a link to the page you want considered the main source.
At least it would be 10x more useful than the Google Knol idea..
-
I would pay a fee to protect my best content in the Google SERPs.
-
Actually you have a point there... if JCPenney was indeed the catalyst (which I could easily imagine it being), then the time between that and the update would surely mean it would have had to be rushed. I never considered that before.
-
ha ha... I think they did rush this out.... they were quickly trying to pull up their pants after getting embarrassed by the JCPenney problem... they needed to bust a few heads quickly...
-
Ha ha, maybe
I think it's something infinitely less planned out and they simply rushed this change out the door without understanding fully what it would do to the SERPs.
Although I do think you're right that in a few months (in what will be claimed to be a second Panda sweep) that things will go back and only the very worst offenders will stay penalised.
-
Yes, I like the content registry idea! It would probably need to be a paid service though, and to cover dupes that are okay (news, quotations, etc., where dupes can't be avoided), maybe they could just allow them as long as they reference back to the source in the registry.
-
Interesting ideas. Thanks for sharing them.
I think that Google is talking a lot about this as a "quality website update"... and that is getting them attention in the media but it is also kicking a lot of webmasters in the butt to clean up their websites.
I think that Google should make a "content registry" where I can submit my content and say "this is mine," and then copies or spins of that content will not get traction in the SERPs.
And I think that they should take a closer look at websites in the AdSense program, because the ability to monetize crap and theft is driving a lot of bad odor in the SERPs.
-
Haha I like it!!
Well, if it's not what happened, they'll wish they thought of it anyway lol
My view on why other sites got hit is just that they had at least some links coming from sites that got hit... i.e. a site has 100 backlinks, 10 are from articles on article sites, the article sites get hit... it loses 10 backlinks (or at least some of the value of those backlinks)... hence, the good site takes a hit too
Related Questions
-
What do you think of SearchMetrics' claim that there are no longer universal ranking factors?
I agree that Google's machine learning/AI means that Google is using a more dynamic set of factors to match searcher intent to content, but this claim feels like an overstatement: "Let's be quite clear: Except for important technical standards, there are no longer any specific factors or benchmark values that are universally valid for all online marketers and SEOs. Instead, there are different ranking factors for every single industry, or even every single search query. And these now change continuously." Keyword-relevant content, backlinks, etc. still seem to be ranking factors across pretty much all queries/industries. For example, I can't think of a single industry where it would be a good idea to try to rank for [keyword] without including [keyword] in the visible text of the page. Also, websites that rank without any backlinks are incredibly rare (unheard of for competitive terms). Doubtless some factors change (e.g. Google may favor webpages with images for a query like "best hairstyle for men" but not for another query), but other factors still seem to apply to all queries (or at least 95%+). Thoughts?
Algorithm Updates | | AdamThompson
-
Site has disappeared since Panda 4 despite quality content, help!
Our site www.physicalwellbeing.co.uk has lost over 20 first-page rankings since the end of May. I assume this is because of Panda 4.0. All content on the site is high quality and 100% unique, so we did not expect to be penalised. Although I read somewhere that if Google can't read particular JS anymore, they don't rank you as high. The site has not been blacklisted, as all pages are showing in Google's index and there are no messages in Webmaster Tools. We have not taken part in any link schemes and have disavowed all low-quality links that were pointing there just in case (after the penalty). Can anybody see anything on www.physicalwellbeing.co.uk that may have caused the Panda update to affect it so negatively? Would really appreciate any help.
Algorithm Updates | | search_shop
-
Do you think Google is destroying search?
I've seen garbage in Google results for some time now, but it seems to be getting worse. I was just searching for a line of text that was in one of our stories from 2009. I just wanted to check that story and I didn't have a direct link. So I did the search and I found one copy of the story, but it wasn't on our site. I knew that it was on the other site as well as ours, because the writer writes for both publications. What I expected to see was the two results, one above the other, depending on which one had more links or better on-page for the query.
What I got didn't really surprise me, but I was annoyed. In #1 position was the other site. That was OK by me, but ours wasn't there at all. I'm almost used to that now (not happy about it and trying to change it, but not doing well at all, even after 18 months of trying). What really made me angry was the garbage results that followed. One site, a WordPress blog, has tag pages and category pages being indexed. I didn't count them all, but my guess is about 200 results from this blog, one after the other, most of them tag pages, with the same content on every one of them. Then the tag pages stopped and it started with dated archive pages, dozens of them. There were other sites, some with just one entry, some with dozens of tag pages. After that, porn sites, hundreds of them. I got right to the very end - 100 pages of 10 results per page.
That blog seems to have done everything wrong, yet it has interesting stats. It is a PR6, yet Alexa ranks it 25,680,321. It has the same text in every headline. Most of the headlines are very short. It has all of the category and tag and archive pages indexed. There is a link to the designer's website on every page. There is a blogroll on every page, with links out to 50 sites. None of the pages appear to have a description. There are dozens of empty H2 tags and the H1 tag is 80% through the document. Yet Google lists all of this stuff in the results.
I don't remember the last time I saw 100 pages of results; it hasn't happened in a very long time. Is this something new that Google is doing? What about the multiple tag and category pages in results - is this just a special thing Google is doing to upset me, or are you seeing it too? I did eventually find my page, but not in that list. I found it by using site:mysite.com in the search box.
Algorithm Updates | | loopyal0 -
Did we get hit by Panda? What do we do?
Hello, here's our site: nlpca(dot)com. We had a big drop in rankings, going from about 19th to about 43rd for our main keyword, and significant drops in other keywords. This happened roughly 6 weeks ago. We thought it was being caused by either: placing keywords in titles before we had them in the content, or trying to rank for Utah keywords - we're the NLP Institute of California and we are in both places now, but the site talks mainly about California. We changed both these things, and we're still at the low rankings. Will we move back up? What do we do? Will a backlink campaign be effective at this point?
Algorithm Updates | | BobGW
-
Can AJAX implementation affect the rankings in Google Panda?
Hi there, I have the following situation with one of our job sites. We migrated the site to a new application, which is better from a design point of view and also for usability. For this we use a lot of AJAX, especially in searches. So every time a user filters down their search, new results are shown on the page, at the same URL and with no page load. But this implementation affected bounce rate, which increased from 38% to nearly 60%; PI/visit, which is now half, at 3; and Avg Time on Site, which is half what it used to be, coming to 2.5 min from nearly 6 min. From Rand's post, it is clear that content is very important in Google Panda, and we should consider all of these parameters, as they signal the quality of the content. So, my question is: can this site be hit by Panda updates (maybe later on) because bounce rate, PI/visits and Avg Time on Site decreased in such a way? At the moment we don't measure the AJAX impressions, but as I understand we can do that through virtual pages in GA; does anyone have experience with how to handle this? Won't this be an artificial increase? Thanks, Irina
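On the virtual-pages question: a minimal sketch, assuming classic (ga.js-era) Google Analytics with the asynchronous `_gaq` queue loaded on the page. The helper name `buildVirtualPath`, the `/jobs/search` path and the `activeFilters` object are illustrative assumptions, not part of the GA API.

```javascript
// Report each AJAX-driven search refinement to Google Analytics as a
// virtual pageview, so PI/visit and time on site reflect in-page activity.
function buildVirtualPath(basePath, filters) {
  // Turn the active search filters into a stable (sorted) query string,
  // e.g. { location: 'london' } -> '/jobs/search?location=london'
  var query = Object.keys(filters).sort().map(function (key) {
    return encodeURIComponent(key) + '=' + encodeURIComponent(filters[key]);
  }).join('&');
  return query ? basePath + '?' + query : basePath;
}

// In the browser, call this after each AJAX result refresh so the
// refinement is counted as a pageview (requires the ga.js snippet):
// _gaq.push(['_trackPageview', buildVirtualPath('/jobs/search', activeFilters)]);
```

As the poster suspects, this inflates PI/visit by design; the usual practice was to keep virtual pageviews on a distinct path prefix so they can be filtered into a separate GA profile when you want the unpadded numbers.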
Algorithm Updates | | InformMedia
-
Has anyone recovered from Panda?
My two websites were unaffected by the original and 2.0 Panda updates, but starting in June my traffic has been down around 30%. In analyzing it, it appears that my long-tail searches have been greatly impacted. So it looks like I am a victim of the mighty Panda. My main site, www.uncontesteddivorce-nyc.com, is in my opinion a decent-looking site, with unique content and no ads, etc., but for whatever reason it has been negatively affected. There might be some duplication of content between certain pages, and also my links are all or practically all directory links, though a lot are pretty heavy-duty directories. I see a lot of stuff written giving advice on how to recover from Panda. Has anyone actually done so? How did you do it? thx Paul
Algorithm Updates | | diogenes
-
Another Panda update?
Our high-quality (IMHO!) editorial technology site lost around 20% of its Google search referrals when Panda went live in the UK on the 12th. But in the last couple of days it has returned to "normal". Is that the effect of our frantic scrabbling around trying things, or have Google tweaked the algorithm again to remove the editorial sites which got caught accidentally?
Algorithm Updates | | StuartAnderton
-
Classifieds and Google Panda
It seems Google's Panda update is targeting low-quality sites with little unique content (I know there's more to it than that). It makes sense that they may want to do this, but what about classified sites? They may use some scraped content as well as unique ads, and the ads may lack content as they rely on the users writing them. However, they are helpful to the people that use classifieds. Because of these factors, these sites are suffering with the release of the latest Panda update. Any advice for classified sites and how they can combat the ranking drops???
Algorithm Updates | | Sayers