Yet another Panda question
-
Hi Guys,
I'm just looking for confirmation on something.
In the wake of Panda 2.2, one of my pages has plummeted in the rankings whilst other, similar pages have seen healthy improvements.
Am I correct in thinking that Panda affects individual pages and doesn't tar an entire site with the same brush?
Really I'm trying to see whether Panda is the reason for the drop on one page, or whether it could be something else. The page in question has dropped 130 positions, not just a general fluctuation.
Thanks in advance for your responses!
-
Did you add any links to that particular page? I have seen a handful of links with similar anchor text published at the same time kill a ranking.
-
Elias, as a Pro member you get one private Q&A question a month. You can submit your question with URL details to private Q&A; only SEOmoz staff members and associates can view it, and it won't appear in any searches or indexes.
-
Hmm, without more info it's hard; like shooting from the hip with one hand, blindfolded. But maybe you could check the bounce rate in Analytics, for example, to see whether your visitors also find the content good (i.e. whether they feel like browsing on after they read the landing page).
Just because you find it high quality doesn't mean that your users or Google agree.
-
Yeah, all the content is good and original, and there is a canonical in place to avoid duplication. Very confused; maybe it will just sort itself out!
Thanks for your help.
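(For anyone following along: the canonical mentioned here is a single link element in the head of each duplicate variant, pointing at the preferred URL. A minimal sketch; the URL is just a placeholder.)

```html
<head>
  <!-- Tells search engines which URL is the preferred version of this page -->
  <link rel="canonical" href="http://www.example.com/preferred-page/" />
</head>
```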
-
Have you checked for duplicate content on-site and off-site, or whether the page is getting indexed under 2 different URLs?
The page could also be poor-quality content, meaning almost no text, for example. But it is kind of hard to point you in the right direction without more info.
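One quick way to spot the two-URLs problem is to normalize the URLs Google has indexed and look for collisions. A rough sketch in Python; the normalization rules below (www, trailing slash, host case, default ports) are just the common duplicate variants, so adjust them for your own site:

```python
from urllib.parse import urlparse, urlunparse

def normalize(url):
    """Reduce common duplicate-URL variants to one canonical form."""
    p = urlparse(url.strip())
    host = p.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    host = host.removesuffix(":80").removesuffix(":443")
    path = p.path.rstrip("/") or "/"
    return urlunparse((p.scheme.lower(), host, path, "", p.query, ""))

# Two URLs that normalize to the same string are likely one page
# indexed twice.
print(normalize("http://Example.com/widgets/") ==
      normalize("http://www.example.com/widgets"))  # → True
```

Run that over the URLs from a `site:` query export and any duplicates fall out as matching normalized strings.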
-
I'd love to, but it is a bit sensitive. There is no difference between the coding of the two pages or the structure. The only difference is the on-page content and perhaps some internal links.
-
There could be quite a few reasons; there is no short answer for that.
Could you show us 2 examples: 1 page that has been penalized and one that hasn't?
-
Thanks, I'm kind of leaning towards the problem not being Panda-related.
Has anybody ever experienced such major drops for any other reason?
-
I believe the common consensus is that Panda affects the entire domain: if it penalizes some pages, then some of that penalty will rub off on the rest of the domain.
My advice: take the penalised pages down while you rework them. That should solve your problem.
Related Questions
-
Site has disappeared since Panda 4 despite quality content, help!
Our site www.physicalwellbeing.co.uk has lost over 20 first-page rankings since the end of May. I assume this is because of Panda 4.0. All content on the site is high quality and 100% unique, so we did not expect to be penalised, although I read somewhere that if Google can't read particular JS any more they don't rank you as high.
The site has not been blacklisted, as all pages are showing in Google's index and there are no messages in Webmaster Tools. We have not taken part in any link schemes, and have disavowed all low-quality links that were pointing there, just in case (after the penalty).
Can anybody see anything on www.physicalwellbeing.co.uk that may have caused the Panda update to affect it so negatively? Would really appreciate any help.
-
Sitemap Question
Hello, I have a website and my sitemap (generated by the Yoast plugin) is set up into three different sections. One thing I noticed was that my homepage isn't in my sitemap. Is this an issue? The homepage is indexed, but does it need to be added to the sitemap in order for it to be crawled? How would I go about adding the homepage to the sitemap?
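(If Yoast won't include it automatically, the homepage is just another `<url>` entry, conventionally placed first in the sitemap file. A minimal hand-edited sketch; the URL is a placeholder.)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <priority>1.0</priority>
  </url>
  <!-- ...the rest of the site's URLs follow... -->
</urlset>
```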
-
International foreign language SEO questions
I'm looking to add some foreign-language pages to a website and have a lot of international SEO questions. I think the overall question is: can you do SEO yourself, as a native English speaker, for a language you don't speak (like Chinese)?
1. How do you go about doing keyword research for a foreign language? What tools are available?
2. How do you know which search engines you should optimize for in a different country? And where can you find the technical SEO requirements for each? I'm wondering things like title tag length for Baidu. Or is the title length different for Yahoo Japan vs. the US? Do you write titles and meta tags in Chinese/Japanese for the respective countries? Etc.
-
External Linking Best Practices Question
Is it frowned upon to use basic anchor text such as "click here" within a blog article when linking externally? I understand, ideally, you want to provide a descriptive anchor text, especially linking internally, but can it negatively affect your own website if you don't use a descriptive anchor text when linking externally?
-
ECommerce site being "filtered" by last Panda update, ideas and discussion
Hello fellow internet-goers! Just as a disclaimer: I have been following a number of discussions, articles, posts, etc. trying to find a solution to this problem, but have yet to get anything conclusive, so I am reaching out to the community for help.
Before I get into the questions, some background: I help a team manage and improve a number of med-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ per day, depending on the site. Back in March, one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site.
We have around ten niche sites in total; about seven of them share an identical code base (about an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us. Many meetings later, we attributed the "filter" to duplicate content that stems from our product database and written content (shared across all of our sites). We decided we would use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (like it was never "filtered"); however, the other two sites remain "under the thumb" of Google.
Now for some questions:
1. Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
2. Is it a coincidence that it was an exact 30-day "filter"?
3. Why has only one site recovered?
-
Can AJAX implementation affect the rankings in Google Panda?
Hi there, I have the following situation with one of our job sites. We migrated the site to a new application, which is better from a design point of view and also for usability. For this we use a lot of AJAX, especially in searches, so every time a user filters down their search, new results are shown on the page, at the same URL and with no page load.
But this implementation has affected bounce rate, which increased from 38% to nearly 60%; PI/visits, which have halved to 3; and Avg Time on Site, which is half what it used to be, coming down to 2.5 min from nearly 6 min.
From Rand's post, it is clear that content is very important in Google Panda, and that we should consider all of these parameters, as they signal the quality of the content. So my question is: can this site be hit by Panda updates (maybe later on) because bounce rate, PI/visits and Avg Time on Site worsened in this way?
At the moment we don't measure the AJAX impressions, but as I understand it we can do that through virtual pages in GA. Does anyone have experience with handling this? Won't this be an artificial increase? Thanks, Irina
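(For reference, with classic Google Analytics (ga.js) a virtual pageview is just a `_trackPageview` call with a made-up path, fired after each AJAX refinement renders. A sketch; the `/ajax/search/` path scheme and the `filterQuery` variable are illustrative assumptions, not part of the GA API.)

```html
<script>
  // Fire after each AJAX search refinement renders its results.
  // The virtual path is an assumption; pick any naming scheme you like,
  // but keep it consistent so the virtual pages group cleanly in reports.
  _gaq.push(['_trackPageview',
             '/ajax/search/' + encodeURIComponent(filterQuery)]);
</script>
```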
-
I think Panda was a conspiracy.
It's just a theory, but I think that Panda was not really an algorithm update but rather a conspiracy.
Google went out of their way to announce that a new algorithm was being rolled out. The word on the street was that content farms would be affected. Low-quality sites would be affected. Scrapers would be affected. So, everyone with decent sites sat back and said, "Ah... this will be good... my rankings will increase." And then the word started coming in that some really good sites took a massive hit. We've got a lot of theories on what could be causing the hit, but there doesn't seem to be an obvious fix. Many of the key factors that have been suggested as causes of a site looking bad in Panda's eyes are present on one of my sites, but this site actually increased in rankings after Panda.
So, this is my theory: I think that Google made some random changes that made no sense. They made changes that would cause some scraper sites to go down, but they also knew that many decent sites would decline as well. Why would they do this? The result is fantastic in Google's eyes. They have the whole world of web design doing all they can to create the BEST quality site possible. People are removing duplicate content, reducing ad clutter and generally creating the best site possible. And this is the goal of Larry Page and Sergey Brin: to make it so that Google gives the user the BEST possible sites to match their query.
I think that a month or so from now there will be a sudden shift in the algo again, and many of those decent sites will have their good rankings back. The site owners will think it's because they put hard work into creating good quality, so they will be happy. And Google will be happy because the web is a better place. What do you think?
-
When Panda's attack...
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the system seems fixed against this site's purpose. I need some advice on what I'm planning and what could be done.
First, the issues:
Content length: The site is a legal reference, including a dictionary and citation look-up. Hundreds (perhaps upwards of 1,000) of pages are, by virtue of the content, thin. The acronym C.B.N.S. stands for "Common Bench Reports, New Series", a part of the English reports. There really isn't too much more to say, nor is there much value to the target audience in saying it.
Visit length as a metric: There is chatter claiming Google watches how long a person uses a page to gauge its value. Fair enough, but a large number of people that visit this site are looking for one small piece of data. They want the definition of a term or citation, then they return to whatever caused the query in the first place.
My strategy so far:
1. Noindex some pages: Identify terms and citations that are really small (less than 500 characters) and put a noindex tag on them. I will also remove the directory links to those pages and clean the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index, despite their value.
2. Create more click incentives: We already started with related terms, and now we are looking at diagrams and images. Anything to punch up the content for that ever-important second click.
3. Expand content (of course): The author will spend the next six months doing his best to extend the content of these short pages. There are images and text to be added in many cases, perhaps 200 pages. We still won't be able to cover them all without a heavy cut-and-paste feel.
4. Site redesign: Looking to lighten up the code and boilerplate content shortly; we were working on this anyway. Resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded with AJAX on scroll. Ad units will be kept at 3 per page.
What do you think? Are the super-light pages of the citations and dictionary why site traffic is down 35% this week?
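(For the noindex step above: it is just a robots meta tag in the head of each thin page. The pages must stay crawlable, i.e. not blocked in robots.txt, or the tag will never be seen. A minimal sketch; "noindex, follow" keeps the page out of the index while still letting its links pass.)

```html
<head>
  <!-- Keep this page out of the index, but still follow its links -->
  <meta name="robots" content="noindex, follow" />
</head>
```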