Do I have a Panda filter on a specific segment?
-
Our site gets a decent level of search traffic and doesn't have any site-wide penalty issues, but one of our sections looks like it might be under some form of filter. Unfortunately for us, they're our buy pages!
Check out http://www.carwow.co.uk/deals/Volkswagen/Golf: it has unique content, and I've built white-hat links into it, including about five from university websites (.ac.uk domains, DA 70+). If you search for something like "volkswagen golf deals", the pages ranking on page 1 have weak, thin content and almost no links.
That content section wasn't always unique; in fact, the vast majority of it could well be classed as duplicate content, as there's no trim data and the pages look like this: http://www.carwow.co.uk/deals/Fiat/Punto
While we never had much volume, traffic to all /deals/ pages appears to have dropped significantly around the time of the May Panda update (4.0).
We're planning to completely relaunch these pages with a new design, unique trim content, and a paragraph (c. 200 words) about the model.
Am I right in assuming that there's a Panda filter on the /deals/ segment, so that regardless of what I do to any one deals page it won't rank well, and we have to redo the whole section?
-
It is still possible this isn't a penalty from one of the major algorithms, and you may be able to solve it by creating a strong internal linking strategy. It helps to use something like MindNode to map out an overview of the site, and then you can drill into the pages in question.
It is possible that a noindex would cure this, but it all depends: even with a noindex tag on a page, Google can still crawl the page and apply the penalty. The tag only means the page won't appear in the index.
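For reference, noindex can be applied either as a robots meta tag in the page head or as an X-Robots-Tag HTTP header. The sketch below is a generic illustration (not carwow's actual implementation), with a rough check for whether a page already carries the directive:

```python
import re

# Meta-tag form: placed in the <head> of each page to be kept out of the index.
NOINDEX_META = '<meta name="robots" content="noindex, follow">'

# Header form: works for non-HTML responses (PDFs, images) as well.
NOINDEX_HEADER = {"X-Robots-Tag": "noindex"}

def is_noindexed(html):
    """Rough check for a robots noindex directive in a page's HTML."""
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, re.IGNORECASE) is not None
```

Either form still lets Googlebot fetch and evaluate the page, which is exactly the caveat above: noindex removes the page from the index, not from the crawl.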
However, if you are relaunching everything very soon, you might do better to sit tight and not do anything too rash as a short-term fix.
-Andy
-
Positions as well.
Some form of filter is the only explanation I can think of for why that VW Golf deals page doesn't perform. It has better content and decent links (OSE hasn't picked them up, but they're there).
We get c. 40k hits/month on our blog and c. 25k hits/month on our car reviews, entirely through organic search, but literally zero on the deals pages, where if anything the competition is weaker and the quality lower.
I wonder if placing a noindex tag on the deals pages with thin content would resolve the issue, but we'll be relaunching the whole segment in the coming weeks anyway.
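If it helps to triage which deals pages would get the noindex, a crude word-count filter is one way to flag thin templates. This is a hypothetical sketch: the 200-word threshold is arbitrary, and stripping tags with a regex is only a rough proxy for visible content.

```python
import re

def visible_word_count(html):
    """Very rough word count: strip tags, count the remaining words."""
    text = re.sub(r'<[^>]+>', ' ', html)
    return len(text.split())

def should_noindex(html, min_words=200):
    """Flag pages below a thin-content threshold (arbitrary cut-off)."""
    return visible_word_count(html) < min_words
```

Pages that pass the threshold after the relaunch (new trim content plus the c. 200-word model paragraph) would then have the tag removed.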
-
Hi James,
Am I right in assuming that there's a Panda filter on the /deals/ segment
Unfortunately there is no guaranteed way to say whether this is the case, but generally, if you see a drop in traffic or positions that coincides with an algorithm refresh, that can be telling.
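To make that coincidence check concrete, you can test whether the day a drop started falls within a few days of a known refresh. The dates below are approximate public rollout dates, not an official Google list, and the tolerance is a guess:

```python
from datetime import date

# Approximate public rollout dates for two Panda refreshes (assumption).
PANDA_DATES = {
    "Panda 4.0": date(2014, 5, 20),
    "Panda 4.1": date(2014, 9, 23),
}

def nearest_update(drop_start, tolerance_days=7):
    """Return the name of the update whose rollout falls within
    `tolerance_days` of the drop, or None if nothing is close enough."""
    name, when = min(PANDA_DATES.items(),
                     key=lambda kv: abs((kv[1] - drop_start).days))
    return name if abs((when - drop_start).days) <= tolerance_days else None
```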
Is it just traffic to those pages that has dropped, or positions in the SERPs?
-Andy