Do I have a Panda filter on a specific segment?
-
Our site gets a decent level of search traffic and doesn't have any site-wide penalty issues, but one of our sections looks like it might be under some form of filter. Unfortunately for us, those are our buy pages!
Check out http://www.carwow.co.uk/deals/Volkswagen/Golf: it has unique content, and I've built white-hat links into it, including about five from university websites (.ac.uk domains, DA 70+). If you search for something like "volkswagen golf deals", the pages on page 1 have weak, thin content and pretty much no links.
That content section wasn't always unique; in fact, the vast majority of it could well have been classed as duplicate content, as there's no trim data and the pages look like this: http://www.carwow.co.uk/deals/Fiat/Punto
While we never had much volume, traffic to all /deals/ pages appears to have dropped significantly around the time of the May Panda update (4.0).
We're planning to completely re-launch these pages with a new design, unique trim content, and a paragraph (c. 200 words) about each model.
Am I right in assuming that there's a Panda filter on the /deals/ segment, so that no matter what I do to any single deals page it won't rank well, and we have to redo the whole section?
-
It is still possible this isn't a penalty from one of the major algorithms, and you may be able to solve it by creating a strong internal linking strategy. It helps to formulate one using something like MindNode to create an overview of the site, so you can then drill into the pages in question.
It is possible that a noindex would cure this, but it all depends: even if you add a noindex tag to a page, Google can still read the page and apply the penalty. All the tag means is that the page won't appear in the index.
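For reference, the tag being discussed is a single meta directive in the page head; a minimal sketch (this illustrates the point above: Google can still crawl and evaluate such a page, it just won't surface it in results):

```html
<!-- Google can still fetch and evaluate this page;
     it simply won't be shown in the search index -->
<meta name="robots" content="noindex">
```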
However, if you are relaunching everything very soon, you may be better off sitting tight rather than doing anything rash as a short-term fix.
-Andy
-
Positions as well.
Some form of filter is the only explanation I can think of for why that VW Golf deals page doesn't perform. It has better content and decent links (OSE hasn't picked them up, but they're there).
We get c. 40k hits/month on our blog and c. 25k hits/month on our car reviews, entirely from organic, but literally zero on the deals pages, where if anything the competition is weaker and its quality is lower.
I wonder if placing a noindex tag on the deals pages with thin content would resolve the issue, but we'll be re-launching the whole segment in the coming weeks anyway.
-
Hi James,
Am I right in assuming that there's a Panda filter on the /deals/ segment
Unfortunately there is no guaranteed way to say whether this is the case, but generally, if you see a drop in traffic or positions that coincides with an algorithm refresh, that can be telling.
Is it just traffic to those pages that has dropped, or positions in the SERPs?
-Andy
Related Questions
-
Filter Content By State Selection and SEO Considerations
I have an insurance client that is represented in three states. They need to present different information to users based on state identification. They would prefer to have one page with all the information, presenting the content relevant to each state based on the user's selection from a pop-up window. Spiders would be able to index all the content; users would only see the content matching their selection. So, I wanted to ask the Moz community: what SEO implications could this have? The information available on the web about this situation is very thin, so I'd really appreciate any guidance. Thanks,
Intermediate & Advanced SEO | Liamis0
Panda, rankings and other non-sense issues
Hello everyone, I have a problem here. My website has been hit by Panda several times in the past: first back in 2011 (the first Panda ever), then another couple of times since, and most recently in June 2016 (either Panda or Phantom, not clear yet). In other words, my website seems very prone to "quality" updates from big G: http://www.virtualsheetmusic.com/
I am still trying to understand how to get rid of Panda-related issues once and for all after so many years of tweaking and cleaning my website of possible duplicate or thin content (301 redirects, noindexed pages, canonicals, etc.), and I have tried everything, believe me. You name it. We recovered several times, but every once in a while we are still hit by that damn animal. It really looks like we are in the so-called "grey" area of Panda, where we are "randomly" hit once in a while.
Interestingly enough, some of our competitors live joyful lives at the top of the rankings without caring at all about Panda and such, and I can't make sense of it. Take for example this competitor of ours: http://8notes.com They have a much smaller catalog than ours, worse quality of offered music, thousands of duplicate pages, ads everywhere, and yet they are able to rank 1st on page 1 of Google for most of our keywords. And by most, I mean 99.99% of them. Take for example "violin sheet music", "piano sheet music", "classical sheet music", "free sheet music", etc.: they are always first. As I said, their offering is much smaller than ours and their content quality is questionable (not curated by professional musicians, with sloppily done content and design), and yet they have over 480,000 pages indexed on Google, mostly duplicate pages. They don't use canonicals to avoid duplicate content, or 301s, noindex, robots tags, etc., nor do they add text or user reviews to avoid "thin content" penalties. They really don't care about any of that, and yet they rank 1st.
So, to all the experts out there, my question is: why? What's the sense or logic behind that? And please don't tell me they have stronger domain authority, more linking root domains, etc., because given the duplicate and thin-content issues I see on that site, nothing can justify their positions in my opinion. Mostly, I can't find a reason why we are so heavily penalized by Panda and similar "quality" updates when they are released, while websites like 8notes.com rank 1st, making fun of the mighty Panda all year round. Thoughts???!!!
Intermediate & Advanced SEO | fablau0
Do industry specific domains help SEO?
Hi everyone, I've looked for an answer to this but I can't find one. Hopefully someone can help! I have a new client that is a builder. They currently have a .co.uk domain (e.g. businessname.co.uk). Would it help them if the website was businessname.builders instead? Thanks, Alex
Intermediate & Advanced SEO | WebsiteAbility0
Impact of May 2015 quality update and July Panda update on specialty brands or niche retail
We are seeing the following trend in our rankings and traffic after the recent Google algorithm updates (May 2015 quality/phantom, and July 2015 Panda), and I am curious if anyone here has encountered similar and/or has any good ideas on how to react. Background - we operate in a niche segment, but compete for keywords with large home improvement stores and mass retailers. In the past, prior to May 2015, we generally ranked higher than the large home improvement stores and mass retailers for our key specific terms in our niche. We believed that it was because we have a very specialized focus and so our store was highly relevant for someone searching in that niche (for example for the name of the product category as a keyword). In general, we ranked #1-3. Along with a few of our competitors in our niche. And then would be the big box home improvement stores in spots 5-10. The change we saw starting in May is that now all the home improvement stores and also a few large multi-category retailers took over those top 5 spots and bumped all the specialty retailers and the specialty brand manufacturers (like us) down. Our direct competitors in our niche all seem to have been impacted pretty much the same as us. So, in summary it seems like these latest updates may have favored the more general retailers but with stronger domain authority than the more specific but smaller retailers. Hard to know for sure, but this is the trend we believe we see. So, that said, what are some good strategies to respond to this situation? We can't really compete on overall domain authority with these huge retailers. And our previously successful strategy of having a very focused niche, with lots of helpful content, videos, instructional guides, etc. no longer seems to be enough. Has anyone else seen similar results since this past May? Where specialty retail or brand sites lost ground to larger general retailers? 
And if so, has anyone found any good strategies to regain their previous rankings, at least partially?
Intermediate & Advanced SEO | drewk1
Has there been a 'Panda' update in the UK?
My site in the UK suddenly dropped from page 1, and out of the top 50, for all keywords using 'recliner' or a derivative. We are a recliner manufacturer and have built rankings over 15 years, using only white-hat tactics, of course. Did Google make an algorithm update in the UK last week?
Intermediate & Advanced SEO | KnutDSvendsen0
Why specify robots instead of googlebot for a Panda affected site?
Daniweb is the poster child for sites that have recovered from Panda. I know one strategy she mentioned was de-indexing all of her tagged content, for example: http://www.daniweb.com/tags/database Why do you think more Panda-affected sites aren't specifying 'googlebot' rather than 'robots', to capture traffic from Bing & Yahoo?
Intermediate & Advanced SEO | nicole.healthline0
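For context on the distinction this question hinges on, here is a minimal sketch of the two meta directives (not advice for any particular site): "robots" addresses every compliant crawler, while "googlebot" addresses Google alone, so the second tag de-indexes a page from Google while leaving Bing and Yahoo free to keep indexing it.

```html
<!-- "robots" applies to all compliant crawlers -->
<meta name="robots" content="noindex">

<!-- "googlebot" applies to Google only; Bing and Yahoo can still index the page -->
<meta name="googlebot" content="noindex">
```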
Can I Improve Organic Ranking by Restricting Website Access to a Specific IP Address or Geo-Location?
I am targeting the US with my website, so I need high organic rankings in US web search. One of my competitors restricts website access to specific IP addresses or geo-locations. I have checked multiple categories to learn more. What's going on with this restriction, and why do they do it? One SEO forum also restricts website access by location. I can understand that, as it may help them stop thread spamming with unnecessary sign-ups or Q&A posts. But why has Lamps Plus set this up? Is there a specific reason? Could it improve my organic ranking? Restriction might help me maintain user statistics such as bounce rate, average page views per visit, etc.
Intermediate & Advanced SEO | CommercePundit1
How to prevent Google from crawling our product filter?
Hi All, We have a crawler problem on one of our sites, www.sneakerskoopjeonline.nl. On this site, visitors can specify criteria to filter the available products. These filters are passed as HTTP GET arguments, so the number of possible filter URLs is virtually limitless. To prevent duplicate content, or an insane number of pages in the search indices, our software automatically adds noindex, nofollow and noarchive directives to these filter result pages. However, we're unable to get crawlers (Google in particular) to ignore these URLs. We've already changed the on-page filter HTML to JavaScript, hoping this would cause the crawler to ignore it. However, it seems that Googlebot executes the JavaScript and crawls the generated URLs anyway. What can we do to prevent Google from crawling all the filter options? Thanks in advance for the help. Kind regards, Gerwin
Intermediate & Advanced SEO | footsteps0
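One common approach to this last question is to block the filter parameters in robots.txt with wildcard patterns, so Googlebot never requests the filtered URLs at all; a noindex directive can't stop crawling, because the crawler has to fetch the page to see the directive. A minimal sketch (the parameter names "maat" and "kleur" are hypothetical, not taken from the site in question):

```
# robots.txt sketch -- "maat" and "kleur" are hypothetical filter parameters
User-agent: *
Disallow: /*?*maat=
Disallow: /*?*kleur=
```

Note the trade-off: Disallow prevents crawling, but URLs Google already knows about can remain in the index as bare entries, since the crawler can no longer see the noindex tag on them.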