Do I have a Panda filter on a specific segment?
-
Our site gets a decent level of search traffic and doesn't have any site-wide penalty issues, but one of our sections looks like it might be under some form of filter. Unfortunately for us, it's our buy pages!
Check out http://www.carwow.co.uk/deals/Volkswagen/Golf: it has unique content and I've built white-hat links into it, including about five from university websites (.ac.uk domains, DA 70+). If you search for something like "volkswagen golf deals", the pages ranking on page 1 have weak, thin content and pretty much no links.
That content section wasn't always unique; in fact, the vast majority of it may well have been classed as duplicate content, as there was no trim data and the pages looked like this: http://www.carwow.co.uk/deals/Fiat/Punto
While we never had much volume, traffic to all the /deals/ pages appears to have dropped significantly around the time of the May 2014 Panda 4.0 update.
We're planning to completely relaunch these pages with a new design, unique trim content, and a paragraph (c. 200 words) about each model.
Am I right in assuming that there's a Panda filter on the /deals/ segment, so that no matter what I do to an individual deals page it won't rank well, and that we have to redo the whole section?
-
It's still possible this isn't a penalty from one of the major algorithms, and you may be able to solve it with a strong internal linking strategy. It helps to map one out: use something like MindNode to create an overview of the site, then drill into the pages in question.
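As a rough sketch of that overview step, you can tally inbound internal links per page from crawl data and see whether the /deals/ section is starved of internal links. The crawl data below is entirely hypothetical, just to show the shape of the check:

```python
# Illustrative sketch (hypothetical crawl data, not carwow's actual link graph):
# count how many internal links each page receives, so weakly linked
# sections such as /deals/ stand out.
from collections import Counter

# page -> internal links found on that page (pretend crawler output)
crawl = {
    "/": ["/blog", "/car-reviews", "/deals/Volkswagen/Golf"],
    "/blog": ["/", "/car-reviews"],
    "/car-reviews": ["/", "/blog", "/deals/Volkswagen/Golf"],
    "/deals/Volkswagen/Golf": ["/"],
    "/deals/Fiat/Punto": ["/"],
}

inbound = Counter(link for links in crawl.values() for link in links)
orphans = [page for page in crawl if inbound[page] == 0]

print(inbound["/deals/Volkswagen/Golf"])  # 2
print(orphans)  # ['/deals/Fiat/Punto']
```

Pages with zero (or very few) inbound internal links are the ones to prioritise when you redraw the linking strategy.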
A noindex tag might cure this, but it depends: even after you add noindex to a page, Google can still crawl the page and apply the penalty. All the tag means is that the page won't appear in the index.
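If you do go the noindex route as a stop-gap, one simple approach is to emit the tag only on pages below a word-count threshold. This is a sketch, not a recommendation from Google: the 200-word cutoff just mirrors the paragraph length planned for the relaunch, and the function name is made up for illustration:

```python
# Hedged sketch: emit a robots noindex meta tag for thin deals pages.
# The 200-word threshold is an assumption, not a documented Google cutoff.
NOINDEX_TAG = '<meta name="robots" content="noindex">'

def robots_meta(word_count: int, threshold: int = 200) -> str:
    """Return a noindex meta tag for thin pages, an empty string otherwise."""
    return NOINDEX_TAG if word_count < threshold else ""

print(robots_meta(20))   # thin page: tag emitted
print(robots_meta(450))  # substantial page: no tag
```

Remember the caveat above: noindex keeps a page out of the index, but Google can still crawl and evaluate it.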
However, if you're relaunching everything very soon, you might be better off sitting tight rather than doing anything rash as a short-term fix.
-Andy
-
Positions as well.
Some form of filter is the only explanation I can think of for why that VW Golf deals page doesn't perform. It has better content and decent links (OSE hasn't picked them up, but they're there).
We get c. 40k hits/month on our blog and c. 25k hits/month on our car reviews, entirely from organic search, but literally zero on the deals pages, where if anything there's less competition and it's lower quality.
I wonder whether placing a noindex tag on the deals pages with thin content would resolve the issue, but we'll be relaunching the whole segment in the coming weeks anyway.
-
Hi James,
"Am I right in assuming that there's a Panda filter on the /deals/ segment"
Unfortunately there's no guaranteed way to confirm this is the case, but generally, if you see a drop in traffic or positions that coincides with an algorithm refresh, that can be telling.
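One way to sanity-check the "coincides with a refresh" part is to compare average daily organic sessions in windows either side of the known update date. The sessions figures below are invented for illustration; the only real anchor is the Panda 4.0 announcement date of around 20 May 2014:

```python
# Illustrative check (hypothetical session counts, not real analytics data):
# compare mean daily organic sessions in the week before and the week
# after a known algorithm refresh, e.g. Panda 4.0 (announced ~20 May 2014).
from datetime import date

sessions = {date(2014, 5, d): 120 for d in range(13, 20)}   # week before
sessions.update({date(2014, 5, d): 48 for d in range(21, 28)})  # week after

refresh = date(2014, 5, 20)
before = [v for d, v in sessions.items() if d < refresh]
after = [v for d, v in sessions.items() if d > refresh]

drop = 1 - (sum(after) / len(after)) / (sum(before) / len(before))
print(f"{drop:.0%} drop after the refresh")  # 60% drop after the refresh
```

A sharp step change aligned with the refresh date is suggestive; a gradual decline that merely overlaps it is much weaker evidence.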
Is it just traffic to those pages that has dropped, or positions in the SERPs?
-Andy