Panda and Large Web Presence
-
I'm experiencing some recent significant drops in rankings across the board for a client of mine, and I suspect it's related to Panda. Their internet presence features completely unique, useful, well-written content by certified industry experts. All of the content is of proper length and serves a core purpose: providing helpful information to visitors. Where I think things potentially go wrong is that they have around 20 microsites in operation, including multiple Web 2.0 blogs. There are also multiple sites in operation that target more specific areas of the same city. Again, all of the content is unique, but it all covers the same industry and broad topic.
Despite everything being 100% unique, I fear it's excessive. Anyone know if Panda may target this type of approach even when the quality and uniqueness are appropriate?
-
Panda updates have hit microsites where content across the sites was either duplicated or "thin", although thin is often in the eye of the beholder. Keep in mind, and I mean this kindly, that "unique" is not always high-quality, and the quest for technical uniqueness can lead to practices where microsites are just spinning out versions of content with slightly different keyword concepts or ordering, etc. In other words, it's technically "unique", but most people wouldn't view it as valuable.
Early Panda updates did hit certain kinds of spun-off content hard, including geo-located content. In other words, if you spun out your plumbing services page to 5,000 cities and it only differed by city names and a few basic facts (even if technically unique), that's definitely something Panda came down hard on.
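To make that concrete, here's a minimal sketch (hypothetical template and cities) of why spun geo-pages end up technically unique but nearly identical:

```python
# Hypothetical example: "spinning" one service page across cities yields pages
# that are technically unique yet nearly identical - the pattern Panda hit hard.
template = ("Looking for plumbing services in {city}? Our certified plumbers "
            "serve the whole {city} area with 24/7 emergency callouts.")
pages = {city: template.format(city=city)
         for city in ["Austin", "Boston", "Denver"]}

def word_overlap(a, b):
    """Jaccard similarity over word sets - a crude duplication signal."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

# Every pair of "unique" pages shares almost all of its vocabulary.
print(f"Austin vs Boston: {word_overlap(pages['Austin'], pages['Boston']):.0%}")
```

Real spun content is usually more varied than a single template, but the same measurement idea applies.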
Truthfully, though, it's really tough to tell without specifics. I'm more on EGOL's side of the fence - my gut feeling is that 20 micro-sites is excessive and I'd strongly suspect quality issues.
Some questions that might help you pin things down:
(1) Has traffic dropped across the entire cluster of sites or just the main site?
(2) Can you pin traffic drops down to any given date, set of keywords, or pages? Drill down as far as you can - that's always the most important first step, IMO.
(3) Are some of your micro-sites essentially dead - no traffic or ROI? You might not have to go all-or-none here. Odds are that some small % of your micro-sites are creating a large % of your value (let's call it an 80/20 rule). It's likely you could kill 10-15 of them with very little harm - at least that's what I typically see. You don't have to drop all 20 cold-turkey.
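The 80/20 point in (3) is easy to sanity-check once you export per-site traffic. A rough sketch with made-up session numbers (site names and figures are hypothetical):

```python
# Hypothetical monthly organic sessions per property - illustrative numbers only.
traffic = {
    "main-site.com": 42000, "micro-a.com": 3100, "micro-b.com": 900,
    "micro-c.com": 240, "micro-d.com": 60, "micro-e.com": 15,
}

total = sum(traffic.values())
running = 0
keepers = []
# Walk the properties from biggest to smallest until ~95% of traffic is covered.
for site, sessions in sorted(traffic.items(), key=lambda kv: kv[1], reverse=True):
    running += sessions
    keepers.append(site)
    if running / total >= 0.95:
        break

print("Worth keeping:", keepers)
print("Consolidation candidates:", sorted(set(traffic) - set(keepers)))
```

In this made-up data, two properties carry over 95% of the traffic; the other four could likely be folded into the main site with little loss.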
-
Where I think things potentially go wrong is that they have around 20 micro sites in operation...
Did they build all of these outhouses because they thought they would be a source of "links"?
The first thing that I would do is to be sure that the content in use on their site today, right now, is unique content that originated with the company. If that is not the case, then it is time to throw things overboard or noindex the items that are not original and unique. If everything is original and unique, then I would get into an "improvement & consolidation" mode: pulling good content out of the outhouses, improving it to the point of being Great Content, and posting it on the main site.
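If you go the noindex route, it helps to audit which pages actually carry the directive. A minimal sketch using only the standard library (the sample HTML is made up, and a real audit would also need to check `X-Robots-Tag` HTTP headers):

```python
import re

# Matches a robots meta tag whose content includes "noindex".
# Assumes name= appears before content=, which covers typical markup.
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

def has_noindex(html):
    """Return True if the page markup contains a robots noindex meta tag."""
    return bool(NOINDEX_RE.search(html))

print(has_noindex('<head><meta name="robots" content="noindex, follow"></head>'))
```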
Keep in mind that problems related to Panda, Penguin, or other algos occur when you are crossways with one or more Google principles. These can be really hard to diagnose and require a full site audit taking many hours, done by someone who really knows their stuff. What you will get here with a generalized question is not much more than kibitzin'.
-
Hi Jay,
Do you have any dates that you can refer to in Analytics that show a drop which might coincide with a penalty or algorithm update?
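If dates do line up, a quick way to cross-reference them is against a list of known update dates. A minimal sketch - the dates below are illustrative examples only and should be verified against a maintained algorithm-change history:

```python
from datetime import date

# Example update dates for illustration - check a maintained change
# history for the real, complete list before drawing conclusions.
UPDATES = {
    date(2014, 5, 20): "Panda 4.0",
    date(2014, 9, 23): "Panda 4.1",
    date(2014, 10, 17): "Penguin 3.0",
}

def nearest_update(drop_date):
    """Return the (name, days apart) of the update closest to a traffic drop."""
    update_date, name = min(UPDATES.items(),
                            key=lambda kv: abs(kv[0] - drop_date))
    return name, abs(update_date - drop_date).days

name, gap = nearest_update(date(2014, 9, 25))
print(f"Closest update: {name} ({gap} day(s) away)")
```

A drop within a few days of an update date is suggestive, not conclusive - roll-outs can take weeks.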
-Andy
-
Thank you, everyone. I agree that it isn't the right approach. Moving forward, though, it would be extremely beneficial to pinpoint the exact cause of this recent decrease in rankings. It's peculiar to see strong, reliable gains right up until a significant across-the-board drop on the heels of this update.
Let's say someone is creating multiple pages that target minor variations of the same keyword, using unique but essentially rewritten content for all pages. If this were all hosted on the same site, it would clearly be the kind of thing Panda targets.
"Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?" - Amit
It would not be duplicate content but could be seen as redundant articles on similar topics.
However, if rewritten content that's similar in scope is spread across multiple domains rather than being hosted on the same site, would it not fall into the same Panda category?
-
I agree with Andy, your description of the setup sounds pretty excessive. Plus, just because content is unique and professionally written doesn't mean that it's high quality. If the sites all say the same thing but in different ways, then none of them are contributing anything meaningful. And your branding is diffused across a zillion different sites to boot.
-
Hi Jay,
Anyone know if Panda may target this type of approach even if the quality and uniqueness is appropriate?
No, this doesn't sound to me like Panda at all.
You mention they have microsites and blogs in operation - presumably this has been done to try and rank for additional phrases? I can't see many other reasons why this would be done.
My opinion here is to pull both the microsites and blogs back in and just create a blog on their own site (if they don't already have one). I wouldn't bother 301ing any of the external sites or posts back, even those they might want to republish on the current site. You need to be advising them to start from scratch and ditch the chaff. If these external sites have all had a part to play in their current problems, then I would distance them from those sites altogether.
...they all feature content that's of the same industry and broad topic
When looking at their own site, you also need to be advising them not to create blog posts for the sake of it. Rather than creating 4-5 articles a week, tell them to create just one or two really high-quality (and longer) articles weekly.
I hope this helps.
-Andy
-
Hi Jay,
It's a difficult question to answer; however, I can point you in a direction. John Mueller of Google Switzerland holds a hangout on Fridays - his Google+ page and the office-hours schedule are below. You can pose the question to him there, and if he can't answer it at the time, he will come back to you. Hope this helps.
https://plus.google.com/+JohnMueller/posts
https://sites.google.com/site/webmasterhelpforum/en/office-hours