Was Panda applied at the sub-domain or root-domain level?
-
Does anyone have any case studies or examples of sites where a specific sub-domain was hit by Panda while other sub-domains were fine?
What's the general consensus on whether this was applied at the sub-domain or root-domain level?
My thinking is that Google already knows broadly whether a "site" is a root-domain (e.g. SEOmoz) or a sub-domain (e.g. tumblr) and that they use this logic when rolling out Panda.
I'd love to hear your thoughts and opinions, though.
-
Hmm, I definitely think there is a page-level element, but Google have gone on record to say there's also a domain-level impact.
I imagine that the definition Google holds of a "site" is used in this case rather than slapping the whole domain.
-
This may sound weird, but I think the answer to your question is no. Panda and its subsequent rollouts hit a few of my websites, but it seemed to me it was a page-level thing, not a domain-level thing. For example, Panda seemed to target some affiliate websites. I have lots of affiliate websites. Some of their pages' rankings were untouched, while other pages suffered. Granted, any site that got hit by Panda got hit pretty hard. But oddly, several pages on those sites were completely unaffected.
To your point there: blogspot.com, wordpress.com and tumblr.com all have several sub-domains that were affected without the root domains being hit. So what you are asking is: if blog.mysite.com got hit, did mysite.com get hit as well?
I think that just as pages rank and not domains, so were pages affected by Panda and not whole domains or sub-domains.
Related Questions
-
Sub-domain with spammy content and links: Any impact on main website rankings?
Hi all, One of our sub-domains is a forum where our users discuss our product and many related things. But some of the users in the forum are adding a lot of spammy content every day. I just wonder whether this is ruining the ranking efforts of our main website? Does a sub-domain with spammy content really kill the main website's rankings? Thanks
Algorithm Updates | | vtmoz0 -
Paying a premium or going with a hyphenated domain
Q1. I have two domains I am interested in, and they are available; however, they are "premium" at $1,000! That's according to GoDaddy; another domain site just said they were available through auction. Any recommendations? If I have to pay, is there an auction site or broker you would recommend, or another approach to try to get them at a lower price? Q2. The hyphenated domains are available and affordable. However, some of the information I gathered on Moz mentions that Google flags them as spam. I believe that article was dated 2014. Is that still a risk, or has Google, through AI or something else, found another way to determine spam now?
Algorithm Updates | | brandon_solutions0 -
Duplicate content: Almost the same site on different domains
Hi, I own a couple of casting websites, and I'm at the moment launching "local" copies of them all over the world. When I launch my website in a new country, the content is basically always the same, except the language changes from country to country. The domains will vary, so the site name would be site.es for Spain, site.sg for Singapore, site.dk for Denmark and so on. The websites will also feature different jobs (castings) and different profiles on the search pages, BUT the more static pages are the same content (About us, The concept, FAQ, Create user and so on). So my questions are: Is this something that is bad for Google SEO? The sites are at the moment NOT linking to each other with language flags or anything - should I do this? Basically to tell Google that the business behind all these sites is somewhat big. Is there a way to inform Google that these sites should NOT be treated as duplicate content? (A canonical tag won't do, since I want the "same" content to be listed on the local Google sites.) Hope there are some experts here who can help. /Kasper
Algorithm Updates | | KasperGJ0 -
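One common way to signal that local sites like these are language/country variants rather than duplicates (not something confirmed in this thread, just a sketch) is hreflang annotations in each page's head, with every variant listing all the others and itself. The locale codes and the choice of x-default below are assumptions for illustration:

```html
<!-- Sketch of hreflang annotations for the homepage of each local site.
     Assumed locales: es-ES (Spain), en-SG (Singapore), da-DK (Denmark).
     The same set of tags would appear on all three homepages. -->
<link rel="alternate" hreflang="es-es" href="https://site.es/">
<link rel="alternate" hreflang="en-sg" href="https://site.sg/">
<link rel="alternate" hreflang="da-dk" href="https://site.dk/">
<!-- x-default is a fallback for unmatched locales; pointing it at
     site.es here is an arbitrary choice for the example. -->
<link rel="alternate" hreflang="x-default" href="https://site.es/">
```

Unlike a canonical tag, this does not hide any of the local sites from their local Google results, which matches what the asker wants.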
Domains dominating SERPs w/multiple listings
I know Cutts addressed this as a potential future update to the Google algo but it's driving me bonkers.. My primary targeted keyword has one of our competitors listed 4 times in a row on the top of page 2. Some of the pages have duplicate page titles and the content is relatively thin. The site has a PR of 2 and a DA of 35. Why on earth are they able to suck up a whole half of a results page?!?!?! I don't know that there's anything anyone can tell me that will help, but if there's something I missed about this update please let me know. 'snot fair. 😞
Algorithm Updates | | jesse-landry0 -
Redirected old domain to new, how long before seeing the external links under the new domain?
Before contracting SEO services, my client decided to change his established root domain to one more customer-friendly. Since he had no expertise on board, no redirects were set up until 6 months later. I ran stats right before the old domain was redirected and have a report showing that he had roughly 750 external links from 300 root domains. We redirected the old domain to the new domain in mid Jan 2012. Those external links are still not showing in Open Site Explorer for the new domain. I've tested it a dozen times, and the old domain definitely points to the new domain. How long should it take before the new domain picks up those external links? Should I do anything else to help the process along?
Algorithm Updates | | smsinc0 -
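The old-to-new domain move described above is usually implemented as a server-level 301 redirect. A minimal sketch for Apache with mod_rewrite, assuming an .htaccess file on the old domain's host; the domain names are placeholders:

```apache
# .htaccess on the old domain (Apache, mod_rewrite enabled).
# Sends every path on olddomain.com to the same path on
# newdomain.com with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ https://newdomain.com/$1 [R=301,L]
```

A path-preserving 301 like this is what lets crawlers reassign the old URLs' external links to the new domain, though tools can take a while to reflect it.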
Google SERPS problem - "block all results from this domain - click here".
Anyone know what can be done about this when it happens to one of your own domains? On the Google SERPs page, underneath the title, next to the description, Google has added "Block all results from this domain?". I understand that this is a new "feature", aimed at allowing users to filter out results from low-quality, pornographic or offensive sites. But the site in question is none of the above - any ideas how to tackle this? Couldn't find anything yet by searching.
Algorithm Updates | | Understudy0 -
Keyword-rich domains sliding fast
I decided not to worry too much about the statements from Google indicating that they were going to consider keyword-rich domains as a negative for ranking, since any of the sites I work on that have them are totally relevant to the content on the sites. However, since recent Google algorithm updates I have seen these domains suddenly slide from top-3 positions to page 4 or beyond in Google SERPs. Nothing has changed on these sites in the intervening time, and no change is evident in Bing or Yahoo SERPs. Is it just my imagination, or are others seeing the same thing for keyword-rich domains? And has anyone yet determined the best way to deal with this problem?
Algorithm Updates | | ShaMenz0 -
When Pandas attack...
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the system seems stacked against this site's purpose. I need some advice on what I'm planning and what could be done.

First, the issues:

Content Length: The site is a legal reference, including a dictionary and citation look-up. Hundreds (perhaps upwards of 1,000) of pages are, by virtue of the content, thin. The acronym C.B.N.S. stands for "Common Bench Reports, New Series", a part of the English reports. There really isn't much more to say, nor is there much value to the target audience in saying it.

Visit Length as a Metric: There is chatter claiming Google watches how long a person stays on a page to gauge its value. Fair enough, but a large number of people visit this site looking for one small piece of data. They want the definition of a term or citation, then return to whatever caused the query in the first place.

My strategy so far:

Noindex some pages: Identify terms and citations that are really small (less than 500 characters) and put a noindex tag on them. I will also remove the directory links to those pages and clean the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact these pages won't be found in Google's index despite their value.

Create more click incentives: We already started with related terms and are now looking at diagrams and images. Anything to punch up the content for that ever-important second click.

Expand content (of course): The author will focus the next six months on doing his best to extend the content of these short pages. There are images and text to be added in many cases, perhaps 200 pages. Still won't be able to cover them all without a heavy cut-and-paste feel.

Site redesign: Looking to lighten up the code and boilerplate content shortly; we were working on this anyway. Resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded with AJAX upon scroll. Ad units will be kept at 3 per page.

What do you think? Are the super-light pages of the citations and dictionary why site traffic is down 35% this week?
Algorithm Updates | | sprynewmedia0
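The noindex step in the strategy above is typically a robots meta tag in each thin page's head. A minimal sketch; the page title is a hypothetical example based on the C.B.N.S. citation mentioned in the question:

```html
<!-- On each thin dictionary/citation page: ask search engines not to
     index the page while still following its links, so users can
     still reach it from within the site. -->
<head>
  <meta name="robots" content="noindex, follow">
  <title>C.B.N.S. - Legal Citation</title>
</head>
```

Removing the pages from the sitemap, as planned, complements this but does not by itself de-index them; the meta tag (or an X-Robots-Tag header) does that.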