Hit hard by Panda 3.3 and Penguin. What to do?
-
Hi there.
I work with a company that was originally all white hat, then began to dabble in some pretty serious black hat activities last year (mostly paid links in private blog networks). At the time we saw tremendous results: many of our most competitive keywords shot up 20 or 30 positions into the top 10. And they didn't seem to budge so long as we kept those (very expensive) links intact.
Alongside all of this, we have had a lot of white hat activity going on (pretty much everything recommended by Google/SEOmoz is ALSO in effect on this domain: lots of consistent, relevant blogging, social media, good content, good on-site SEO, etc.), which I credit for SOME of our keyword-ranking success, but what really made the difference was the paid linking. Let's just say we had two different mindsets behind the SEO strategy of the company, and the "get rich quick" one worked for a while. Now it doesn't. (Can you guess whether I'm the white hat or the black hat at the company?)
So here's my question. I have made the effort to contact all of the webmasters behind our egregious links and, as everyone else has described, it is effectively useless. Especially after the excellent post by Ryan Kent on this question (http://www.seomoz.org/q/does-anyone-have-any-suggestions-on-removing-spammy-links), I have more or less given up on contacting these webmasters on a case-by-case basis and asking for the links to be removed, especially if Google is not going to accept anything less than a perfect backlink profile. It is LITERALLY IMPOSSIBLE to clean up these links.
Meanwhile, this company is a big name in a very competitive online market and it really needs to see lead generation from organic SEO. (Please don't give me any told-you-so's here, it was out of my hands.)
MY QUESTION IS:
WHAT SHOULD WE DO? Should we just keep the domain going and focus only on building quality links from now on? Most of our keywords fall anywhere from position 40 to position 150 right now, so it's not like ALL hope is lost. But as any SEO knows, that is basically as good as not being indexed at all.
OTHER OPTION: We have an older domain that is less SEO-friendly, but it is our official companyname.com, and it is currently 301'd to our live (SEO-friendly) domain. The companyname.com domain is also older than our SEO-friendly domain. Should we manually move our site back over to the old domain, since there is no penalty on it? It seems like a lot of the sites that are ranking are brand new anyway (except their URLs are loaded with keywords).
Blah, I know that was a lot, but I'm feeling lost and ANY insight would be helpful.
Thanks as always, SEOmoz!!
-
Thank you Rand. I also think this is the best idea. Really appreciate the help.
-
I've not seen penalties transfer via the 301 very often (in fact, I've only heard stories of it but never seen it confirmed with a public example). I'd probably do the 301 - as you said, it's not a great experience otherwise for visitors who bookmarked or get referred to the old domain.
If you're really nervous, you could create a message that shows up on the site and refers visitors to the new location, but that's a lot of extra work, and requires that extra click, which isn't great for UX.
I suppose if you're convinced Google is going to pass the penalty through, you could still use the 301 but block the old site from being crawled via robots.txt. Google wouldn't actually see the site being moved over, which would show Google you're doing this purely for UX and not for SEO.
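For what it's worth, that combination is simple to set up. Here's a minimal sketch, assuming an Apache server; the domain names are placeholders, and note that robots.txt itself has to stay reachable on the old domain, or crawlers can never read the block:

```
# robots.txt on the OLD domain -- blocks all crawling, so Googlebot
# never follows the redirects and the move reads as UX-only
User-agent: *
Disallow: /

# .htaccess on the OLD domain -- 301 every request except robots.txt
# to the same path on the new domain
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteRule ^(.*)$ http://newdomain.com/$1 [R=301,L]
```

Human visitors (and bookmarks/referrals) get redirected seamlessly, while the crawl block keeps search engines from processing the move.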
-
Wow, a response from Rand! I'm honored :-D. Thank you for your input.
You're definitely right about Google "scaring" people into White Hat SEO and I think they were very effective in that sense.
I'm actually going to be moving on to a new (strictly white hat) marketing company, but I need to come up with a future plan for this current (penalized) site.
If I advise this company to rebuild the website on the old domain, what would you suggest as far as redirecting the current (penalized) domain? I've heard a 301 redirect transfers the penalty to the new site. But I do anticipate that a good number of visitors will land on the penalized site. Should I build a page that doesn't redirect but tells users, "Please visit newdomain.com to learn more about our company"? Or should we keep both sites live simultaneously and create all-new content so as to avoid the duplicate content issue? Any suggestions?
Thank you all.
-
I think this is exactly what Google hoped would happen with the Penguin update - SEOs and marketers who invested in gray/black hat links would have such an utterly horrific time trying to dig out that it would scare a broad swath of the industry into more white hat territory. Whether that's actually working is arguable, but it was certainly a goal of the update.
If you are ready to make the move over to the old domain, I wouldn't stop you. However, if you've built up some valuable brand equity, visitor loyalty, and marketing prowess outside of SEO on this site, there are a few other possibilities:
- Work hard on UX and UI. Google hates penalizing beautiful sites that visitors love, and if you do get a manual review, this can help.
- Make the content truly exceptional, too. Ensure that there's nothing on the site that feels artificial, manipulative, or done-just-for-rankings. Again, this makes it more likely that any reconsideration request will work.
- Send out as many requests for link removal as possible, and include in your reconsideration request the lists of where/how you acquired the links and how you've tried to remove them.
- Hope and pray
This process might not get you back in, but it could work. Google's requiring a "good faith" effort and some proof of said effort, but there's a possibility your site might get by. For the future, I'd strongly recommend sticking to entirely editorially given/earned links.
Wish you luck!
-
If you're having to contact webmasters, I would bet at least some of them are being bombarded by similar requests. If these are the kinds of sites I suspect we're dealing with (low-quality sites built with no care, regard, or concern for anything beyond low-level SEO), their owners are very unlikely to care about people harmed by the changes. Just as likely, they'd rather spend their time figuring out the next get-rich-off-nonsense scheme.
-
Thank you for the advice, Alan. Maybe we can hope that over time all those sites get deindexed and the links disappear on their own, because I'm finding it impossible to reach the webmasters, and the ones I do reach don't seem eager to help us out.
-
Play with fire, get burned. Yes, you already know that. No, I don't think you personally should have to suffer through untold similar clichés.
So here's the reality. Without cleaning up the profile, it's highly unlikely the site will ever recover. That leaves the only other reasonable option, which is the drastic one. Abandon the existing domain as far as SEO goes and start fresh with the clean domain.
That's potentially going to be the biggest challenge to get others to agree to. Those who have the guts to play with fire in a known dangerous environment typically have too big an ego to admit there isn't yet another quick and easy fix that will instantly reap rewards on a scale that never should have been achievable in the first place.
However, even Matt Cutts said this past week that in worst-case scenarios people may simply need to start over with a new site. When Matt comes out and says that, you can be sure the odds of a burned site ever rebounding are lower than they have ever been.
Related Questions
-
Vetting Link Opportunities that are Penguin-Safe
I am looking to go after sites that are not, and never will be, affected by Penguin/Panda updates. Is there a tool or a general rule of thumb for avoiding risky sites? Is there a method anyone is currently using to get good, natural links post-Penguin 2.0?
White Hat / Black Hat SEO | | dsinger0 -
Panda Recovery: Is a reconsideration request necessary?
Hi everyone, I run a 12-year-old travel site that primarily publishes hotel reviews and blog posts about ways to save when traveling in Europe. We have a domain authority of 65 and lots of high-quality links from major news websites (NYT, USA Today, NPR, etc.). We always ranked well for competitive searches like "cheap hotels in Paris" for many, many years (like 10 years).

Things started falling two years ago (April 2011). I thought it was just normal algorithmic changes and that our pages were being devalued (and perhaps they were). So we continued to bulk up our reviews and other key pages, only to see things continue to slide. About a month ago I lined up all of our inbound search traffic from Google Analytics and compared it to SEOmoz's timeline of Google updates. It turns out that every time there was a Panda roll-out (from the second one in April 2011), our traffic tumbled. Other updates (Penguin, etc.) didn't seem to make a difference.

But why should content we invest so much in take a hit from Panda? It wasn't "thin." But thin content existed elsewhere on our site: we had a flights section with 40,000 pages cranked out of our database with virtually no unique content. We had launched that section in 2008, and it had never been an issue (and had mostly been ignored), but now, I believed, it was working against us. My understanding is that any thin content can work against the entire site's rankings. In summary: we had 40,000 thin flights pages, 2,500 blog posts (rich content), and about 2,500 hotel-related pages (rich and well-researched "expert" content).

So, two weeks ago we dropped almost the entire flights section. We kept about 400 pages (of the 40,000) with researched, unique, and well-written information, and we 410'd the rest. Following the advice of so many others on these boards, we put the "thin" flights pages in their own sitemap so we could watch their index count fall in Webmaster Tools. And we watched (with some eagerness and trepidation) as the error count shot up. Google has found about half of them at this point.

Last week I submitted a reconsideration request to Google's spam team. I wasn't sure if this was necessary (the whole point of dropping the pages and 410'ing them was to fix things on our end, which would hopefully filter through to the SERPs eventually), but I thought it was worth sending a note explaining the actions we had taken, just in case. Today I received a response from them. It includes:

"We reviewed your site and found no manual actions by the webspam team that might affect your site's ranking in Google. There's no need to file a reconsideration request for your site, because any ranking issues you may be experiencing are not related to a manual action taken by the webspam team. Of course, there may be other issues with your site that affect your site's ranking. Google's computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users. If you've experienced a change in ranking which you suspect may be more than a simple algorithm change, there are other things you may want to investigate as possible causes, such as a major change to your site's content, content management system, or server architecture. For example, a site may not rank well if your server stops serving pages to Googlebot, or if you've changed the URLs for a large portion of your site's pages..."

And thus, I'm a bit confused. If they say there wasn't any manual action taken, is that a bad thing for my site? Or is it just saying that my site wasn't under a manual penalty, but that Panda perhaps still penalized us (through a drop in rankings), and Panda isn't considered "manual"? Could the 410'ing of 40,000 thin pages actually raise some red flags? And finally, how long do these issues usually take to clear up? Pardon the very long question, and thanks for any insights. I really appreciate the advice offered in these forums.
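(For anyone attempting the same cleanup: the 410 approach described above can be done in a few lines of Apache config. This is only a sketch, and the /flights/ path is a placeholder for wherever the thin section actually lives; any kept pages would need to be moved or excluded from the rule.)

```
# .htaccess sketch -- return "410 Gone" for the dropped thin section.
# A 410 signals permanent removal, which search engines generally
# treat as a stronger hint than a 404.
RewriteEngine On
RewriteRule ^flights/ - [G,L]
```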
White Hat / Black Hat SEO | | TomNYC0 -
My Penguin Recovery Attempt
So I have decided today to attempt to beat the odds and try to make a full recovery from the Penguin update. I am going to create a public Google Doc, linked at the end of this post, for all of you to see. I am going to meticulously go through a massive backlink audit of my site and see if I can win back some of the main keywords I lost back on April 24th. I want to clarify that I was NOT affected by Penguin 2.0, but I was by 1.0 and have not recovered since.

To be honest, I feel like I have done everything up to now, but I realized I needed to make this a very long journey: a massive audit of my backlink profile, which contains thousands of backlinks that I have honestly been avoiding. I just want to see whether, if I put in the work, I will see the results, and maybe it can help others. I will document everything I do in detail, along with dates. I'm sure there will be plenty of coffee-fueled nights of jibber-jabber... and I apologize for that ahead of time.

I hope at the end there is light, and that it can shed some on others. I am starting with a blank canvas, so keep checking back on my progress. I generally work at night, so you will see most changes in the morning. Here is the link to my Doc: http://bit.ly/11dUkzc. Wish me luck!
White Hat / Black Hat SEO | | cbielich1 -
Speculations about current and future status of Panda
I would like to discuss how you think Panda is currently affecting the Google index, and whether the much-discussed "Panda update" is still a hot topic in the SEO world. We got the last official manual update back on March 14, and at that time Matt Cutts said that Panda would be integrated into the regular algorithm. Fact is, since then I haven't heard about Panda anymore, even though my main e-commerce website, as well as many others, is still under a strong Panda penalty. I have worked hard over the past two months to clean up my site, removing thin and duplicate content, but so far I haven't gotten any positive signs from Google. Can we say that Panda is now officially integrated into the algorithm? Do we have any signs of that? If so, why can't we see any improvements on our sites, even well cleaned up? Thoughts? Speculations? I am eager to hear your thoughts on this very sensitive issue, which seems to have been forgotten a little bit in the past few weeks. Thanks!
White Hat / Black Hat SEO | | fablau0 -
I think I've been hit by Penguin - Strategy Discussion
Hi, I have a network of 50 to 60 domains with duplicated content, whose names are basically a geographic location plus the industry I am in. All of these websites link to my main site. Over the weekend I saw my traffic fall, and I attribute our drop in rankings to what people are calling Penguin 1.1. I want to keep my other domains, as we are slowly creating unique content for each of those sites. In the meantime, however, I clearly need to deal with the inbound linking and anchor text problem. Would adding a nofollow tag to all links that point to my main site resolve my issue with Google's Penguin update? Thanks for the help.
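(For reference, a nofollow'd cross-network link of the kind described above would look like this; the domain and anchor text are placeholders:)

```
<!-- On each geo-domain, rel="nofollow" on the link back to the main
     site asks Google not to pass PageRank or anchor-text signals -->
<a href="http://www.main-site.com/" rel="nofollow">Main Site</a>
```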
White Hat / Black Hat SEO | | MangoMan160 -
Google Penguin w/ Meta Keywords
It's getting really hard to filter through all the Penguin articles flying around right now, so excuse me if this has been addressed: I know that Google no longer uses meta keywords as ranking indicators (VERY old news). But I'm wondering if they are starting to look at them as a bigger spam indicator now that Penguin is targeting over-optimization. If yes, has anyone read a good article indicating so? The reason I ask is that I have two websites: one is authoritative and the other... not so much. Recently my authoritative website has taken a dip in rankings, a significant dip. The non-authoritative one has increased in rankings... by a lot. Now, the authoritative website's pages that use meta keywords seem to be the ones having issues... so it really has me wondering. Both websites compete with each other and are fairly similar in their offerings. I should also mention that the meta keywords were implemented a long time ago, before I took over the account. Also important to note: I never purchase links and never practice any spammy techniques. I am as white hat as it gets, which has me really puzzled as to why one site dropped so drastically.
White Hat / Black Hat SEO | | BeTheBoss0 -
Penguin Update and Infographic Link Bait
Is it still OK to use infographics for link bait now that the Penguin update has rolled out? Are there any techniques that should be avoided when promoting an infographic? Thanks
White Hat / Black Hat SEO | | eddiejsd1 -
Publishing Press Releases after Google Panda 2.5
For the past few years I have been publishing press releases on my site for a number of businesses. I have high traffic on my site. I noticed that with the Google Panda 2.5 update, PRNewswire.com's visibility dropped by 83%. Should I stay away from publishing press releases now? Does Google consider press releases to be "content scraping," since multiple sources publish the same release?
White Hat / Black Hat SEO | | BeTheBoss2