Penguin Rescue! A lead has been hit and I need to save them!
-
I had a meeting today with a prospective client who has been hit by Penguin. Their previous SEO company has obviously used some questionable techniques, which is great for me, bad for the client. Their leads have dropped from 10 per day to 1 or 2. Their analytics shows a drop after the 25th, and a backlink check shows a lot of low-quality links. Domain metrics are pretty good and they are still ranking OK for some keywords. I have 1 month to turn it around for them. How do you wise people think it can be done?
First of all, I will check the on-site optimisation and ensure that the site isn't over-optimised. Secondly, do I try to remove the bad links, or just hit the site with good content and good links to outweigh the bad ones?
Also, do you think G is actually dropping rankings for the over-optimisation / bad links, or are the links just being discredited, resulting in the drop in rankings? Two very different things.
Any advice is appreciated. Thanks
-
This sounds like a plan. Give it a shot and test the results
-
Does anyone care to share their view on my last post?
I have run backlink checks and they have sitewide footer links from 2 of their other businesses. This has created thousands of backlinks with the exact same anchor text. Do you think this could cause a problem?
I'm thinking of reducing it to just 2 links each from the 2 sites.
Other than that, the backlink profile looks pretty normal except for the repeated anchor text.
-
Thanks for all the responses guys, I have taken them on board. The one thing I have noticed is covered in my post above: the sitewide footer links from 2 of their other businesses, and the thousands of identical-anchor backlinks they have created.
Thanks
-
I second the time frame issue. One month won't be enough time, and your work will just benefit the next person this client hires to work on it, while you'll be left with an upset client because of mismanaged expectations.
-
"you need to fix whatever issues are there, wait for the algorithm to process again, and then if you've solved the issues, you should theoretically restore the rankings. That's much easier said than done. You don't know exactly what the issues are, and we don't know when the algo will process again."
I agree with this 100%.
These types of problems can be fixed, but then you must wait until Google re-evaluates the site and republishes it back into the SERPs. Sites hit with these types of problems escape in batches - not the moment things are fixed.
So you could do great work, get it fixed on the 25th day, and then Google might not reprocess and republish for 60 more days, and some other SEO gets credit for your hard work.
I don't think pointing good links into the site will get rid of the issue with the problematic links and clear you of the algo.
Exactly... What are good links? Your "added" links will not be natural.
-
Well, from what everyone is writing about Penguin, it's an algorithmic update. Meaning you need to fix whatever issues are there, wait for the algorithm to process again, and then if you've solved the issues, you should theoretically restore the rankings. That's much easier said than done. You don't know exactly what the issues are, and we don't know when the algo will process again.
I think the timeline you have set is highly unrealistic, and you should aim to set expectations with the client that this process can very well take much longer. If the previous SEO company built problematic links, I think you'll have to deal with them. I don't think pointing good links into the site will get rid of the issue with the problematic links and clear you of the algo. I think you're going to have to go through the tedious work of cleaning things up.
The good news is that a bunch of people have written about what to look for. Check Google Webmaster Tools for sitewide links, and check the anchor text pointing into the site. Export your external links from OSE, then upload them to Link Detective - http://linkdetective.com/ - let it do the hard work for you and classify a lot of the links. Then you need to go through the process of cleaning things up, doing as much as you can, and submit a reconsideration request (may help, may not), hoping Google will discard the other links.
Good luck - really try to demonstrate to your client the complexity of the process and extend the timeframe of the project. That's my ultimate recommendation.
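The anchor-text check described above can be roughed out in a few lines once you have a link export. A minimal sketch, assuming a hypothetical CSV-style export with an "anchor" column (column names vary by tool, so adjust to your export's header):

```python
import csv
from collections import Counter

def anchor_distribution(rows, top=5):
    """Tally anchor texts from a backlink export and surface the
    most-repeated ones (heavy repetition is the Penguin red flag).
    `rows` is an iterable of dicts with an 'anchor' key."""
    counts = Counter(r["anchor"].strip().lower() for r in rows if r.get("anchor"))
    total = sum(counts.values())
    return [(anchor, n, round(100 * n / total, 1))
            for anchor, n in counts.most_common(top)]

# Inline toy data standing in for a real export; with a real file you'd use
# rows = csv.DictReader(open("links.csv")) instead.
sample = ([{"anchor": "blue widgets"}] * 800
          + [{"anchor": "Acme Ltd"}] * 150
          + [{"anchor": "click here"}] * 50)
for anchor, n, pct in anchor_distribution(sample):
    print(f"{anchor}: {n} links ({pct}%)")
```

If one commercial phrase dominates like "blue widgets" does here (80% of links), that's the kind of pattern worth cleaning up first.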
Related Questions
-
Self referencing canonicals and paginated content - advice needed
Hi, I help manage a large site that uses a lot of params for tracking, testing, and to help deal with paginated content, e.g. abc.com/productreview?page=2. The paginated review content correctly uses rel=next and rel=prev tags to ensure we get the value of all of the paginated review content that we have. The volume of param exclusions I need to maintain in Google & Bing Webmaster Tools is getting clunky and frustrating, so I would like to use self-referencing canonicals, which would make life a lot easier.
Here's my issue: if I use canonicals on the review pages, the paginated content URLs would also use the same canonical, e.g. /productreview?page=2 pointing to /productreview. I believe I am going to lose the value of those reviews, even though they use the rel=next/rel=prev tags. BTW, Airbnb do this - do they know something I don't, do they not care about the paginated reviews, or are they doing it incorrectly? See http://d.pr/i/14mPU
Is my assertion above correct about losing the value of the paginated reviews if I use self-referencing canonicals? Any thoughts on a solution to clearing up the param problem, or do I have to live with it? Thanks in advance, Andy
Intermediate & Advanced SEO | AndyMacLean
-
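On the pagination question above: the pattern commonly cited at the time was that each paginated URL should canonicalise to itself (keeping its page param), not to page 1, so the rel=prev/next chain stays intact. A hypothetical head section for page 2, with the URLs invented to match the example in the question:

```html
<!-- Hypothetical <head> for page 2 of a paginated review series.
     The canonical points at THIS page's own URL, not at page 1,
     so the prev/next chain keeps the paginated reviews indexable. -->
<link rel="canonical" href="https://abc.com/productreview?page=2">
<link rel="prev" href="https://abc.com/productreview">
<link rel="next" href="https://abc.com/productreview?page=3">
```

Canonicalising every page to /productreview (as in the Airbnb screenshot) is the variant that risks dropping the value of pages 2+.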
Need help with Robots.txt
An eCommerce site built with MODX CMS. I found lots of auto-generated duplicate page issues on that site, and now I need to disallow some pages from that category. Here is what the actual product page URL looks like:
product_listing.php?cat=6857
And here is the auto-generated URL structure:
product_listing.php?cat=6857&cPath=dropship&size=19
Can anyone suggest how to disallow this specific category through robots.txt? I am not so familiar with MODX and this kind of link structure. Your help will be appreciated. Thanks
Intermediate & Advanced SEO | Nahid
-
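One hedged sketch for the robots.txt question above, assuming the auto-generated duplicates are the ones carrying the cPath parameter. Note the `*` wildcard is a de-facto extension honoured by Google and Bing, not part of the original robots.txt standard:

```
User-agent: *
Disallow: /product_listing.php?*cPath=
```

This blocks crawling of any product_listing URL whose query string contains cPath=, while leaving the plain product_listing.php?cat=6857 pages crawlable. Bear in mind robots.txt only stops crawling; duplicates already indexed may need rel=canonical or a noindex tag instead.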
We are redesigning our existing website. The domain is staying the same, but the sub-structure and page names are changing. Do I still need to do 301s?
We are redesigning our existing website. The domain is staying the same, but the sub-structure and page names are changing. Do I still need to do 301 redirects, or will search engines know to remove the old 404 pages from the SERPs?
Intermediate & Advanced SEO | GrandOptimizations
-
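For the 301 question above: yes, redirects are how the old URLs pass their value to the new ones rather than dying as 404s. A minimal, hypothetical Apache sketch (paths invented for illustration; equivalent directives exist for nginx and IIS):

```apache
# Permanent (301) redirect for a single renamed page (mod_alias):
Redirect 301 /old-section/old-page.html /new-section/new-page/

# Pattern-based version for a whole renamed directory (mod_rewrite):
RewriteEngine On
RewriteRule ^old-section/(.*)$ /new-section/$1 [R=301,L]
```

Map each old URL to its closest new equivalent rather than redirecting everything to the homepage.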
New g(TLD) advice needed
Hey all, I'm a bit confused by conflicting advice, need some direct input. We're quite experienced in SEO but that doesn't mean we can't get better 🙂 I manage a very old, well established, very generic TLD portal that ranks very highly in MANY keywords. (If you know our domain, I'd appreciate not naming it here) (145 1-3 ranks, 342 1-20 ranks) but there are also many topics we want to improve upon. Lets say, for example, I own gold.com, but I've failed to rank for 'gold events' and I acquired gold.events. What is the thought as to using some of the g(TLD)s versus the original .com? In the example events.gold.com or gold.events or gold.com/events/? I really can't find a consensus on which would bemost effective for SEO purposes. In a more general aspect of the same question, we own MANY "gold.newg(TLD)" domains and are conflicted as to best use of all of them. All advice greatly appreciated. Nat
Intermediate & Advanced SEO | WorldWideWebLabs
-
Have you guys seen this yet: panguintool.com to help identify what hit the ranks
Have you guys seen this yet: panguintool.com. It works with GA to show your traffic superimposed with Panda/Penguin/other updates, to help discover what hit the ranks. Looks interesting; it requests access to your GA account.
Intermediate & Advanced SEO | irvingw
-
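The tool's core idea - lining a traffic series up against known algorithm-update dates - can be sketched in a few lines. The update dates and the drop threshold below are illustrative only, not an authoritative list:

```python
from datetime import date, timedelta

# Illustrative update dates only - check a maintained change history for real ones.
UPDATES = {date(2012, 4, 24): "Penguin 1.0", date(2012, 5, 25): "Penguin 1.1"}

def drops_near_updates(daily_visits, threshold=0.3, window_days=3):
    """Return (date, update_name) pairs where day-over-day traffic fell by
    more than `threshold` within `window_days` of a known update date."""
    hits = []
    days = sorted(daily_visits)
    for prev, cur in zip(days, days[1:]):
        before, after = daily_visits[prev], daily_visits[cur]
        if before and (before - after) / before > threshold:
            for upd, name in UPDATES.items():
                if abs((cur - upd).days) <= window_days:
                    hits.append((cur, name))
    return hits

# Toy series: steady traffic, then a crash the day after Penguin 1.0 rolled out.
visits = {date(2012, 4, 20) + timedelta(days=i): 1000 for i in range(5)}
visits[date(2012, 4, 25)] = 400
print(drops_near_updates(visits))
```

A drop landing inside the window is only correlation, of course, but it tells you which update's criteria to audit against first.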
Hit by Google updates; Some good advice needed
Hi, here's my domain: http://www.kent-website-designer.co.uk/. Registered in 2007. We have taken a big hit from the updates in the last 6 months and it's really affecting revenue. I know when you look at the site you may well think "WOW, this is 2007 SEO" and you're right - it hasn't been updated in some time, as last year we ranked very highly and it gave us enough business to concentrate on. However, up until last year many of my competitors were using the same on-page and off-page strategies... and probably a few of you were too! Now the inquiries and income are drying up. I provided myself with an income from my efforts, rather than be unemployed, so I want to get it back on track. I visited the Google webmaster forums with a couple of webmaster account queries and basically got beaten up by the rude and arrogant forum admins, who said I was a spam site that shouldn't be in business. How very nice!
1. I have an EMD - but domain age should mean something?
2. I lost a few links from https://www.getsafeonline.org/partners-and-supporters/ in the last year when they reorganised their content, which hasn't helped. Same with other trusted sites; we are left with low-quality links.
3. Some CMS sites have replicated our footer links on a large scale, which wasn't intentional but may look like link spam; plus they aren't nofollowed as G prefers.
4. Has Google become intelligent enough to understand context and meaning, so that outdated SEO advice in the content itself is detected as spam?
5. No pages are de-indexed, just a rank drop to positions 30-60.
6. Over-optimised H1s?
7. Is the pipe character in titles now a negative?
So it's sink or swim time, I guess. The site and domain are honest but neglected, and we should probably realign the business with what we can offer. We got away with that SEO, but clearly things have changed. However, with no grey or black hat, at least we aren't overly worried about removing links. We're also looking for an SEO company we can outsource to with a white-label solution in order to offer SEO. I don't need beating up - short and to-the-point critiques please, pros and cons. Many thanks.
Intermediate & Advanced SEO | xtopher66
-
Penguin Penalty?
The past 2 days, specific keywords I've been ranking well for have disappeared. If I Google the specific keyword together with the brand, the site still shows up, so I haven't been removed from the index. Is it possible that I was hit by Penguin without any kind of notice in my Webmaster Tools account? Organic traffic dropped substantially in the past couple of days without any warnings. Any help greatly appreciated! Thank you
Intermediate & Advanced SEO | TP_Marketing
-
Penguin Update Issues.. What would you recommend?
Hi, we've been pretty badly hit by this Penguin update - site traffic is down 40-50%. We suspect it's for a couple of reasons:
1) Google is saying we have duplicate content. E.g. for a given category we will have 4-5 pages of content (products), so it's saying pagenum=2, pagenum=3 etc. are duplicate pages. We've implemented rel=canonical so that pagenum=2 points to the original category, e.g. http://mydomain/widgets.aspx, and we've even specified pagenum as a URL parameter that paginates. Google still hasn't picked up these changes - how long does it take? It's been about a week.
2) They're saying we have soft 404 errors. E.g. when we remove a category or product, we point users to a category/page-not-found page. Is it best to block Googlebot from crawling these pages via robots.txt, since we really don't care about these categories or product pages? How best to handle this?
3) Some bad directories and crawlers have crawled our website but published incorrect links, so we've got something like 1700 product-not-found errors. I'm sure that's taking up a lot of crawl time. How do we tell Google not to bother with links coming from specific sources, e.g. ignore all links coming from xxx.com?
Any help will be much appreciated as this is killing our business. Jay
Intermediate & Advanced SEO | ConservationM
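On the soft-404 point in the question above: the usual fix is to make removed pages return a real 404 (or 410) status rather than a 200 response showing "not found" copy, since a 200 with not-found content is exactly what Google flags as a soft 404. A quick, hypothetical Python check - the marker phrases are placeholders for whatever your own "not found" template actually says:

```python
def looks_like_soft_404(status_code, html):
    """A page that answers 200 but shows 'not found' copy is a soft 404.
    The marker phrases are placeholders - match your own template's wording."""
    markers = ("page not found", "product no longer available")
    body = html.lower()
    return status_code == 200 and any(m in body for m in markers)

# A real 404 status is fine; a 200 carrying not-found copy is the problem.
print(looks_like_soft_404(200, "<h1>Page Not Found</h1>"))   # True
print(looks_like_soft_404(404, "<h1>Page Not Found</h1>"))   # False
```

You could run this over the 1700 flagged URLs (fetching each with urllib or similar to get status and body) to separate genuine 404s from soft ones that need their response code fixed.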