What's the best way for an SEO newbie to analyze & fix a site after being hit by Panda?
-
Hi,
I have a prospective client who was in the top 3 on Google for two of their primary keywords. They fell way back in the rankings immediately after the Panda update that rolled out on September 27, 2012. Two weeks ago, they were at position #118 for one keyword. When I looked for them in Rank Checker today, they couldn't be found in Google at all.
Here's my question. Because of the "bad links" (some pointing to porn sites)... what's the possibility that this situation cannot be fixed?
I don't know... maybe I'm asking an irrelevant question. I'm attempting to assess the situation so I can go back and present my findings to the prospective client, and I'm committed to understanding what's going on with their website before I do.
Fixing their problem starts with a correct assessment.
They have a ranking problem, and I know I can fix that... IF all their site needs is white hat SEO.
What I DON'T want is... to sell them SEO services, only to find out in 3-6 months that I made an incorrect diagnosis of the problem and therefore sold them the wrong solution. I know I can close the sale if I can show them, with reasonable substantiation, that the damage is not beyond repair.
I'm familiar with the basics of SEO, but I'm unfamiliar with how "bad linking" might affect the long-term commitment to optimizing. They're wondering if they should start over on another website. I was attempting to do an assessment to better understand whether my typical approach would be sufficient on this site. Also... I wanted an assessment/report I could show them to substantiate my conclusion(s) about their website.
If Open Site Explorer is sufficient to do the link analysis... great. At least I know I'm working with the right tool. All I have to do is learn how to use the tool quickly. At this point... I'm not sure which tool would be helpful.
So... can you speak to the following 2 questions:
1.) How do you know when a ranking problem is beyond fixing?
2.) What software/tool is ideal for doing some link analysis in order to assess the problem and prescribe a solution?
Thanks so much!
Ramon
-
Thank you for your reply.
Ramon
-
Thanks Dave... your post is very helpful.
Ramon
-
Great point/question, Chris. I was referred to them by a CPA whom I helped with his SEO. He had a very basic issue and began to see immediate improvements. It made the introduction a no-brainer for him.
Chris... I appreciate your candor.
Ramon
-
Your first step is to understand: "What is Panda?" and "What is Penguin?"
Go learn that, be able to explain it exceptionally well, then come back here and ask questions.
You need to understand the basics before you can fix another person's website.
-
I'll tell you, I have to wonder why a company with such a major problem is requesting that a "newbie" fix it for them. Nothing against you, Ramon, but my first thought is that they're hoping to get someone to spend a bunch of time on it for cheap, which could have been what got them in trouble in the first place.
If they're coming to you for advice and they know you're new to SEO, be up front with them and say your best advice is that they move all the content (minus any low-quality outbound links) to a new domain and start over. I wouldn't even redirect the old site to the new one. Unless they're going to pay you well for all the time you're going to take learning everything you need to learn to make a serious effort at solving a difficult problem such as this, cut to the chase and see what they say.
If they go for that, then you can spend your time fixing the problems you know you can fix and help them build a better link profile. That's my two cents.
-
Hi Ramon,
This is a very difficult question to answer. Before I even try to answer it I'd like to point out the following:
The first Panda filter came out on Feb 25, 2011. On Feb 24, 2012, Search Engine Roundtable published a survey asking how many webmasters had recovered from the filter one year later. The survey can be found here. As you can see, not many were able to recover even a full year later. This is by no means a conclusive study, but it's really the best we have.
To answer your questions:
1. In my opinion, it's beyond fixing if you lose rankings and you did NOT receive a warning from Google. If you received a warning, consider yourself lucky. Follow Google's instructions and use a service like linkdelete.com to get the bad backlinks removed (or just do it yourself).
2. Open Site Explorer works great for identifying bad backlinks. Go through the list of exact-match keyword anchor texts and you'll be off to a good start; there's a rough sketch of that kind of check right after this list.
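For what it's worth, here is a minimal Python sketch of the anchor-text check I mean. It assumes you've exported your inbound links to a CSV; the column names ("URL", "Anchor Text"), the file name, and the target keywords are placeholders rather than any official export format, so adjust them to whatever your tool actually produces.

```python
# A rough sketch, not an official Moz/OSE feature: count, per linking domain,
# how many inbound links use an exact-match keyword as their anchor text.
# Column names, file name, and keywords below are assumptions -- change them
# to match the headers and keywords you actually have.
import csv
from collections import Counter
from urllib.parse import urlparse

TARGET_KEYWORDS = {"blue widgets", "cheap blue widgets"}  # hypothetical keywords

def flag_exact_match_anchors(csv_path):
    """Return a Counter of linking domains that use exact-match keyword anchors."""
    counts = Counter()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = row.get("Anchor Text", "").strip().lower()
            if anchor in TARGET_KEYWORDS:
                domain = urlparse(row.get("URL", "")).netloc
                counts[domain] += 1
    return counts

if __name__ == "__main__":
    for domain, n in flag_exact_match_anchors("inbound_links.csv").most_common(25):
        print(f"{domain}\t{n} exact-match anchors")
```

Domains that show up again and again with the same exact-match anchor are usually the first ones worth reviewing for removal requests.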
Finally, I'd like to plug my post on this subject:
How-To Recover From Google Penalties & Filters
It's a long one, but I include several pictures and examples to make this process easier for folks in your situation.
Good luck!