Manual Penalty Removed - Recovery Times...
-
Howdy Mozzers,
For anyone who has had experience with a manual penalty, I'd appreciate your feedback.
How long did it take to recover from a Manual Penalty?
Of course, every situation is different, and it's only been 8 days, so perhaps it's too soon.
Below is the email we received. I highlighted "believed" because they didn't state that we actually had violated the guidelines. We flagged a bunch of backlinks we didn't like, but most of these remain in our profile in GWT, so I'm not sure what the real problem was.
"Previously the webspam team had taken manual action on your site because we believed it violated our quality guidelines. After reviewing your reconsideration request, we have revoked this manual action. It may take some time before our indexing and ranking systems are updated to reflect the new status of your site."
Your feedback would be greatly appreciated.
-
That was the original email: Google had detected evidence of unnatural link building.
We're still not entirely sure what caused the issue. I assume we addressed it in our last reconsideration request, but at that point we were just hoping.
-
Unfortunately there is no warning given at all for sites hit by Penguin. Hopefully yours is just a manual unnatural links penalty!
-
Hi Marie,
Thanks for the feedback. Hopefully it won't be months, as we have already been suffering since May.
If there were underlying algorithmic issues related to Penguin... we would have received a notice, right?
Thanks for the feedback, greatly appreciated.
-
Forehead-slapping moment - I know of this, but for some reason I hadn't thought to give it a go!
Thank you. I'll keep you posted on the outcome.
PS: if you don't want more people to know, perhaps take it out of the public domain! Ha
-
I've seen some sites come back within a day or two of a manual penalty being removed. But, for others it can take a couple of weeks or even a few months. The difference is the degree of "spamminess" that you had before the penalty. If you had some really nasty blackhat stuff going on then you're more likely to have a longer penalty.
However, as others have mentioned, you could also have a Penguin issue on top of the warning in which case you won't get the full extent of your recovery until another Penguin refresh runs.
-
Hello Robert,
I will give you a method here that may assist you on the reindex side:
In GWMT, go to Health > Fetch as Googlebot.
In the URL field following the initial domain, enter an offending page and then click Fetch.
It usually takes less than 5 seconds, and you will see "Success" with a green check mark, followed by a Submit to Index button. (Given the situation, I highly urge you to click it!)
Note: if the entire site is affected, you will want to choose "URL and linked pages" on submission (you only get ten of these per month); if not, submit just that URL (you get 500 fetches a month).
This should help get you indexed more rapidly. PLEASE do not tell anyone as it is a secret!!
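As a side note, a rough sketch of a related nudge - not part of the Fetch as Googlebot flow above, and the example.com URLs are placeholders - is pinging Google with your sitemap so updated pages get refetched sooner:

```python
# Hedged sketch, not part of the GWMT flow above: pinging Google with your
# sitemap URL to request a refetch of updated pages. The example.com URLs
# are placeholders; swap in your own sitemap location.
import urllib.parse
import urllib.request

sitemap_url = "http://www.example.com/sitemap.xml"
ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")

with urllib.request.urlopen(ping) as resp:
    # A 200 response means the ping was accepted and the sitemap is queued
    # for a fetch; it does not guarantee (re)indexing of any page.
    print(resp.status, resp.reason)
```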
Good Luck,
Robert
-
Thanks for the 2c.
Keeping my eyes peeled, fingers crossed and working on the good stuff!
-
Great Video!
As you say, if the manual penalty removal doesn't let traffic bounce back, then I will be looking at any Penguin-related issues that I need to work on.
At the moment, our big focus is on creating high-quality, unique content for high-quality sites, whilst keeping all the on-page/on-site content in our control up to scratch.
I'll keep you posted.
-
You just have to be patient here; I have seen sites bounce back in days and others take weeks. If the penalty has been revoked, traffic should bounce back, so just try to hold fire.
The one caveat is that there have been a whole bunch of updates lately, and I am working with a few sites that seemingly have a manual penalty along with a bunch of Panda and Penguin issues. We have resolved some of the Panda issues and made some improvements; we can still see a bunch of Penguin-type issues that we are waiting for a data refresh on, and we are waiting for a response on the manual penalty.
Point being, you may not bounce back to where you were, and it's possible you've snowballed a few more problems along the way.
So, keep an eye on analytics, wait to see if you have a bounce-back, and then dig into your site to see whether you have Penguin or Panda issues holding you back as well.
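If you want to watch for that bounce-back programmatically, here is a minimal sketch pulling daily organic sessions from the Google Analytics Core Reporting API - it assumes a service-account key with read access to your GA view, and the key file and view ID are placeholders:

```python
# Minimal sketch of "keep an eye on analytics": daily organic sessions from
# the GA Core Reporting API, so a post-revocation bounce-back stands out.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "ga-key.json",  # hypothetical key file
    scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analytics", "v3", credentials=creds)

result = analytics.data().ga().get(
    ids="ga:12345678",             # placeholder view ID
    start_date="90daysAgo",
    end_date="today",
    metrics="ga:sessions",
    dimensions="ga:date",
    filters="ga:medium==organic",  # organic traffic only
).execute()

for day, sessions in result.get("rows", []):
    print(day, sessions)
```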
This is worth a quick watch:
http://www.youtube.com/watch?v=ES01L4xjSXE&feature=player_embedded
Marcus
-
Great news!
As they said, give it a month or two and you'll start seeing your rankings improve.
They probably apply these penalties in bulk and then manually review them when reconsideration requests come through. Whoever analysed your profile saw nothing majorly wrong, so they lifted the ban.
My 2c worth.
Related Questions
-
Does .me take more time to rank than .com?
Hi, our company website is about freight forwarding, and I'm worried about the .me extension they have taken. The location is Dubai, and the website is running Google Ads with a no-indexed landing page. My doubt is that our cargo company website, shipwaves.me, is not receiving Google Ads attention because of this. The other confusion is whether shipwaves.me will take longer than a .com extension to rank for keywords with high search volume. I'm confused about why this company has taken a .me extension - is .me a proper top-level domain, and does it take more time to rank than .com domains?
White Hat / Black Hat SEO | LayaPaul
-
Advice / Help on Bad Link Removals
Hey everyone.
I'm new to the community and new to backlinks - hence the question to the community today.
I would like help understanding the options and workload around backlinks and removing them.
I have a client with over 8,000 backlinks; a few years ago he paid someone about £10 to boost his rankings by adding thousands of backlinks.
We fear this is having a bad effect on their site and organic rankings, as 90% of these backlinks have a spam score of over 50% and are also nofollows. My questions to the community (if you could be so kind as to share) are:
1. What's the best way to decide if a backlink is worth keeping or removing?
2. Is there a tool somewhere on the internet to decide this or assist with it? I've had advice stating that if it's not hurting the page, we should keep it. However, again...
How do I know what damage each backlink is causing to the domain? I appreciate anyone's time to offer some advice to a novice looking to clear these.
White Hat / Black Hat SEO | TheITOteam
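For what it's worth, a minimal triage sketch along the lines being asked about - assuming a CSV export of the backlink profile with url, spam_score and nofollow columns (the column names and the 50% threshold are assumptions, not a standard):

```python
# Hedged sketch of one triage approach. Assumes a CSV export of the backlink
# profile with "url", "spam_score" and "nofollow" columns - the column names
# and the 50 threshold mirror the question and are assumptions, not a rule.
import csv

keep, review = [], []
with open("backlinks.csv") as f:
    for row in csv.DictReader(f):
        nofollow = row["nofollow"].strip().lower() in ("true", "yes", "1")
        spammy = float(row["spam_score"].rstrip("%")) >= 50
        # Nofollowed links pass no equity and rarely justify outreach;
        # followed links from spammy pages go on the review/removal list.
        (review if spammy and not nofollow else keep).append(row["url"])

print(f"{len(review)} links to review/remove, {len(keep)} left alone")
```
-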
Penalty
Hi, I wonder: if a subdomain gets a penalty from Google, is there any risk that the main domain gets penalized as well?
White Hat / Black Hat SEO | Rob_I
-
Removing duplicated content using only NOINDEX at large scale (80% of the website).
Hi everyone, I am taking care of a large "news" website (500k pages), which took a massive hit from Panda because of duplicated content (70% was syndicated). I recommended that all syndicated content be removed and that the website focus on original, high-quality content. However, this was implemented only partially: all syndicated content was set to NOINDEX (they think it is good for users to see standard news alongside the original HQ content). Of course, it didn't help at all - no change after months. If I were Google, I would definitely penalize a website that has 80% of its content set to NOINDEX because it is duplicated; I would consider such a site "cheating" and not worthy for the user. What do you think about this "theory"? What would you do? Thank you for your help!
White Hat / Black Hat SEO | Lukas_TheCurious
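As an illustration of the setup described - a minimal sketch assuming a Flask app (the framework and route are my assumptions): syndicated pages stay visible to users but send an X-Robots-Tag noindex header, which is what "set to NOINDEX" amounts to server-side:

```python
# Minimal sketch of the setup described above; Flask is my assumption, and
# the route is hypothetical. Users still see the syndicated page, but the
# X-Robots-Tag header asks search engines to keep it out of the index
# (a <meta name="robots" content="noindex"> tag in the <head> works too).
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/syndicated/<slug>")
def syndicated_story(slug):
    resp = make_response(f"Syndicated story: {slug}")
    # "noindex" drops the page from results; "follow" still lets crawlers
    # pass through its links to the original content.
    resp.headers["X-Robots-Tag"] = "noindex, follow"
    return resp

if __name__ == "__main__":
    app.run()
```
-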
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll premise this by saying that we like to engage in as much white hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each of our locations has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city.
Right now, our pages are dynamic and being crawled and ranking well within Google. We conducted a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages. So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain - a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically:
"Regarding pages with the specific topic areas, do you have a specific format for the meta description and the custom paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the meta and paragraph. For example, if we made the paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes, for all cities, for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as: 'Our [Topic Area] training is easy to find in the [City, State] area,' followed by other content specific to the location, or 'Find your [Topic Area] training course in [City, State] with ease,' followed by other content specific to the location. Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."
So, Mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Oftentimes I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
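A minimal sketch of the engineer's "several standardized paragraphs" idea - picking the variant deterministically from the venue code, so a given page always renders the same paragraph instead of shuffling its content on every crawl (the function and field names here are hypothetical):

```python
# Hedged sketch of the "several standardized paragraphs" idea; the function
# and field names are hypothetical. The variant is picked deterministically
# from the venue code, so a given page always renders the same paragraph.
import hashlib

TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

def intro_paragraph(venue_code: str, topic: str, city: str, state: str) -> str:
    digest = hashlib.md5(venue_code.encode("utf-8")).hexdigest()
    template = TEMPLATES[int(digest, 16) % len(TEMPLATES)]
    return template.format(topic=topic, city=city, state=state)

# Example: the same venue code always yields the same variant.
print(intro_paragraph("DC01", "SharePoint", "Washington", "DC"))
```
-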
Which SEO companies offer Penalty analysis?
I'm having a hard time finding a (good) SEO company that specializes in penalty analysis. Any recommendations? I've only found Bruce Clay, but they charge $8,000 :)...
White Hat / Black Hat SEO | wellnesswooz
-
Penguin link removal - what would you do?
Hi, over the last 4 months I have been trying to remove as many poor-quality links as possible in the hope that this will help us recover. I have come across some sites where the page our backlink is on has been de-indexed; Google shows this when I look at the cached page: "404. That's an error. The requested URL /search?sourceid=navclient&ie=UTF-8&rlz=1T4GGNI_enGB482GB482&q=cache:http%3A%2F%2Fforom.eovirtual.com%2Fviewtopic.php%3Ff%3D4%26t%3D84 was not found on this server. That's all we know." If Google is showing this message, do I still have to try to remove the link, or has Google already dismissed it?
White Hat / Black Hat SEO | wcuk
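One quick check before chasing webmasters - a minimal sketch (links.txt is an assumed export of linking-page URLs, one per line) that flags linking pages which no longer resolve at all:

```python
# Hedged sketch: flag linking pages that no longer resolve. links.txt is an
# assumed export of linking-page URLs, one per line; a page that 404s (like
# the cached example above) is usually already dead weight.
import requests

with open("links.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # Some servers mishandle HEAD; switch to requests.get if needed.
        status = requests.head(url, timeout=10, allow_redirects=True).status_code
    except requests.RequestException:
        status = None  # unreachable hosts are effectively dead links too
    print(f"{status or 'ERR'}\t{url}")
```
-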
How do you remove unwanted links, built by your previous SEO company?
We dropped significantly (from page 1 for 4 keywords... to ranking over 75 for all) after the Penguin update. I understand that trustworthy content and links (along with site structure) are the big reasons for staying strong through the update, and that sites which got these things wrong were penalized. In an effort to regain Google's trust, we are reviewing our site structure and making sure to produce fresh, relevant content on our site and social media channels on a weekly basis. But how do we remove links that were built by our SEO company, some of which point from untrustworthy/irrelevant sites with low rankings? Should we try to email the webmasters of those sites (using data from Open Site Explorer)?
White Hat / Black Hat SEO | clairerichards