Google Manual Penalty Lifted - Why is my website's traffic still decreasing?
-
Hi there,
I was hoping somebody might have an answer to this, or that someone else has experienced the same issue.
Our website was recently hit by a manual penalty (the structured data wasn't matching the content on the page).
After working hard on this to fix the issue across the site, we submitted a reconsideration request which was approved by Google a few days later.
I understand that not all websites recover and it doesn't guarantee rankings will go back to normal, but it seems as if the traffic is continuing to drop at an even quicker rate.
There are a number of small technical optimisations that have been briefed into the dev team, such as:
- Redirecting duplicate URL versions and fixing redirects on internal links.
There's also on-page work running in the background: fixing up keyword cannibalization, consolidating content, keyword mapping and ensuring the internal link structure is sound.
Has this happened to anyone else before? If so, how did you recover?
Any suggestions/advice would be really appreciated.
Thank you
-
I have seen things like this happen before, but they're usually associated with a links penalty rather than a rich snippet spam penalty. When Google remove the authority pipelines for bad links, they don't magically decide to start valuing those linking sites again due to a reconsideration request (so in that area, it's common for people to get into an awful mess of unrealistic expectations)
With rich snippet spam penalties, I have seen some pretty savage ones, but usually they are more of an on/off scenario. The kind of continual decline which you say you are experiencing is quite unusual.
Technical factors can influence ranking results, but they tend to influence indexation more than they influence rankings (e.g. making URLs which were previously hard to discover easier for Google to discover, so new ranking positions can be created). Technical changes are (usually, there are exceptions) less good at pushing up existing rankings, which is more the domain of content, awesomeness and link-worthiness.
"- Redirecting duplicate versions, fixing redirects on internal links"
This is something that can be done with the best of intentions, yet which can often be done wrong. For example, maybe you own a site and you notice that two versions of the same URL are both accessible (200 OK):
One has a trailing slash, the other does not. So you say to yourself, okay, what we'll do is redirect one structure to the other! Seems logical, right? But what if one of your structures (non-trailing slash) was more commonly linked to than the other (forced trailing slash)? When you make your change, suddenly most of your most important backlinks are hitting 301 redirects instead of hitting your landing pages directly. In this hypothetical example, if you had picked the alternate structure (removing the trailing slash from URLs instead of forcing it) then the site may have performed much better. This is just a hypothetical illustration, but it shows that simple ideas are never simple! In SEO we get paid for our analytical skills because they do matter; people need proper analysis before making sweeping decisions, otherwise they don't realise the potential ramifications.
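To make that concrete, here is a minimal sketch of the kind of check I mean; the URLs and backlink counts are entirely made up, and in practice you would pull the numbers from your own backlink data and weigh authority rather than raw counts:

```python
# Hypothetical illustration only: before forcing one URL structure,
# check which variant actually holds the backlinks.
# The URLs and counts below are invented for the example.

backlinks_by_url = {
    "https://example.com/widgets": 420,   # non-trailing-slash variant
    "https://example.com/widgets/": 35,   # forced-trailing-slash variant
}

def pick_canonical(variant_a: str, variant_b: str, backlinks: dict) -> str:
    """Return the variant with more known backlinks, i.e. the one
    the other variant should be 301-redirected to."""
    return max((variant_a, variant_b), key=lambda url: backlinks.get(url, 0))

canonical = pick_canonical(
    "https://example.com/widgets",
    "https://example.com/widgets/",
    backlinks_by_url,
)
print(f"Redirect the other variant to: {canonical}")
```

The point isn't the script, it's the habit: measure which variant holds the equity before you pick a redirect direction.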
"There's also work on-page running in the background fixing up keyword cannibalization, consolidating content keyword mapping and ensuring the internal link structure is sound"
Again, you may be shooting yourself in the foot in the short term. I am referring to what you term "consolidating content", which usually revolves around reducing the number of pages on your website and funneling some content together into fewer, more in-depth URLs which you hope will rank better. Totally the right thing to do in the long term, but in SEO, many strategies which yield long-term gains also cause disruption which causes short-term tail-off. If you JUST pulled yourself out of a penalty, was it really the right time to 'get disruptive'? I'd say no, it was not.
If you are consolidating content, Google may or may not rank your single new page as well (for different keywords) as the two or more pages which were funneled into the creation of your new page. Why? Well, from a technical POV, even when you deploy the mighty 301 redirect, it doesn't always transfer 100% SEO authority from the old URL(s) to the new URL
Google tend to run similarity checks over their last active cache of the old URL(s) against the new page which you have supplied. If the two are some percentage dissimilar, then that percentage of SEO authority is removed from the equity transfer of the 301 redirect. By similar, I mean something akin to taking all the content from both (old vs new) page variants and running something like a simplified Boolean string similarity test. I don't mean what humans think is similar, and I don't mean what you think is similar. I mean what a mechanical mind would think was similar / dissimilar (often very different).
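Nobody outside Google knows how (or even whether) such a check is implemented, so treat the following purely as a toy sketch of what a crude, mechanical token-overlap comparison looks like, and why it can disagree badly with a human's sense of "basically the same content":

```python
# Toy illustration only: a crude token-overlap (Jaccard) comparison between
# an old page's copy and a new consolidated page's copy. The example text is
# invented, and this is NOT a claim about how Google actually measures similarity.

import re

def tokens(text: str) -> set:
    """Lowercase the text and split it into a set of word tokens."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def jaccard_similarity(old_page: str, new_page: str) -> float:
    """Share of unique words the two pages have in common (0.0 to 1.0)."""
    old_words, new_words = tokens(old_page), tokens(new_page)
    if not old_words and not new_words:
        return 1.0
    return len(old_words & new_words) / len(old_words | new_words)

old_copy = "Red widgets for sale. Our red widgets ship worldwide from our warehouse."
new_copy = "The complete guide to buying widgets: sizes, colours and shipping options compared."

print(f"Mechanical similarity: {jaccard_similarity(old_copy, new_copy):.0%}")
```

A human would say both pages are about the same widgets; a naive token comparison scores them as mostly dissimilar, which is exactly the gap between human and mechanical judgements I'm describing.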
If Google didn't run such checks, you could easily buy up authoritative expired domains, redirect them to yourself and gain loads of SEO authority for nothing. So Google wants to be sure: is THIS content which is receiving this 301 redirect the SAME content which earned those backlinks? Might the webmaster who linked to that old URL decide not to link to this new page? If there's much risk of that, even the mighty 301 redirect gets nuked in terms of equity transfer.
Your hope, of course, is that your new URL will be so much better than the old one(s) that over time it will earn more links than they did. If you are lucky, some authority from the old page(s) will filter through, but you should certainly expect some degree of short-term tail-off. If you have done this just as you have escaped a penalty, I can see how the convergence of your technical disruption(s) and the recent penalty could be causing you significant issues.
Instead of doing the types of work which remove URLs from your site, remove pages which could be indexed and narrow your content, I'd be doing EXACTLY the opposite: creating new pages and content which are connected with new (yet relevant) keywords. Maybe work on the top or middle of your keyword (buying) funnel a bit. Get some digital (editorial) PR going, get some more authority and new pages which could be ranking in Google's SERPs. If you think about it, performing purely reductive work after you have had a massive traffic reduction really isn't going to serve you very well.
Hope that helps