Does Google throttle back the search performance of a penalised website/page after the penalty has been removed?
-
Hi Mozzers.
Back in 2013, my website www.octopus-hr.co.uk was hit by a Penguin 2.0 penalty owing to a harmful backlink profile built by a dodgy SEO consultant (now fired). The penalty seemed to apply to the homepage of the site, but other pages were unaffected.
We removed what links we could, disavowed the rest, and were informed in September 2013 that the penalty had been removed and our re-inclusion request had been successful. However, our website homepage still ranks poorly for the search terms we're targeting in the UK: "HR Software" and "HR Systems".
On-page factors are, in my opinion, pretty well optimised for these search terms. For post-penalty link building we've focused on high-authority, relevant sites. I believe that, compared to most of our search competitors, the backlink profile of our homepage is in pretty good shape; however, it still ranks badly.
Has anyone had any experience of a penalty hangover from Google in the past? Are there other things I should consider?
Thanks
David
-
Remember that a lot of the links you had are now gone, removed or disavowed, so you can't expect to rank the same.
Also, how long has it been since the penalty was lifted? It may take a while for everything to even out again.
-
If reconsideration worked and you got a reply from Google, it's likely that you were facing a manual penalty (either instead of or in addition to Penguin). So, it may be that Penguin or some other algorithmic penalty is still in play (echoing what Andy said).
Once a penalty expires or is lifted, I'm unaware of any kind of dampening on the site (like, 50% penalty for 3 months and then 25%, etc.). This is much more likely to be a situation where you have multiple layers of problems (some could be technical, etc., and not penalties) and you've removed just the top layer.
-
If you drop me a quick mail, David (address above), I can give you a little more detail. It wouldn't be something I could do here.
-Andy
-
Thanks, Andy, for your kind offer. If you're happy to have a quick look at our link profile, your feedback would be very much appreciated.
-
Hi David,
I can run a quick scan for you and tell you what sort of shape your link profile is in, but what 'could' have happened is that, since the penalty, another algorithm has come along and hit you for something else. It's a little awkward to guess exactly, but I have seen this happen on a number of occasions.
Edit: OK, your link profile isn't too healthy, I'm afraid, David. One link is even listed as malicious. If you want to drop me a mail (info@inetseo.co.uk), I'll give you a little more info.
-Andy
Related Questions
-
Shopify Website Page Indexing issue
Hi, I am working on an eCommerce website on Shopify.
Intermediate & Advanced SEO | Bhisshaun
When I tried indexing my newly created service pages, the pages were not getting indexed on Google.
I also tried manually indexing each page and submitted a sitemap, but the issue still doesn't seem to be resolved. Thanks
How to index your website pages on Google in 2020?
Hey! Hopefully everyone is fine here. I'll share some steps for getting all of your website pages indexed on Google in 2020. I'm already implementing these same steps for my site, Boxes Maker. Below are the most important ways to help Google find your pages: add a sitemap; make sure people know about your site; ensure full navigation on your site; submit an indexing request for your homepage; and note that sites using URL parameters rather than plain URLs or page names may be more difficult to crawl.
Intermediate & Advanced SEO | fbowable
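The "add a sitemap" step above can be sketched in code. Here is a minimal Python sketch that builds a sitemap.xml string with only the standard library; the example.com URLs are placeholders, and a real sitemap would still need to be uploaded and submitted in Search Console.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = page  # each <url> needs at least a <loc> entry
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages standing in for a real site's URL list.
sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/services/",
])
print(sitemap)
```

The sitemaps.org protocol also allows optional `<lastmod>` and `<changefreq>` children per URL, which are omitted here for brevity.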
Crawled page count in Search console
Hi guys, I'm working on a project (premium-hookahs.nl) where I've stumbled upon a situation I can't address. Attached is a screenshot of the crawled pages in Search Console. History: due to technical difficulties, this webshop didn't always noindex filter pages, resulting in thousands of duplicated pages. In reality this webshop has fewer than 1,000 individual pages. At this point we took the following steps to resolve this: noindex the filter pages; exclude those filter pages in Search Console and robots.txt; and canonical the filter pages to the relevant category pages. This, however, didn't result in Google crawling fewer pages. Although the implementation wasn't always sound (technical problems during updates), I'm sure this setup has been the same for the last two weeks. Personally I expected a drop in crawled pages, but they are still sky high; I can't imagine Google visits this site 40 times a day. To complicate the situation: we're running an experiment to gain positions on around 250 long-tail searches. A few filters will be indexed (size, colour, number of hoses and flavours) and three of them can be combined, which results in around 250 extra pages. Meta titles, descriptions, H1s and texts are unique as well. Questions: Should excluding pages in robots.txt result in Google not crawling them? Is this number of crawled pages normal for a website with around 1,000 unique pages? What am I missing?
Intermediate & Advanced SEO | Bob_van_Biezen
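On the robots.txt question above: a Disallow rule does tell compliant crawlers not to fetch the matching URLs, though URLs that are already indexed can linger in the index, and Google may keep crawling the rest of the site at the same rate. The rule itself can be sanity-checked locally with Python's `urllib.robotparser`; the `/filter/` path here is a hypothetical stand-in for the shop's filter pages.

```python
import urllib.robotparser

# Hypothetical robots.txt rules mirroring the setup described above:
# block crawlers from the filter pages while leaving normal pages open.
rules = """
User-agent: *
Disallow: /filter/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# A disallowed filter page should not be fetched by a compliant crawler...
print(parser.can_fetch("Googlebot", "https://www.example.com/filter/size-blue"))  # False
# ...while a normal category page remains crawlable.
print(parser.can_fetch("Googlebot", "https://www.example.com/hookahs/"))  # True
```

In practice, Search Console's crawl stats lag behind rule changes, so a correct robots.txt may take a while to show up as a drop in crawled pages.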
What can you do when Google can't decide which of two pages is the better search result
For one of our primary keywords, Google keeps swapping (about every other week) between returning our homepage, which is more transactional, and a deeper, more information-based page. If you look at the analysis in Moz, you get an almost double-helix-like graph of those pages repeatedly swapping places. So there seems to be some cannibalisation happening that I don't know how to correct. I think part of the problem is that the deeper page would ideally target "longer-tail" searches that contain, as part of the longer phrase, the one-word keyword that is bouncing. What can be done to prevent this from happening? Can internal links help? I tried adding a link on that term from the deeper page to our homepage, and in a knee-jerk reaction was asked to pull that link before there was really any evidence that the one new link had a positive or negative effect. There are some crazy theories floating around at the moment, but I'm curious what others think: both whether adding a link from an informational page to a transactional page could in fact have a negative effect, and what else could be done or tried to help clarify the difference between the two pages for the search engines.
Intermediate & Advanced SEO | plumvoice
Google Is Indexing The Wrong Page For My Keyword
For a long time (almost 3 months), Google has been indexing the wrong page for my main keyword.
Intermediate & Advanced SEO | Tiedemann_Anselm
The problem is that each time, Google indexes a different page for a period of 4-7 days: sometimes I see the home page, sometimes a category page and sometimes a product page.
It seems as though Google has not yet decided which is its favourite/better page for this keyword. These are the pages Google indexes (in most cases you can find the site on the second or third page): Main page: http://bit.ly/19fOqDh Category page: http://bit.ly/1ebpiRn Another category: http://bit.ly/K3MZl4 Product page: http://bit.ly/1c73B1s All the links I get to the website are natural links, so in most cases the anchor we get is the website name. In addition, I have many links from bloggers who asked to review one of my products. I'm very careful about that, so I always check the blogger and their website, and only allow it if it is something good. Also, I never ask for a link back (most of the time I receive one without asking), and as I said, most of their links are anchored with my website name. Here are some examples of links that I received from bloggers: http://bit.ly/1hF0pQb http://bit.ly/1a8ogT1 http://bit.ly/1bqqRr8 http://bit.ly/1c5QeC7 http://bit.ly/1gXgzXJ Can I please get a recommendation on what I should do?
Should I try to change the anchor text of the links?
Should I stop allowing bloggers to review my products? I'd love to hear what you recommend.
Thanks for the help
Street Address Not Appearing on Business Google+ Page
I run a local business in New York City, a commercial real estate brokerage. My firm has both a website and Google+ accounts: one Google+ account for me personally and a Google+ account for my business. Under address, my Google+ account is showing New York, NY. It is not showing a street address. Similarly, when my business name is entered in the Google search bar, my website is the first result, but under address (directly to the right of a black dot with a grey circle around it) "New York, NY" appears, with the phone number beneath it. No sign of my street address. My business is registered under Google Places and we have entered the correct street address. Any ideas on how I can get Google to display our street address? This is obviously very, very detrimental for local SEO. Thanks,
Intermediate & Advanced SEO | Kingalan1
Alan
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on internal search pages, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag, for several reasons: Google's guidelines say, "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines" (http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769); the pages are a bad user experience; the search pages are (probably) stealing rankings from our real landing pages; and we received the Webmaster notification "Googlebot found an extremely high number of URLs on your site", with links to our internal search results. I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answers! Edit: Google has currently indexed several million of our internal search pages.
Intermediate & Advanced SEO | HrThomsen
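One detail worth noting with the meta-tag route: Google only sees a noindex directive if it is allowed to crawl the page, so blocking the search URLs in robots.txt at the same time would hide the tag from Googlebot. As an illustration (standard library only, hypothetical markup), here is a small Python sketch that extracts the robots meta directive a crawler would read from a fetched page:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

# Hypothetical internal search result page carrying the noindex,follow tag.
page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
parser = RobotsMetaParser()
parser.feed(page)
print(parser.directives)  # ['noindex,follow']
```

The same logic explains the "go dark" question: serving noindex,follow keeps the pages crawlable (so link equity can still flow through them) while asking Google to drop them from the index over time.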
Sitelinks: Does Google Recognize Your Requests for Removal?
I've been trying to influence branded SERPs recently by demoting certain pages from appearing in the Sitelinks feature provided in Google's Webmaster Tools. However, despite demoting various URLs, they continue to appear for the branded SERPs nearly a week after they should've been suppressed. What is your experience with Sitelinks? Do links you request to demote ever disappear or change positions in the SERPs for you?
Intermediate & Advanced SEO | eMagineSEO