Site Has Not Recovered (Still!) from Penguin
-
Hello,
I have a site that just keeps getting hit with a ton of bad, unnatural backlinks due to the sins of the previous SEO companies they've hired. About every quarter, I still have to add more bad links to their disavow file. Is it time to move them to a new domain? Perhaps a .net? If so, do we just completely trash the old domain & not redirect it? I've never had a client like this in the past, but they still want to maintain their branded name.
Thanks for your feedback!
-
Hey guys,
Any updates on the hreflang tests?
I'm in a similar boat - one site got a manual hit in Feb 2014: a sitewide penalty, and at one point the brand name was even deindexed.
We got the penalty lifted in 5 months, but traffic has not recovered one bit since then.
-
Please keep me posted on what you decide to do. As awful as this is, it is nice to know we are not alone. We may just rebrand and start from scratch, since Google has not provided any indication of when they will release the next update. Also, I came across this a couple of weeks ago, so it could be several months: http://searchengineland.com/google-we-are-working-on-making-the-penguin-updated-continuously-222247.
-
No need - all hreflang sites are linked in the eyes of Google. Also, you still want people going to your original URL, which is where all your brand building and everything else was done.
At some point Penguin will refresh and your original site will regain the ability to rank. At that point you can decide what you want to do.
I had set mine up so that my .co.uk was set to "en"; this captured all English enquiries. Then in Oct 2014 my site regained its ability to rank, so I switched it so that my .co.uk was set to "en-gb". The .co.uk extension is an additional ranking signal in google.co.uk, and we are a truly global company with plans to expand into dedicated sites for certain countries, so it was a welcome find.
However there is no harm in setting up the second site to target your main audience.
Let's say you are a company in the United States with a .com that is waiting for a Penguin refresh. Get a second domain and point it to the same directory on your server. Do some clever coding to manage it so that you only have one set of code (also think of a plan for the future to deliver separate content to both sites, maybe two database tables serving up content based on TLD).
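As a rough illustration, here is a minimal sketch of what that clever coding could look like on a PHP site - the domains, table names, and variable names are all hypothetical, so adapt it to your own stack:

```php
<?php
// Hypothetical shared front controller - both example.com and example.net
// point at this same directory, so there is only one codebase to maintain.

$host = $_SERVER['HTTP_HOST'];                  // e.g. "www.example.net"
$tld  = substr($host, strrpos($host, '.') + 1); // "com" or "net"

// For now both TLDs can serve identical content; later you could deliver
// separate content per domain, e.g. from two database tables:
$contentTable = ($tld === 'net') ? 'content_net' : 'content_com';

// ... fetch the requested page from $contentTable and render it ...
```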
Let's say you get a .net. Then apply the following hreflang annotations.
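For example (example.com is the original site and example.net the new one - both placeholders), every page on both domains would carry a pair of annotations like this, each page referencing its own URL path:

```html
<link rel="alternate" hreflang="x-default" href="http://www.example.com/some-page/" />
<link rel="alternate" hreflang="en" href="http://www.example.net/some-page/" />
```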
Do this for each page, making sure to keep the same URL structure. (John Mueller just stated the importance of maintaining the same URL structure.)
Once your original site regains the ability to rank again after an algo refresh, go into Webmaster Tools and set the region of the new site to your desired location, such as the United States.
Then change the hreflang as shown below. This way you are now telling Google to send only English searches from the US to the new site, and it will be a localized domain with a higher chance of ranking than the original site.
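Using the same placeholder domains, the updated annotations would look something like this:

```html
<link rel="alternate" hreflang="x-default" href="http://www.example.com/some-page/" />
<link rel="alternate" hreflang="en-us" href="http://www.example.net/some-page/" />
```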
Or you could at that point remove the hreflang and drop the new .net site altogether.
This is the only way I was able to KEEP my original site up, maintain my brand and all the people who go direct to my main brand website, and also have a way of ranking again.
It's complicated to explain WHY it works, but at some point I will write a big article for Moz on this subject.
Hope that makes things more clear.
-
Yes, the domain does not matter.
You can even test it out with just one single page. I spoke to John Mueller about this a while ago; he said that when using hreflang you can use it in the same way you use a canonical tag. So maybe you could test it on an internal page that you know should be ranking better than it currently does.
Set up the new domain, create a blank index.php page, and just replicate the internal page and URL structure.
FYI, John actually talked about URL structure for hreflang just 2 days ago.
https://www.youtube.com/watch?feature=player_embedded&v=1sewHcbKTJw#t=2171
In an ideal world you want to create a clever bit of PHP so that your code is being pulled from one directory; otherwise you will have two versions of your site to maintain, and that would just be a royal pain in the a$$.
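For what it's worth, one way to do that is a thin shim on the second domain - this is just a sketch, and the include path is made up, so it will depend on your server layout:

```php
<?php
// index.php in the second domain's docroot - nothing lives here except
// this shim, so both domains run the exact same code from one directory.
define('SERVING_HOST', $_SERVER['HTTP_HOST']); // lets the shared code know which domain is serving
require __DIR__ . '/../mainsite/index.php';    // hypothetical path to the shared codebase
```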
Before you do anything I suggest you read the page below and watch the video by Maile Ohye on there too.
https://support.google.com/webmasters/answer/189077?hl=en
Feel free to ask me any questions - I have had to do this on a few sites and have been doing it for a long time now.
Set up your current site as the x-default and the new one as "en" so that all English inquiries are served up by Google to your new site.
I look forward to hearing how it goes. FYI, it's possible you may see your results drop for a day or two, then pop up with the old URL again, and then the new one - it can take a few days for everything to recalibrate. It can also happen right away, with random adjustments over the next few weeks.
Also, make sure to use Fetch as Google in WMT on both sites' pages to get Googlebot looking at your pages ASAP. I have seen results in 30 seconds before.
-
@Gary, would this work between a .com & a .net website? I'm willing to test anything at this point. Thanks!
-
@ruth
I urge you to try the hreflang solution sometime as a test.
"but it's not something I would test unless you actually do have different English-speaking audiences."
You can always set the old domain to "x-default" and the new one to "en" so that all English search results switch to the new site. This is great for sites not willing to wait up to a year for a Penguin refresh, or for sites affected by other SERP suppression. Both sites can be identical and will not cause duplication issues. hreflang is an amazing tool for testing.
-
Thanks Ruth for your feedback. I just wanted to address your 2 points above:
-
We actually have been adding all new links to our original disavow file, so we are all set there.
-
Yes, we do understand that the loss of links did cause a drop in rankings. Because of this, we've actually started building out natural links through outreach & PR, redesigned the entire site, and updated all content, along with ongoing content creation on site and off.
With those things in mind, is there anything else left? I'm just wondering if we're completely missing something and getting desperate - is this domain just dead?
And thanks for the heads up on the hreflang, that was a little over my head.
Thanks for your feedback
-
A couple of things to keep in mind:
- When disavowing new inbound links, make sure that you're adding them to your existing disavow file - if you just submit the new sites, that new disavow file will overwrite the previous one and un-disavow the earlier links (see the example file format after this list).
- A manual penalty is only part of the traffic/ranking loss you'll see with Penguin. Don't forget that you also lost the link value of a bunch of spammy links that previously were providing value. The penalty may be gone, but so are your links! To regain traffic levels, you'll need to build new quality inbound links, so make sure that's a big part of your strategy going forward.
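For reference, the disavow file is just plain text with one entry per line and # for comments, so the cumulative file you re-submit should keep every old entry and append the new ones - the domains below are placeholders:

```text
# Original Penguin cleanup - keep these in every future upload
domain:spammy-directory.example
domain:paid-links-network.example
http://some-spam-site.example/page-linking-to-us.html

# New bad links found this quarter - appended, not submitted as a fresh file
domain:fresh-spam-source.example
```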
I haven't tried the hreflang solution, so I can't comment on its effectiveness, but it's not something I would test unless you actually do have different English-speaking audiences.
Good luck!
-
Thank you for the lengthy response! I am not sure how to use hreflang but will look into it more this week. As for your final question, our manual penalty was revoked in Oct 2013 and the site just hasn't performed worth a damn since. We updated all content and are working on a design refresh now to support responsive design - hoping that would help, as we'd hate to change the domain, but we're just at a loss and getting desperate. This is our only client who came to us for this type of cleanup service that has not yet recovered.
-
I have the ultimate answer for you and you will not find this elsewhere online.
I have been through this process and it was a huge pain in the a$$.
I spoke with John Mueller at Google for years trying to resolve our issues, until one day we spoke about hreflang, at which point John said to me that it would be OK to use it. So we played with it, and it turned out that we recovered IMMEDIATELY from Penguin on a new domain that was hreflang-linked from our original site.
So basically this allowed us to keep our original brand-name site up while traffic was going to our new .co.uk site (in your case, a .net).
Let's say for instance your client is US based. Take the .com site and set it as x-default, then set the new .net site as "en" - or, if you want to be more specific, "en-US".
All Google traffic in English, or English traffic from google.com in the US, will now start flowing to your non-penalised site. The hreflang simply does a swap in those search engines, but it happens before the penalty is enforced. So your rankings will return right away, depending on where you would now rank after the penalty is lifted, and your new content will also be back in its rightful ranking positions. Basically, all the suppression is gone.
Don't worry about duplicate content, as hreflang handles all that. It's very common for sites to have just a few small changes, such as $ to £, or a few spelling differences like color and colour. There is no downside.
If you are clever with the way you code your site, you can make it a seamless transition without having to maintain code on two domains. It only took me a few hours to code something up.
The best thing is that it opens the door to targeting multiple regions of the world with other languages (when you are ready).
Hope that answers your question.
Also, FYI: when did your manual penalty get revoked? It can take up to a year after revocation before a refresh will consider your site OK to lift the suppression. Based on that, you may not have been ready for the Oct 2014 refresh and may be waiting for the next one, which could be a long way away.