Site Has Not Recovered (Still!) from Penguin
-
Hello,
I have a site that just keeps getting hit with a ton of bad, unnatural backlinks due to the sins of previous SEO companies they've hired. About every quarter, I still have to add more bad links to their disavow file. Is it time to move them to a new domain? Perhaps a .net? If so, do we just completely trash the old domain & not redirect it? I've never had a client like this before, but they still want to maintain their branded name.
Thanks for your feedback!
-
hey guys,
Any updates on the hreflang tests?
I'm in a similar boat: one site got a manual hit in Feb 2014, a sitewide penalty, and at one point our brand name was even deindexed.
Got the penalty lifted in 5 months, but traffic has not recovered one bit since then.
-
Please keep me posted on what you decide to do. As awful as this is, it is nice to know we are not alone. We may just rebrand and start from scratch, since Google has not given any indication of when they will release the next update. Also, I came across this a couple of weeks ago, so it could be several months: http://searchengineland.com/google-we-are-working-on-making-the-penguin-updated-continuously-222247
-
No need; all hreflang sites are linked in the eyes of Google. Also, you still want people going to your original URL, which is where all your brand building and everything else was done.
At some point Penguin will refresh and your original site will regain the ability to rank. At that point you can decide what you want to do.
I had set mine up so that my co.uk was set to "en"; this captured all English enquiries. Then in Oct 2014 my site regained its ability to rank, so I switched it so that my co.uk was set to "en-gb". The co.uk extension is an additional ranking signal in google.co.uk, and we are a truly global company with plans to expand into dedicated sites for certain countries, so it was a welcome find.
However there is no harm in setting up the second site to target your main audience.
Let's say you are a company in the United States with a .com that is waiting for a Penguin refresh. Get a second domain and point it to the same directory on your server. Do some clever coding to manage it so that you only have one set of code (and think of a plan for the future to deliver separate content to both sites, maybe two database tables serving up content based on TLD).
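One hedged way to "point it to the same directory": in Apache, a ServerAlias lets both hostnames share a single document root (nginx has the equivalent with a server_name list). The domain names and path here are placeholders, not from the thread:

```apacheconf
# Both domains serve the same codebase from one shared directory.
<VirtualHost *:80>
    ServerName  example.com
    ServerAlias www.example.com example.net www.example.net
    DocumentRoot /var/www/shared-site
</VirtualHost>
```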
Let's say you get a .net. Then apply the following hreflang annotations:
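As a sketch of what those annotations might look like (assuming example.com is the penalised original and example.net the new domain, both hypothetical names), every page on both hosts carries the same pair of tags in its head:

```html
<!-- Emitted on https://example.com/widgets/ AND https://example.net/widgets/ (same path on both hosts) -->
<link rel="alternate" hreflang="x-default" href="https://example.com/widgets/" />
<link rel="alternate" hreflang="en" href="https://example.net/widgets/" />
```

The annotations must be reciprocal (both domains emit the identical pair) and the URLs must be absolute.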
Do this for each page, making sure to keep the same URL structure. (John Mueller just stated the importance of maintaining the same URL structure.)
Once your original site regains the ability to rank again from an algo refresh go into webmaster tools and set the region of the new site to your desired specific location such as United States.
Then change the new site's hreflang to "en-us". This way you are telling Google to send only English searches from the US to the new site, and it will be a localized domain with a higher chance of ranking than the original site.
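Using hypothetical domain names, the switched annotations might look like this (hreflang values are case-insensitive, so "en-us" and "en-US" are equivalent):

```html
<!-- After the refresh: narrow the new site from "en" to "en-us" -->
<link rel="alternate" hreflang="x-default" href="https://example.com/widgets/" />
<link rel="alternate" hreflang="en-us" href="https://example.net/widgets/" />
```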
Or you could at that point remove the hreflang and drop the new .net site.
This is the only way I was able to KEEP my original site up, maintain my brand and all the people who go direct to my main brand website, and still have a way of ranking again.
It's complicated to explain WHY it works, but at some point I will write a big article for Moz on this subject.
Hope that makes things more clear.
-
Yes the domain does not matter.
You can even test it out with just one single page. I spoke to John Mueller about this a while ago, he said when using hreflang you can use it in the same way you use a canonical tag. So maybe you could test it on an internal page that you know should be ranking better than it currently does.
Set up the new domain, create a blank index.php page, and just replicate the internal page and URL structure.
FYI, John actually talked about URL structure for hreflang just 2 days ago.
https://www.youtube.com/watch?feature=player_embedded&v=1sewHcbKTJw#t=2171
In an ideal world you want to create a clever bit of PHP so that your code is being pulled from one directory; otherwise you will have two versions of your site to maintain, and that would just be a royal pain in the a$$.
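A minimal sketch of that "clever bit of PHP", with hypothetical domain names: since both hosts point at the same document root, one front controller can emit the right reciprocal hreflang pair no matter which domain served the request.

```php
<?php
// Hypothetical sketch: both example.com and example.net point at this same
// document root, so there is only one codebase to maintain.
$host = $_SERVER['HTTP_HOST'] ?? 'example.com';

// Keep the identical path on both hosts (the URL-structure advice above),
// dropping any query string.
$path = strtok($_SERVER['REQUEST_URI'] ?? '/', '?');

// Every page emits the same reciprocal pair of annotations.
echo '<link rel="alternate" hreflang="x-default" href="https://example.com' . $path . '" />' . "\n";
echo '<link rel="alternate" hreflang="en" href="https://example.net' . $path . '" />' . "\n";
```

The same block could later branch on `$host` to serve localized content (the two-database-tables idea mentioned above).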
Before you do anything I suggest you read the page below and watch the video by Maile Ohye on there too.
https://support.google.com/webmasters/answer/189077?hl=en
Feel free to ask me any questions; I have had to do this on a few sites and have been doing it for a long time now.
Set up your current site as the x-default and the new one as "en" so that all English enquiries are served up by Google to your new site.
I look forward to hearing how it goes. FYI, it's possible you may see your results drop for a day or two, then pop up with the old URL again, and then the new one; it can take a few days for it to recalibrate. It can also happen right away, with random adjustments over the next few weeks.
Also make sure to use Fetch as Google in WMT on both sites' pages to get Googlebot looking at your pages ASAP. I have seen results in 30 seconds before.
-
@Gary, Would this work between a .com & .net website? I'm willing to test anything at this point. Thanks!
-
@ruth
I urge you to try the hreflang solution sometime as a test.
"but it's not something I would test unless you actually do have different English-speaking audiences."
You can always set the old domain to "x-default" and the new one to "en" so that all English search results switch to the new site; this is great for sites not willing to wait up to a year for a Penguin refresh, or affected by other SERP suppression. Both sites can be identical and will not cause duplication issues. hreflang is an amazing tool for testing.
-
Thanks, Ruth, for your feedback. I just wanted to address your 2 points above:
- We actually have been adding all new links to our original disavow file, so we are all set there.
- Yes, we do understand that the loss of links did cause a drop in rankings. Because of this, we've actually started building natural links through outreach & PR, redesigned the entire site, and updated all content, along with ongoing content creation on site and off.
With those things in mind, is there anything else left? I'm just wondering if we're completely missing something, and I'm getting desperate: is this domain just dead?
And thanks for the heads up on the hreflang, that was a little over my head.
Thanks for your feedback
-
A couple of things to keep in mind:
- When disavowing new inbound links, make sure that you're adding them to your existing disavow file - if you just submit the new sites, that new disavow file will overwrite the previous one and un-disavow links.
- A manual penalty is only part of the traffic/ranking loss you'll see with Penguin. Don't forget that you also lost the link value of a bunch of spammy links that previously were providing value. The penalty may be gone, but so are your links! To regain traffic levels, you'll need to build new quality inbound links, so make sure that's a big part of your strategy going forward.
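On the first point: the disavow file Google accepts is a plain UTF-8 text file, and each upload replaces the previous one wholesale, so the file has to stay cumulative. A hypothetical example (domains are placeholders):

```text
# Disavow file for example.com (hypothetical)
# Each upload REPLACES the previous file, so keep every old entry.
# Q1 batch
domain:spammy-directory-1.example
http://old-seo-network.example/links/page1.html
# Q2 batch (appended to the same file, never submitted alone)
domain:spammy-directory-2.example
```

`domain:` lines disavow every link from that host; bare URLs disavow a single page; `#` lines are comments.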
I haven't tried the hreflang solution, so I can't comment on its effectiveness, but it's not something I would test unless you actually do have different English-speaking audiences.
Good luck!
-
Thank you for the lengthy response! I am not sure how to use hreflang but will look into it more this week. As for your final question, our manual penalty was revoked in Oct 2013 & it just hasn't performed worth a damn since. We updated all content and are working on a design refresh now to support responsive design, hoping that would help, as we'd hate to change the domain, but we're just at a loss & getting desperate. This is our only client who came to us for this type of cleanup service and has not yet recovered.
-
I have the ultimate answer for you and you will not find this elsewhere online.
I have been through this process and it was a huge pain in the a$$.
I spoke with John Mueller at Google for years trying to resolve our issues, until one day we spoke about hreflang, at which point John said to me that it would be OK to use it. So we played with it, and it turned out that we recovered IMMEDIATELY from Penguin on a new domain that was hreflang-linked from our original site.
So basically this allowed us to keep our original brand-name site up while traffic was going to our new co.uk site (in your case, .net).
Let's say, for instance, your client is US-based. Take the .com site and set it as x-default, then set the new .net site as "en", or if you want to be more specific, "en-us".
All Google traffic in English (or, with "en-us", English traffic from google.com in the US) will now start flowing into your non-penalised site. The hreflang simply does a swap in those search engines, but it happens before the penalty is enforced. So your rankings will return right away, depending on where you would now rank once the penalty is lifted, and your new content will also be back in its rightful ranking positions. Basically, all the suppression is gone.
Don't worry about duplicate content, as hreflang handles all that. It's very common for sites to have just a few small changes, such as $ to £ or spelling changes like color and colour. There is no downside.
If you are clever with the way you code your site, you can make it a seamless transition without having to maintain code on two domains. It only took me a few hours to code something up.
The best thing is it opens the door to targeting multiple regions of the world with other languages. (When you are ready).
Hope that answers your question.
Also, FYI, when did your manual penalty get revoked? It can take up to a year after revocation before a refresh will consider it OK to lift the suppression. Based on that, you may not have been ready for the Oct 2014 refresh and may be waiting for the next one, which could be a long way away.