Site Has Not Recovered (Still!) from Penguin
-
Hello,
I have a site that just keeps getting hit with a ton of bad, unnatural backlinks due to the sins of previous SEO companies they've hired. About every quarter, I still have to add more bad links to their disavow file. Is it time to move them to a new domain? Perhaps a .net? If so, do we just completely trash the old domain and not redirect it? I've never had a client like this in the past, but they still want to maintain their branded name.
Thanks for your feedback!
-
hey guys,
Any updates on the hreflang tests?
I'm in a similar boat - one site got a manual hit in Feb 2014... sitewide penalty, and at one point the brand name was even deindexed.
Got the penalty lifted in 5 months, but traffic has not recovered one bit since then.
-
Please keep me posted on what you decide to do. As awful as this is, it is nice to know we are not alone. We may just be rebranding and starting from scratch, since Google has not provided any indication of when they will release the next update. Also, I came across this a couple of weeks ago, so it could be several months: http://searchengineland.com/google-we-are-working-on-making-the-penguin-updated-continuously-222247.
-
No need; all hreflang sites are linked in the eyes of Google. Also, you still want people going to your original URL, which is where all your brand building and everything else was done.
At some point Penguin will refresh and your original site will regain the ability to rank. At that point you can decide what you want to do.
I had set mine up so that my co.uk was set to "EN", which captured all English enquiries. Then in Oct 2014 my site regained its ability to rank, so I switched it so that my co.uk was set to "en-gb". The co.uk extension is an additional ranking signal in google.co.uk, and we are a truly global company with plans to expand into dedicated sites for certain countries, so it was a welcome find.
However there is no harm in setting up the second site to target your main audience.
Let's say you are a company in the United States with a .com that is waiting for a Penguin refresh. Get a second domain and point it to the same directory on your server. Do some clever coding to manage it so that you only have one set of code. (Also think of a plan for the future to deliver separate content to both sites, maybe two database tables serving up content based on TLD.)
Let's say you get a .net. Then apply the following:
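The tag sample from the original post didn't survive here, but based on the setup described (the penalised .com as x-default, the new .net capturing English searches), the head of each page on both sites would carry something like this (hypothetical URLs for illustration):

```html
<!-- Sketch only: example.com/example.net stand in for the real domains.
     Per Google's hreflang docs, each page lists all alternates,
     including a self-reference. -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/some-page/" />
<link rel="alternate" hreflang="en" href="https://www.example.net/some-page/" />
```

The same pair of tags goes on the matching page of both domains, with the href paths kept identical apart from the TLD.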
Do this for each page, making sure to keep the same URL structure. (John Mueller just stated the importance of maintaining the same URL structure.)
Once your original site regains the ability to rank again after an algo refresh, go into Webmaster Tools and set the region of the new site to your desired specific location, such as the United States.
Then change the hreflang so the new site targets "en-US". This way you are now telling Google to send only English searches from the US to your site, and it will be a localized domain with a higher chance of ranking than the original site.
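As above, the exact tags were lost from the post, but the change described would presumably look something like this (hypothetical URLs):

```html
<!-- Sketch only: the .net alternate narrows from "en" to "en-US"
     once the original site can rank again. -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/some-page/" />
<link rel="alternate" hreflang="en-US" href="https://www.example.net/some-page/" />
```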
OR, at that point, you could remove the hreflang and drop the newsite.net.
This is the only way I was able to keep my original site up, maintain my brand and all the people that go direct to my main brand website, and also have a way of ranking again.
It's complicated to explain why it works, but at some point I will write a big article for Moz on this subject.
Hope that makes things more clear.
-
Yes, the domain does not matter.
You can even test it out with just one single page. I spoke to John Mueller about this a while ago; he said that when using hreflang, you can use it in the same way you use a canonical tag. So maybe you could test it on an internal page that you know should be ranking better than it currently does.
Set up the new domain, create a blank index.php page, and just replicate the internal page and URL structure.
FYI, John actually talked about URL structure for hreflang just 2 days ago.
https://www.youtube.com/watch?feature=player_embedded&v=1sewHcbKTJw#t=2171
In an ideal world, you want to create a clever bit of PHP so that your code is being pulled from one directory; otherwise you will have two versions of your site to maintain, and that would just be a royal pain in the a$$.
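The "clever bit of PHP" isn't shown anywhere in the thread, so here is only a minimal sketch of the idea as I understand it: both domains point at the same document root, and the request path is reused to build the alternate URLs. All domain names and variable names are illustrative, not the poster's actual code.

```php
<?php
// Sketch: one codebase serves both TLDs; hreflang tags are generated
// from the current request path so the URL structure stays identical.
$path = strtok($_SERVER['REQUEST_URI'], '?'); // same path on both domains

$comUrl = 'https://www.example.com' . $path;  // original (x-default) site
$netUrl = 'https://www.example.net' . $path;  // new "en" site
?>
<link rel="alternate" hreflang="x-default" href="<?php echo htmlspecialchars($comUrl); ?>" />
<link rel="alternate" hreflang="en" href="<?php echo htmlspecialchars($netUrl); ?>" />
```

A later step (separate content per TLD, as mentioned above) could branch on `$_SERVER['HTTP_HOST']` to pull from different database tables.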
Before you do anything I suggest you read the page below and watch the video by Maile Ohye on there too.
https://support.google.com/webmasters/answer/189077?hl=en
Feel free to ask me any questions. I have had to do this on a few sites and have been doing it for a long time now.
Set up your current site as the x-default and the new one as "EN" so that all English inquiries are served up by Google to your new site.
I look forward to hearing how it goes. FYI, it's possible you may see your results drop for a day or two, then pop up with the old URL again, and then the new one. It can take a few days for it to recalibrate. It can also happen right away, with random adjustments over the next few weeks.
Also, make sure to use Fetch as Google in WMT on both sites' pages to get Googlebot looking at your pages ASAP. I have seen results in 30 seconds before.
-
@Gary, Would this work between a .com & .net website? I'm willing to test anything at this point. Thanks!
-
@ruth
I urge you to try the hreflang solution sometime as a test.
"but it's not something I would test unless you actually do have different English-speaking audiences."
You can always set the old domain to "x-default" and the new one to "EN" so that all English search results switch to the new site. This is great for sites not willing to wait up to a year for a Penguin refresh, or those affected by other SERP suppression. Both sites can be identical and will not cause duplication issues. hreflang is an amazing tool for testing.
-
Thanks Ruth for your feedback. I just wanted to address your 2 points above:
-
We actually have been adding all new links to our original disavow file, so we are all set there.
-
Yes, we do understand that the loss of links did cause a drop in rankings. Because of this, we've actually started building out natural links through outreach and PR, redesigned the entire site, and updated all content, along with ongoing content creation on site and off.
With those things in mind, is there anything else left? I'm just wondering if we're completely missing something, and I'm getting desperate. Is this domain just dead?
And thanks for the heads up on the hreflang, that was a little over my head.
Thanks for your feedback
-
-
A couple of things to keep in mind:
- When disavowing new inbound links, make sure that you're adding them to your existing disavow file - if you just submit the new sites, that new disavow file will overwrite the previous one and un-disavow links.
- A manual penalty is only part of the traffic/ranking loss you'll see with Penguin. Don't forget that you also lost the link value of a bunch of spammy links that previously were providing value. The penalty may be gone, but so are your links! To regain traffic levels, you'll need to build new quality inbound links, so make sure that's a big part of your strategy going forward.
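For reference on the first point, the disavow file is just a plain text list, one entry per line, and the uploaded file replaces the previous one entirely, which is why every resubmission must contain everything disavowed so far. A sketch with made-up example entries:

```text
# Disavow file - uploading replaces the previous file completely,
# so keep all older entries when adding new ones.

# Prior SEO company links (examples, not real domains)
domain:spammy-directory.example
domain:paid-links.example

# Individual URLs can be disavowed too
http://some-blog.example/unnatural-link-page.html
```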
I haven't tried the hreflang solution, so I can't comment on its effectiveness, but it's not something I would test unless you actually do have different English-speaking audiences.
Good luck!
-
Thank you for the lengthy response! I am not sure how to use hreflang but will look into it more this week. As for your final question, our manual penalty was revoked in Oct. 2013, and the site just hasn't performed worth a damn since. We updated all content and are working on a design refresh now to support responsive design, hoping that would help, as we'd hate to change the domain, but we're just at a loss and getting desperate. This is our only client who came to us for this type of cleanup service that has not yet recovered.
-
I have the ultimate answer for you and you will not find this elsewhere online.
I have been through this process, and it was a huge pain in the a$$.
I spoke with John Mueller at Google for years trying to resolve our issues, until one day we spoke about hreflang, at which point John said to me that it would be OK to use it. So we played with it, and it turned out that we recovered IMMEDIATELY from Penguin on a new domain that was hreflang-linked from our original site.
So basically this allowed us to keep our original brand-name site up while traffic was going to our new co.uk site (in your case, .net).
Let's say, for instance, your client is US based. Take the .com site and set it as x-default, then set the new .net site as "en", or if you want to be more specific, "en-US".
All Google traffic in English, or English traffic from google.com in the US, will now start flowing to your non-penalised site. The hreflang simply does a swap in those search engines, but it happens before the penalty is enforced. So your rankings will return right away, depending on where you would now rank after the penalty is lifted, and your new content will also be back in its rightful ranking positions. Basically, all the suppression is gone.
Don't worry about duplicate content, as hreflang handles all that. It's very common for sites to have just a few small changes, such as $ to £, or spelling changes like color and colour. There is no downside.
If you are clever with the way you code your site, you can make it a seamless transition without having to maintain code on two domains. It only took me a few hours to code something up.
The best thing is it opens the door to targeting multiple regions of the world with other languages. (When you are ready).
Hope that answers your question.
Also, FYI, when did your manual penalty get revoked? It can take up to a year before Google considers the site OK for a refresh to lift the suppression. Based on that, you may not have been ready for the Oct 2014 refresh and may be waiting for the next one, which could be a long way away.