Disavow Issues
-
Hi
We have a client who was hit by Penguin about 18 months ago.
We disavowed all the bad links about 10 months ago; however, this has not resulted in an uplift in traffic or rankings.
The client is asking me whether it would be better to dump the domain and move the website to a fresh domain.
Can you provide thoughts / experience on this please?
Thanks.
-
Just wanted to clarify (for the sake of others who may read this post) that the question was in regards to Penguin and I think in your situation, you're dealing with manual penalties. With Penguin, there is no reconsideration request. You've got to clean up the best you can and then hope that things improve when Google refreshes the Penguin algorithm.
It's still up for debate whether removing links (as opposed to disavowing) is important for Penguin. My current advice is that if a link is easy to remove then do it. But, otherwise I disavow. While you're right that it is important to show Google your efforts in regards to link removal for a manual penalty, no one is going to look at your work for an algorithmic issue.
I asked John Mueller in a hangout once whether disavowing was as good as removing for Penguin and he said, "essentially yes". However, because potential problems can come up with the disavow tool (such as improper formatting, or Google taking a long time to recrawl the links before the disavow takes effect), if you can remove a link, that's not a bad thing to do.
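On the formatting point: the disavow file Google expects is plain UTF-8 text with one entry per line, where a line is either a comment starting with `#`, a full URL, or a `domain:` directive that disavows every link from that domain. A minimal sketch (the domains and notes are placeholder examples, not from this thread):

```text
# Contacted owner on 2014-03-01, no response
domain:spammy-directory.example.com
# Single paid link we could not get removed
http://blog.example.org/cheap-links-post/
```

Mixing up these line types (for example, listing a bare domain without the `domain:` prefix) is exactly the kind of formatting mistake that can make the file fail silently.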
-
Hi Paul,
I realise it's been a couple of weeks since this was submitted, but I wanted to follow up. At my former agency, we went through a few reconsideration procedures for new clients. We managed to be successful with all of them, but some took quite a long time (August - February being the longest).
We have found that disavowing alone is not nearly enough to make a difference - it is far preferable for the links to be removed. Unlike Claudio below, we have had a far higher success rate than 5%, but it all depends on where the links come from. Sometimes it's hard to even find a live email address to contact webmasters, and some people want payment to remove links (worth doing if the payment is not too high). We crafted templates and _always_ followed up with another specifically crafted email template within two weeks if we did not get a response to the first link removal request.
It's true that if you cannot remove links, it is still worthwhile demonstrating to Google that you attempted to do so, with email screenshots or at least a list of the sites you contacted. They want to see effort. They want to see that you removed, or attempted to remove, the vast majority of the bad links. It's time consuming and tedious, but it's worth it if you get the penalty removed.
As I said, the longest process we went through was over six months, but the site in question had a TERRIBLE backlink profile that was the result of years of abuse by bad link builders. We're talking removing thousands of links. However, it came through - the penalty was removed and the client's rankings are on the rise.
I hope this helps. The short version is: remove remove remove. You won't maintain a penalty if there are no more bad links holding the site back, and those links aren't helping it rank anyway.
If you'd like some advice on how to decide which links to remove and which to keep, please let me know. In the meantime, check out this post from my former colleague Brandon at Ayima. It's a good resource for link analysis.
Cheers,
Jane
-
Does the site have a good base of truly natural links? There have been very few reported cases of Penguin recovery. But, the ones that I have seen recover are ones that have had some excellent links left once the bad ones were cleaned up.
-
Did you have a manual penalty? Did you get it revoked? Or did you assume you had a Penguin issue and were proactive about it to avoid a manual penalty?
-
Recovery from a link penalty (manual or algorithmic) procedure:
1. Collect inbound links from Google Webmaster Tools + Moz Link Explorer + Majestic.
2. Compile all domains in an Excel worksheet.
3. Contact site owners asking for link removal (usually about a 5% success rate, but the effort counts with Google).
4. Wait several weeks for the removal of the links.
5. Compile a disavow file and upload it to Google: https://www.google.com/webmasters/tools/disavow-links-main?pli=1
6. Wait three to six weeks, then start a link building campaign: begin with a few links per week and increase it if you can (only natural links coming from authority sites related to your niche).
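Steps 1, 2, and 5 above can be sketched in a few lines of Python: merge the CSV exports from the different tools, deduplicate by domain, and emit disavow-file entries. The `URL` column name is an assumption, since each tool labels its export differently:

```python
import csv
from urllib.parse import urlparse

def collect_domains(csv_paths, url_column="URL"):
    """Merge link exports from several tools into one deduplicated, sorted domain list."""
    domains = set()
    for path in csv_paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                url = row.get(url_column, "").strip()
                if url:
                    # Normalize to the bare hostname so www/casing variants collapse.
                    domains.add(urlparse(url).netloc.lower())
    domains.discard("")
    return sorted(domains)

def disavow_lines(domains):
    """Format domains as domain-level disavow entries (step 5)."""
    return ["domain:" + d for d in domains]
```

From the merged list you would still review each domain by hand before disavowing; this only handles the mechanical merge.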
Recovery from content problems:
1. Look for repetitive titles and descriptions using Google Webmaster Tools and Moz.
2. Look for pages with similar or identical content and fix it.
3. Look for pages with fewer than 200 words of content and add content or simply remove them (404).
4. Add fresh, original content.
Google will take your effort into account and your rankings should improve step by step.
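Checks 1 and 3 above are easy to run against a crawl export: group pages by title to find repeats, and flag anything under the 200-word threshold. The input structure (dicts with `url`, `title`, `word_count`) is an assumption about how your crawler exports data:

```python
from collections import defaultdict

def audit_pages(pages, min_words=200):
    """pages: iterable of dicts with 'url', 'title', 'word_count'.
    Returns (duplicate-title groups, thin-page URLs)."""
    by_title = defaultdict(list)
    for p in pages:
        # Normalize titles so trivial casing/whitespace variants group together.
        by_title[p["title"].strip().lower()].append(p["url"])
    dupes = {t: urls for t, urls in by_title.items() if len(urls) > 1}
    thin = [p["url"] for p in pages if p["word_count"] < min_words]
    return dupes, thin
```

Each flagged URL still needs a human decision: rewrite, consolidate, or remove.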
I hope it helps
Claudio