What is the best way to remove and fight back against backlink spam?
-
Removing low quality and spam backlinks. What is the most effective clean-up process?
-
Hey Matti
Glad it helped buddy. Check out those tools, they won't do the job for you but they will certainly help out with some of the manual labour aspects.
Marcus
-
Thank you for the response, Marcus. So far it's not really that bad. I discovered that there were some pretty bizarre links that facilitated the atrophy of our rankings, but it doesn't quite reach 90% spam links. Starting on a new domain is a far bigger undertaking than I think is warranted, so I'll probably look at the tools you suggested and see what's out there.
Regards,
Matti
-
Hey Matti
In a nutshell, if it is really bad, then start again on a new domain.
What I am seeing with a few people I am helping is that where a site has had historical results but is now penalised, attempting a clean-up when the backlink profile is pretty rotten (90%+ placed links) is a pretty tough gig.
There are some tools out there that are proving useful and the pick of the bunch would be:
- rmoov
- Link Cleanup and Contact
- Remove’em
These all have pros and cons so you will likely want to use all of them.
Additionally, you will want to make sure the site is worth saving, and likely invest some time and effort in generating honest links through some solid content marketing. Maybe build some kind of free report or something specific to the site that you can use for outreach-based link building. Do some blogging, and invest some time and effort in the quality of the site.
Additionally, if you have a penalty, be prepared to put in a few reconsideration requests, and if you intend to disavow, be thorough.
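If you do go the disavow route, the file Google expects is just a plain text list you upload through the Search Console disavow tool: one URL or domain per line, a `domain:` prefix to disavow an entire site, and `#` for comments. A minimal sketch (all domains here are made-up examples):

```text
# Disavow file - uploaded via the Google Search Console disavow tool
# Sites that ignored or refused removal requests (example domains)
domain:spammy-directory.example.com
domain:paid-links.example.net

# A single page rather than a whole site (example URL)
http://some-blog.example.org/spun-article-123
```

Being thorough here means listing every rotten domain, not just the worst offenders - a partial disavow tends to leave the penalty in place.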
With some experience here, you also have to ask yourself - what are you trying to save? If the answer to that question is that you are trying to save some spam links that still seem to be working at the moment, then, seriously, start again.
Without a link and some research it is hard to make a call, but know this: it is a tough job to remove bad links. Unless you have a link profile where there is something worth saving, a new domain is likely the fastest way to sort out this mess and make sure you don't get hit again when Google tightens up the link penalties down the road (you know it's going to happen).
There really is no generic answer here and every situation is different but be sure to know what you are getting yourself into before you undertake this and do an honest review of the site, the content and the links to make sure this is a battle you can win.
This is a good read:
http://cyrusshepard.com/penalty-lifted/
Hope that helps buddy
Marcus
Related Questions
-
If I get spammy backlinks removed is it still necessary to disavow?
There are some conflicting beliefs here and I want to know what you think. If I got a high-spam website to remove my backlink, is a disavow through Search Console still necessary? Keep in mind, if it helps even slightly to improve rankings, I'm for it!
Technical SEO | Colemckeon1
-
Best way to deal with 100 product pages
It feels good to be BACK. I miss Moz. I left for a long time but am happy to be back! 🙂 My client is a local HVAC company. They sell Lennox systems. Lennox provides a tool that we hooked up to that allows visitors to their site to 'see' 120+ different kinds of air quality, furnace and AC units. The problem is (I think it's a problem) that Google and other crawl tools are seeing these 100+ pages, which are not unique, helpful or related to my client. There is a little bit of cookie-cutter text, images and specs, and that's it. Are these pages potentially hurting my client? I can't imagine they are helping. What is the best way to deal with them? Thank you! Thank you! Matthew
Technical SEO | Localseo41440
-
Organizing A Backlink Authority Category Page
I work for a company that has many promotions throughout the year, some big, some HUGE. Typically they have created a landing page for this content. The issue is, when a promotion ends, we kill the landing page, thus 404ing the backlinks and putting the page authority in purgatory. (1) What would be the best way to keep these pages organized? I was thinking about creating a main "Promotions" page with the current promotion on it (and the previous ones linked at the bottom of the page). Then, when the promotion ends, I would copy its contents to a new page and link to it from the original "Promotions" page. An issue I see with this is that the promotions page would always have the same title tag and vanity URL. (2) This could provide many links to the "Promotions" page over time to build its authority, but would constantly changing content hurt ranking factors?
Technical SEO | nat88han0
-
Company blog. What are the best solutions?
Hello Moz Community! Our company has its own blog (www.awarablogs.com). The blog was created some time ago with a simple blog engine. Now we see that the structure of the blog is bad for SEO (it has long URLs, many useless folders, subdomains and so on), so we'd like to simplify it. But the engine doesn't allow us to change its structure in the way we'd like. Our webmaster suggested that we use an "Alias". Will this method really help us make our blog SEO-friendly? Or is it better to choose other blog software, like WordPress? Thank you very much!
Technical SEO | Awaraman0
-
Best way to deal with over 1000 pages of duplicate content?
Hi, Using the Moz tools I have over 1,000 pages of duplicate content, which is a bit of an issue! 95% of the issues arise from our news section and news archive, as it's been going for some time now. We upload around 5 full articles a day. Each article has a standalone page but can only be reached via a master archive. The master archive sits in a top-level section of the site and shows snippets of the articles which, if a user clicks on them, take them to the full article page. When a news article is added, the snippets move onto the next page, and move through the pages as new articles are added. The problem is that the standalone articles can only be reached via the snippet on the master page, and Google is stating this is duplicate content, as the snippet is a duplicate of the article. What is the best way to solve this issue? From what I have read, using a 'Meta NoIndex' seems to be the answer (not that I know what that is). From what I have read, you can only use a canonical tag on a page-by-page basis, so that's going to take too long. Thanks, Ben
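For reference, the 'Meta NoIndex' mentioned here is a robots meta tag placed in the `<head>` of a page; `noindex, follow` tells search engines to drop the page from the index while still following its links. A sketch of both approaches discussed above (the URL is a placeholder):

```html
<!-- Option 1: on each archive/snippet page - keep it out of the index
     but still let crawlers follow the links through to the full articles -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: point duplicated pages at the preferred version as canonical
     (placeholder URL) -->
<link rel="canonical" href="https://www.example.com/news/full-article">
```

Note that both tags are normally emitted by a template, so applying them across an archive does not have to be done page by page.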
Technical SEO | benjmoz0
-
What is the best way of inserting keywords into a social networking site
We have a social networking site called Yookos (www.yookos.com). As per the norm, many of the pages of the site carry user-generated content and can only be accessed by the users themselves. I created a keyword list that our technical person then uploaded onto the site, but SEOmoz shows that these keywords cannot be picked up. What is the best way to make sure that our site is also picked up via search engine searches?
Technical SEO | seoworx123
-
What is the best strategy for a company in various countries?
Hello, I have to create the SEO marketing strategy for a company that provides services in Spain, Colombia and Mexico. I'm looking at two options:
1. Buy different country domains (ccTLDs): this option seems feasible but very expensive, and to manage and position each domain you would have to have different content on each one.
2. Place each service in country folders, e.g.:
www.dominio.com/mexico/training-financiero.html
www.dominio.com/espana/training-financiero.html
I have understood that option 1 is no longer necessary, since you can use HTML tags within the code to tell Google that you are targeting content to customers in a different country. In principle we would use the same content, changing only a few words and of course the currency to suit the local currency of each country. However, I believe customers might trust a domain from their own country more. Plus, I'm afraid Google would index it as duplicate content. And which country would the main domain target? That could confuse the visitor.
Technical SEO | interficto
-
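The HTML tags referred to in this question are hreflang annotations: one `<link>` element per language/country variant in each page's `<head>`, which tells Google which version to show in each country and signals that the variants are not duplicate content. A sketch using the folder URLs from the question (and assuming a /colombia/ folder exists as well):

```html
<!-- Placed in the <head> of every variant of the page, listing all variants -->
<link rel="alternate" hreflang="es-es" href="http://www.dominio.com/espana/training-financiero.html" />
<link rel="alternate" hreflang="es-mx" href="http://www.dominio.com/mexico/training-financiero.html" />
<link rel="alternate" hreflang="es-co" href="http://www.dominio.com/colombia/training-financiero.html" />
<!-- Fallback for visitors from any other country -->
<link rel="alternate" hreflang="x-default" href="http://www.dominio.com/" />
```

Each variant must list all the others (including itself) for the annotations to be valid, which is why folders on one domain are usually easier to maintain than separate ccTLDs.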
Best XML Sitemap generator
Do you guys have any suggestions for a good XML sitemap generator? Hopefully free, but if it's good I'd consider paying. I am using a Mac, so would prefer an online or Mac version.
Technical SEO | kevin48030
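Whichever generator you pick, the output format is fixed by the sitemaps.org protocol, so it is easy to sanity-check the result. A minimal valid sitemap looks like this (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>`, `<changefreq>` and `<priority>` are optional hints.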