For those of you who have used Link Detox:
-
Did you go ahead and remove all the TOXIC and HIGH RISK links? Just the toxic?
Were you successful with the tool?
-
I've used Link Detox for a while. I've found that all Toxic links are classified appropriately, but that's not always the case for suspicious and healthy. Links marked as moderate, low risk, or very low risk can still be coming from spammy directories, link lists, massive resource pages, etc. And on the opposite side, sometimes links from BlogSpot blogs will be marked as suspicious even though it's a legit blog that linked naturally to you. They're just flagged because they have few backlinks or low trust. It's best to do a manual review of each link to see what's really worth keeping.
I've used Link Detox with Rmoov in the past and had good results with it. Rmoov is easy to use and keeps track of everything for you in a Google Doc.
-
I have looked at removeem, but their video says "find all the links with keywords in anchor text and remove them". That sounds like a great way to lose even more rankings. Plus it doesn't really do much beyond that; from what I can see, you're basically paying for a spreadsheet.
I thought about using Link Detox with rmoov.com, or link delete.
-
I don't know if this is quite what you're after, since you did ask specifically about Link Detox. However, if you're not using that tool, another option I have used with a lot of success is remove'em.
remove'em is made by Virante (http://www.virante.org/), a company recommended by Moz.
Hope this was of help,
Thomas
-
All of our Toxic links seemed to have good reason for being labeled such, and I did Disavow those domains. There is LOTS more info in the downloaded CSV files than ever shows on the Link Detox screen.
Some of the High Risk ones were kept. Usually it was the thing about them coming from a de-indexed site or blog. When I looked at the site or blog it appeared to be a nice site that just happened to get hammered by Panda or Penguin. I figured that the owner of the site was working hard on getting the penalty lifted and eventually it would be a good site again.
I also disavowed a few marginal domains to help re-balance my anchor text profile, which was out of balance enough to get Penguined for the over-optimized keywords.
Successful? We will not know until Google runs a new Penguin update. Hopefully they do it real soon...
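For anyone following the same workflow: the disavowed domains end up in a plain-text file uploaded through Google's Disavow Tool. A minimal sketch of the format (the domains below are made up, not from my actual file):

```text
# Comment lines start with "#"
# Disavow entire domains (e.g. spammy directories):
domain:spammy-directory.example
domain:article-farm.example
# Or disavow individual URLs:
http://link-list.example/huge-resource-page.html
```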
Related Questions
-
Sitewide links and owned site
Hi everyone, I need the community's opinion on something. I am a web marketer and SEO for a pure player (online-only company) that runs a couple of e-commerce sites. On one side we have bigsite.com, which makes all our revenue; I have been in charge of it for years and results are good. We also have smallsite.com, which is just starting out and has small revenues for the moment; a new SEO is working on it. My question is: we have always had a banner on bigsite.com's homepage sending valuable traffic to smallsite.com. The new SEO has added sitewide footer links from smallsite.com to bigsite.com's homepage. Considering both sites share the same SSL certificate, server, and company name, I am quite sure this is outside Google's guidelines and could hurt bigsite.com. Do you agree that this is a mistake by the new SEO, and that it could hurt my work and the search results for bigsite.com and smallsite.com, as well as teamwork? Thanks
Intermediate & Advanced SEO | Kepass0
-
Wrong redirect used
Hi Folks,
I have a query and am looking for some opinions. Our site migrated to https://. Somewhere along the line between the developer and the hosting provider, a 302 redirect was implemented instead of the recommended 301 (the 301 rule was not being honoured in the .htaccess file). A week later, I noticed some of our key phrases disappear from the SERPs 😞 When I investigated, I saw that the incorrect redirect was in place. The correct 301 redirect has now been implemented and is functioning correctly. I have created a new https property in Webmaster Tools, submitted the sitemap, provided a link in the robots.txt file to the https sitemap, and set canonical tags to the correct https URLs. My gut feeling is that Google will take some time to recognise the fix and to restore the rankings we lost. Has anyone experienced this before, or any further thoughts on how to rectify it ASAP?
Intermediate & Advanced SEO | Patrick_556
-
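For anyone hitting the same migration issue: a site-wide HTTP-to-HTTPS redirect in .htaccess usually looks something like this (a sketch assuming Apache with mod_rewrite enabled; hosting setups vary, so test before deploying):

```apache
# Force HTTPS with a permanent (301) redirect, not a temporary 302
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

The `R=301` flag is what makes the redirect permanent; omitting it makes Apache default to a 302.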
What Links to Disavow?
I am looking through my website's link profile that I pulled directly from Google Webmaster Tools. What is the best way to determine the links to disavow? Maybe the Webmaster Tools list is not the best list for this process but I really need to clean up the links that are hurting the site's SEO. Does anyone have any insight?
Intermediate & Advanced SEO | PartyStore0
-
Linking to URLs With Hash (#) in Them
How does link juice flow when linking to URLs with a hash in them? If I link to this page, which generates a pop-over on my homepage with info about my special offer, where will the link juice go? homepage.com/#specialoffer Will the link juice go to the homepage? Will it go nowhere? Will it go to the hash URL above? I'd like to publish an annual/evergreen offer that will generate lots of links, and instead of driving those links to homepage.com/offer, I was hoping to get that link juice to flow to the homepage, or maybe even a product page, and just update the pop-over information each year as the offer changes. I've seen competitors do it this way, but wanted to see what the community here thinks about linking to URLs with a hash in them. Could this also be a use case for using hashes in URLs for tracking purposes?
Intermediate & Advanced SEO | MiguelSalcido0
-
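A quick way to see why the fragment generally can't carry its own link equity: browsers never send anything after the `#` to the server, so a crawler fetching homepage.com/#specialoffer requests the bare homepage. A small sketch using the standard URL API:

```javascript
// The fragment ("#specialoffer") is client-side only; the server (and a
// crawler's fetch) sees just the path.
const url = new URL("https://homepage.com/#specialoffer");
console.log(url.pathname); // "/"
console.log(url.hash);     // "#specialoffer"
```

So a link to `homepage.com/#specialoffer` is, from the server's point of view, a link to the homepage itself.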
Alternative Link Detox tools?
My company is conducting a link detox for a client, and it seems like every tool we use gives us a different answer on how many links we actually have. The numbers range anywhere from 4,000 to 200,000. Does anyone have suggestions for tools that will give us an accurate count and will also email the webmasters on our behalf requesting link removal? We are trying to make this process as automated as possible to save time on our end.
Intermediate & Advanced SEO | lightwurx0
-
Has Anyone Used Boostability?
Looking into Boostability as an option for doing SEO for our clients; we will still keep SEOmoz and will still be doing SEO for our own company. Has anyone used it or heard things about it? I am very skeptical when it comes to outsourcing SEO and to any kind of automated SEO, but thought I'd ask if anyone has thoughts on it. Thanks, Holly
Intermediate & Advanced SEO | hwade0
-
Flow of internal link equity
I've recently come across this: a site changes the URL of one internal page to something more search-friendly and 301s the old URL to the new one, as you would expect. But they don't change the link in the homepage navigation. Instead, they keep pointing it at the old URL, so visitors go through the 301 to reach the page even though the link is internal. They say that if they change the URL, it will reset the internal flow of link equity to that page. I've not come across this before, so I'm not sure what to think. I can see what they're saying, but I would have thought that, for an internal link, the equity flow would simply resume as before quite soon after the link was updated. Any views?
Intermediate & Advanced SEO | SteveOllington0
-
How do you implement dynamic SEO-friendly URLs using Ajax without using hashbangs?
We're building a new website platform and are using Ajax to let users select filters. We want to dynamically insert elements into the URL as filters are selected so that search engines will index multiple combinations of filters. We're struggling to see how this is possible using the Symfony framework. We've used www.gizmodo.com as an example of SEO- and user-friendly URLs, but that is only an example of achieving this for static content. We would prefer a route that doesn't involve hashbangs if possible. Does anyone have experience using hashbangs, and how did it affect your site? Any advice on the above would be gratefully received.
Intermediate & Advanced SEO | Sayers1