Best to Leave Toxic Links or Remove/Disavow on a Site with a Low Number of Linking Domains?
-
Our site has only 87 referring domains (with at least 7,100 incoming links). LinkDetox has identified 29% of our backlinks as toxic and 14% as questionable. Virtually all of these links come from spammy sites.
We never received a manual penalty, but ever since the first Penguin update in 2012 our search traffic and rankings have dropped, with some uneven recovery over the last three years.
By removing/disavowing toxic links, are we risking that over-optimized anchor text will be removed and that rankings will suffer as a result? Are we potentially shooting ourselves in the foot? Would we be better off spending a few months building quality links from reputable domains before removing/disavowing the bad links? Or are the toxic links (as defined by LinkDetox) so harmful that removing them should be the first priority, before taking any other step?
Thanks, Alan
-
If your site has in fact been negatively impacted by Penguin, you'll have to wait for the next Penguin refresh for your changes to have an impact. You may see fluctuations due to other algorithm changes though.
-
That is reassuring!!! My concern is that the number of domains that link to our site is so low (less than 100) that even removing low-quality links could be a negative.
In terms of seeing a ranking/traffic improvement, could we expect it once the links are removed/disavowed, or would it occur only after a Penguin update?
Thanks!!! Alan
-
Toxic, spammy links can only hurt you, and they are even more harmful when they make up a large percentage of your total backlinks. Remove/disavow them as soon as possible and work on earning legitimate, high-value backlinks. Over-optimized anchor text can be harmful as well.
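For reference, the disavow file itself is just a plain-text list uploaded through Google's Disavow Links tool: one entry per line, # for comments, and a domain: prefix to disavow an entire referring domain rather than a single URL. A minimal sketch - the domains below are placeholders, not real sites from this thread:

```text
# Toxic referring domains flagged by LinkDetox and verified by hand
domain:spammy-directory.example
domain:paid-link-network.example

# Individual URLs where only one page on the domain is a problem
http://article-farm.example/cheap-widgets-post.html
```

Disavowing only tells Google to ignore those links; it doesn't remove them from link reports, and the effect typically shows up only after Google reprocesses the data (historically, after a Penguin refresh).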
Related Questions
-
Alternative domains redirected with 301 to the main domain
Hi everyone, I've got a website which gained a Panda penalty back in March 2012 because of a range of spammy practices (keyword stuffing in titles, indexed category and tag pages, duplicate domains). I've fixed the titles and deindexed any pages that could be seen as thin or duplicate, so I'm confident that any onsite Panda issues have been fixed. As mentioned above, the client had also created over 40 alternative domains and pointed them at their main website folders (hence duplicating the website and content 40 times over). These domains have now been redirected via 301 redirects to the main website to ensure that any links they have gained are captured. The reason for the redirection is that we initially took the domains down and saw a drop in traffic, and this seemed to be the most likely cause. While Moz and Majestic are not showing any significant links to these domains (which is why they were originally taken down), past experience has told me that these tools don't always pick up all referring domains.

Primary domain: workingvoices.com

Five example alternative domains:
- presentationskillslondon.com
- workingvoiceslive.biz
- workingvoices.co.uk
- livingvoices.co.uk
- working-voices.net

Question 1: At the same time we took down the alternative domains (and experienced the drop in traffic), we removed duplicate instances of Google Analytics code from the webpages. All the guidance we could find stated that duplicate instances of the code shouldn't affect your Analytics numbers, hence we assumed it was the taking down of the alternative domains, but maybe the guidance we found was wrong?

Question 2: It is 3 months later and these alternative domains are still indexed by Google, and Panda hasn't run since October 2014, so we haven't experienced a recovery yet. Redirecting the domains will remove any issue of a Panda penalty, but now of course I am worried about Penguin - the last thing I want to do is open that can of worms!

This whole saga has been pretty complicated and I think I need some fresh sets of eyes. What does everyone think? Could the initial drop have been due to the duplicate Analytics code being removed? Could the redirecting domains trigger Penguin? Should we take the alternative domains down and be done with them? Any other thoughts? Looking forward to hearing your opinions! Damon.
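For illustration only: when alias domains all resolve to the same document root, a host-based rule is a common way to 301 everything that isn't the primary hostname. This is a sketch that assumes Apache with mod_rewrite; the actual setup depends on how the 40 domains are served.

```apache
# .htaccess in the shared document root (assumes Apache + mod_rewrite)
RewriteEngine On
# Any hostname other than the primary domain gets a permanent redirect
RewriteCond %{HTTP_HOST} !^(www\.)?workingvoices\.com$ [NC]
RewriteRule ^(.*)$ http://www.workingvoices.com/$1 [R=301,L]
```

A domain-level 301 like this passes whatever link equity the alias domains have, which is also why it could pass along any spammy-link risk attached to their backlink profiles.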
Reporting & Analytics | Digitator -
Analytics code removed and still collecting data
Google Analytics code was removed from a website, and then it started tracking again a couple of days later, only to stop again. How can that happen? Has the developer not removed the old code properly? Can the code be injected remotely?
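One quick way to see where hits are still coming from is to check in the browser console whether a GTM container or an analytics.js tracker is still loading on the page - GTM in particular can inject GA remotely even after the hard-coded snippet is deleted. A rough diagnostic sketch, assuming the site used analytics.js and/or GTM:

```javascript
// Run in the browser console on the affected page
console.log('dataLayer present (GTM/gtag):', Array.isArray(window.dataLayer));
console.log('analytics.js loaded:', typeof window.ga === 'function');

if (typeof window.ga === 'function' && typeof window.ga.getAll === 'function') {
  // List every active tracker and the property ID it reports to
  window.ga.getAll().forEach(function (tracker) {
    console.log('Tracker sends to:', tracker.get('trackingId'));
  });
}
```

If a tracker shows up here but the snippet is gone from the templates, the likely culprits are a tag manager container, a plugin, or cached pages still serving the old code.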
Reporting & Analytics | GardenBeet -
Can you track two Google Analytics Accounts on one site?
If you have a site that had an old analytics account and then implemented a new one, is it possible to run tracking code that records to both accounts without causing your site or data issues? We are doing this so we don't lose data at any point - ideally it wouldn't have been split between the two, but making one redundant isn't an option. Ideally we would have merged the data from both accounts and had one; however, the research we have done points to this not being a possibility - unless one of you guys knows different? It would be great if anyone has experience with any of this. Thanks
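Yes - analytics.js supports multiple named trackers on the same page, so every hit can be sent to both the old and the new property without interfering with the site. A minimal sketch; the UA IDs are placeholders:

```javascript
// Assumes the standard analytics.js loader snippet is already on the page
ga('create', 'UA-XXXXXX-1', 'auto');                // old property (default tracker)
ga('create', 'UA-YYYYYY-1', 'auto', 'newAccount');  // new property (named tracker)

// Send the same pageview to both properties
ga('send', 'pageview');
ga('newAccount.send', 'pageview');
```

The two properties will rarely report identical numbers (processing, filters, and view settings differ per property), but running them in parallel avoids any data gap during the transition.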
Reporting & Analytics | ChrisAllbones -
URL Formatting for Internal Link Tagging
After doing some research on internal campaign link tagging, I have seen conflicting viewpoints from analytics and SEO professionals regarding the most effective and SEO-friendly way to tag internal links for a large ecommerce site. It seems there are several common methods of tagging internal links, which can alter how Google interprets these links and indexes the URLs these links point to:

- Query parameter - using ? or & to separate a parameter like cid that will be appended to all internal-pointing links. Since Google will crawl and index these, I believe this method has the potential of causing duplicate content.
- Hash - using # to separate a parameter like cid that will be appended to all internal-pointing links.
- JavaScript - using an onclick event to pass tracking data to your analytics platform.
- Not tagging internal links - while this method will provide the cleanest possible internal link paths for Google and users navigating the site and prevent duplicate content issues, analytics will be less effective.

For those of you that manage SEO or analytics for large (1 million+ visits per month) ecommerce sites, what method do you employ and why?

Edit* - For this discussion, I am only concerned with tagging links within the site that point to other pages within the same site - not links that come from outside the site or lead offsite. Thank you
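For the JavaScript option above, the usual pattern is to keep the href completely clean and fire an event (or a dataLayer push) on click, so no tracking parameter ever reaches a crawlable URL. A rough sketch with analytics.js; the data-campaign attribute is just an illustrative convention, not a standard:

```javascript
// Clean hrefs for crawlers and users; tracking data is sent only on click
document.querySelectorAll('a[data-campaign]').forEach(function (link) {
  link.addEventListener('click', function () {
    ga('send', 'event', 'internal-promo', 'click', link.getAttribute('data-campaign'));
  });
});
```

The query-parameter approach can usually be defanged with canonical tags on the target pages (or by excluding the parameter in Search Console), but the JavaScript and hash approaches avoid creating the duplicate URLs in the first place.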
Reporting & Analytics | RobbieFoglia -
Google Analytics to sub domains
Hi, I have a site xyz.com and two separate sites on subdomains xyz.com/abc and xyz.com/def. What's the best way to set this up in GA so that I can get all the data in the same place? Should I use the GA code for xyz.com on the sub-sites as well, or should I create separate profiles?
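Since the example URLs are actually paths under the same domain (xyz.com/abc, xyz.com/def) rather than true subdomains, one common setup - sketched here with a placeholder property ID - is a single property with the same snippet on every page, plus separate views in GA filtered on the /abc and /def paths so each section can also be reported on its own:

```javascript
// Same analytics.js snippet on xyz.com, xyz.com/abc and xyz.com/def
ga('create', 'UA-XXXXXX-1', 'auto');
ga('send', 'pageview');  // page paths (/abc/..., /def/...) keep the sections separable in reports
```

If they ever move to real subdomains (abc.xyz.com), the 'auto' cookie-domain setting already handles cross-subdomain tracking under one property.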
Reporting & Analytics | mayanksaxena -
Sudden Increase In Number of Pages Indexed By Google Webmaster When No New Pages Added
Greetings MOZ Community: On June 14th Google Webmaster Tools indicated an increase in the number of indexed pages, going from 676 to 851, yet no new pages had been added to the domain in the previous month. The number of pages blocked by robots.txt also increased in that period, from 332 (June 1st) to 551 (June 22nd), and the number of indexed pages still rose to 851.

The following changes occurred between June 5th and June 15th:
- A new redesigned version of the site was launched on June 4th, with some links to social media and the blog removed on some pages, but with no new URLs added. The design platform was and is Wordpress.
- Google GTM code was added to the site.
- An exception was made by our hosting company to ModSecurity on our server (for iframes) to allow GTM to function.

In the last ten days my web traffic has declined about 15%, but the quality of traffic has declined enormously and the number of new inquiries we get is off by around 65%. Pages per visit have declined from about 2.55 to about 2. Obviously this is not a good situation.

My SEO provider, a reputable firm endorsed by MOZ, believes the extra 175 pages indexed by Google, pages that do not offer much content, may be causing the ranking decline. My developer is examining the issue. They think there may be some tie-in with the installation of GTM. They are also noticing an additional issue: the site's Contact Us form will not work if the GTM script is enabled. They find it curious that both issues occurred around the same time.

Our domain is www.nyc-officespace-leader. Does anyone have any idea why these extra pages are appearing and how they can be removed? Anyone have experience with GTM causing issues like this? Thanks everyone!!! Alan
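If the extra 175 indexed URLs turn out to be thin, auto-generated pages (for example Wordpress attachment, tag, or archive pages), the usual fix is a noindex directive on those templates rather than a robots.txt block - a URL that is blocked in robots.txt can stay in the index because Google can no longer crawl it to see the noindex. A sketch of the on-page form; which templates need it depends on what those 175 URLs actually are:

```html
<!-- In the <head> of the thin template pages -->
<meta name="robots" content="noindex, follow">
```

For non-HTML resources the equivalent is an X-Robots-Tag: noindex response header.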
Reporting & Analytics | Kingalan1 -
Question about an Old Domain
My question is about a previously used domain. Recently one of my clients bought a new domain and has started working with it, and he asked me to manage Google Webmaster Tools and analytics. Yesterday I found it is showing 360 "Not found" errors, including links that were not generated by my client - presumably the domain was used before and those links were built at that time. In Webmaster Tools I used the Remove URLs tool to remove these links from Google. My questions to the experts: will this affect my client's site? Can I recover from this situation, or would it be better to use a new domain? Open Site Explorer shows a Domain Authority of 6/100 and a Page Authority of 16/100.
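Crawl errors inherited from a domain's previous owner generally don't hurt the new site by themselves - URLs that keep returning 404 eventually drop out of Google's reports, and the Remove URLs tool is only a temporary measure. If you want to be explicit that a batch of inherited URLs is gone for good, the server can return a 410 instead. A sketch only; the path is a made-up example and this assumes an Apache server:

```apache
# .htaccess - tell crawlers the previous owner's section is permanently gone
Redirect gone /old-owner-section/
```

The bigger risk with a previously used domain is its old backlink profile, which is worth reviewing (and disavowing if spammy), much like in the main question above.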
Reporting & Analytics | Tahrim -
We have detected that the root domain is not a live URL.
I'm trying to add a URL that is having some obvious issues so I can investigate further. When trying to add this site to a campaign in SEOmoz I get the following: "Roger has detected a problem: We have detected that the root domain theurbandater.com is not a live URL. Using this domain, we will be unable to crawl your site or present accurate SERP information." What does that error mean? Where should I be looking to begin troubleshooting? The initial issue was that, back on 9/1, according to Google Webmaster Tools this site began getting a high number of 500 errors, and that number continued to rise to 3,200 errors of the same type. So something screwy is going on and I'm not sure where to start looking.
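That message usually means the crawler is not getting a normal 200 response (or a clean redirect) back from the root domain. A quick way to see what the server actually returns - just a diagnostic step, nothing more - is to request the headers for both the bare and the www hostname:

```bash
curl -I http://theurbandater.com/
curl -I http://www.theurbandater.com/
```

If either returns a 5xx, times out, or loops through redirects, that would explain both the Roger error and the growing 500 count in Webmaster Tools; the server error logs from 9/1 onward are the place to start.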
Reporting & Analytics | digisavvy