Disavow - Broken links
-
I have a client who worked with an SEO who built some not-so-great links to their site.
When I drilled down in Open Site Explorer, there are quite a few links from sites that no longer exist, so I thought I could test out the disavow tool on them, maybe just about six of them. We are also building good-quality links to try and tackle this problem with a more positive approach.
I just wondered what the consensus was?
-
Thanks everyone.
Well, this is an example: http://www.hotmarketable4you.com/SpecialReports/PRE30141.html
I had checked lots of these links maybe two weeks ago and they had (poor) content on them, but now they all seem to be broken, so I suspect it was a link farm.
And Mike, it was more irrelevant than "bad" content.
I think I'll build links over the next few weeks and then evaluate where we are. Hopefully rankings will start to improve.
-
I think that the better tactic would be to create new content for those broken links. Unless these links are located on a very bad domain (link farm, etc.), I would just create a new page.
Be careful before you start messing with the disavow tool. The only time I would use the disavow tool is if the link is obviously bad. Like obviously obviously bad (if that makes sense). Many people assume that their ranking tanked because of some algo update and start disavowing links without really checking into it. Just be careful before using that tool and research the hell out of the link before you throw it away.
Here is a good article that gives you the dos and don'ts of using the disavow tool:
http://www.portent.com/blog/seo/google-disavow-links-tool-best-practices.htm
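If you do end up disavowing, the file you upload to Google's disavow tool is just a plain text list: one URL or one domain: entry per line, with lines starting with # treated as comments. A minimal sketch with placeholder domains (not the ones from this thread):

```text
# Disavow a single spammy page
http://www.example-linkfarm.com/SpecialReports/page1.html
# Disavow every link from an entire domain
domain:example-linkfarm.com
```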
Good luck!
-
I think that if the links are broken and Google has been made aware of it, i.e. it has recrawled and cached the page (simply add "cache:" in front of the URL to see the last cached copy; if the URL itself is broken, check whether it is still indexed in Google), then it would know the link is broken and shouldn't count it.
If that's the case, I don't think the disavow would have any benefit, unless of course the link were to return, which could be a possibility.
If the page is cached and the cached version shows the broken link = no worries.
If the URL is broken and the page is no longer indexed = no worries.
If the URL is broken and still indexed = check to see whether any other links point to that URL (including the site's own navigation and/or sitemap, if applicable). If not, it should deindex soon. If there are links, I'd disavow.
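If you have more than a handful of these to check, a quick script can tell you which linking URLs are actually dead before you decide anything. A minimal sketch, assuming the Python requests library is installed and with placeholder URLs standing in for the list you'd export from Open Site Explorer:

```python
import requests

# Placeholder list: swap in the linking URLs exported from Open Site Explorer
backlink_urls = [
    "http://www.example-linkfarm.com/SpecialReports/page1.html",
    "http://www.example-directory.com/listing/123",
]

for url in backlink_urls:
    try:
        # HEAD keeps the check lightweight; follow redirects so moved pages
        # don't get flagged as dead
        response = requests.head(url, allow_redirects=True, timeout=10)
        print(response.status_code, url)
    except requests.RequestException as exc:
        # DNS failures and timeouts usually mean the whole site is gone
        print("DEAD", url, exc)
```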
Just my two pennies, hope it helps!
-
links that don't exist or links to pages that don't exist?
Heck, either way I'd ignore them and focus on phase 2 of your plan. Disavow seems to be a bit overused, in my opinion. It's more of a last-ditch effort for penalty recovery, IMHO.
And if it's 404 errors you're trying to fix: Google will eventually stop following those links after they 404 for long enough. Don't even worry about it (unless they're links you want, in which case put a relevant redirect in place).
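If any of those 404ing URLs do have links you want to keep, a one-line 301 usually does the trick. A minimal sketch, assuming Apache with mod_alias enabled (both paths are placeholders):

```apache
# Send the dead URL to the closest relevant live page
Redirect 301 /old-landing-page.html /new-relevant-page/
```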
Hope this was helpful.
Related Questions
-
New Flurry of thousands of bad links from 3 Spammy websites. Disavow?
I also discovered that the website www.prlog.ru put 32 links to my website. It is a Russian site with a 32% spam score. Is that high? I think I need to disavow. Another spammy website linking to me has a spam score of 16%, with several thousand links. I added one link to the site medexplorer.com six years ago and it was fine. Now it has thousands of links. Should I disavow all three?
White Hat / Black Hat SEO | Boodreaux
-
Disavow wn.com?
I am cleaning up some spammy backlinks for a client and will be submitting a disavow at Google. This particular company website has 2,000+ backlinks from the domain wn.com, which appears to be "World News". If you go to it, it appears to be nothing more than scraped content from other sites. Here is a recent example, where my client is linked to (I don't even see the backlink on the page, but it is in the source code!): http://article.wn.com/view/2013/11/22/Hungarian_Woman_Sentenced_to_One_Year_in_Prison_for_Her_Role/#/related_news
But when I look at Moz metrics, wn.com has a domain authority of 90! So I don't want to disavow something that could POTENTIALLY be helping us. The client's website gets zero traffic from wn.com and I've never seen my client linked to in anything worthwhile... it kinda looks spammy to me. If you were me, after looking at wn.com and taking everything into account, would you disavow it? This client really needs to create a healthier backlink profile. Thanks!
White Hat / Black Hat SEO | gbkevin
-
Hiding content or links in responsive design
Hi, I found a lot of information about responsive design and SEO, mostly theories and no real experiments, and I'd like to find a clear answer if someone has tested this. Google says:
"Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device"
https://developers.google.com/webmasters/smartphone-sites/details
For usability reasons you sometimes need to hide content or links completely (not accessible at all by the visitor) on your page for small resolutions (mobile) using CSS ("visibility:hidden" or "display:none"). Is this counted as hidden content, and could it penalize your site or not? What do you guys do when you create responsive design websites? Thanks! GaB
White Hat / Black Hat SEO | NurunMTL
-
Should I Do a Social Bookmarking Campaign and a Tier 2 Linking?
I don't see anything bad in manually creating links on different (about 50) social bookmarking services. Is this method labeled as White Hat? I was wondering if it would be fine to create Tier 2 linking (probably blog comments) for indexing of the social bookmarking links? Please share your thoughts on the topic.
White Hat / Black Hat SEO | zorsto
-
Cross linking websites of the same company, is it a good idea
As a user I think it is beneficial, because those websites are segmented to answer each customer's needs, so I wonder if I should continue to do it or avoid it as much as possible if it damages rankings...
White Hat / Black Hat SEO | mcany
-
Site being targeted by hardcore porn links
We noticed recently a huge amount of referral traffic coming to a client's site from various hardcore porn sites. One of the sites has become the 4th largest referrer, and there are maybe 20 other sites sending traffic. I did a Whois lookup on some of the sites and they're all registered to various people and companies; most of them look pretty shady. I don't know if the sites have been hacked or are deliberately sending traffic to my client's site, but it's obviously a concern. The client's site was compromised a few months ago and had a bunch of spam links inserted into the homepage code. Has anyone else seen this before? Any ideas why someone would do this, what the risks are and how we fix it? All help & suggestions greatly appreciated, many thanks in advance. MB.
White Hat / Black Hat SEO | MattBarker
-
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why. Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links are removed. After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link. I think there are two parts to the conversation: 1) Would this work? Google would still see the link on the offending domain, but by blocking that domain are you preventing any strength or penalty associated with that domain from impacting your site? 2) If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? This would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time and allow us to focus on what Google wants in creating high quality sites. Thoughts?
White Hat / Black Hat SEO | highlyrelevant
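For what it's worth, the referrer-blocking idea described in that last question looks roughly like the sketch below in Apache, assuming mod_rewrite is enabled (the domain is a placeholder, and blocking visitors by referrer does not remove the link from Google's link graph):

```apache
RewriteEngine On
# Return 403 Forbidden to visitors arriving from the spammy referring domain
RewriteCond %{HTTP_REFERER} spammy-example\.com [NC]
RewriteRule .* - [F,L]
```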