Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
Disavow - Broken links
-
I have a client who worked with an SEO who built poor-quality links to their site.
When I drilled down in Open Site Explorer, I found quite a few links from sites that no longer exist, so I thought I could test the disavow tool on them (maybe just six or so). After that, we'll build good-quality links to try to tackle the problem with a more positive approach.
I just wondered what the consensus is?
-
Thanks everyone.
Well, this is an example: http://www.hotmarketable4you.com/SpecialReports/PRE30141.html
I checked many of these links maybe two weeks ago and they still had (poor) content on them, but now they all seem to be broken, so I suspect it was a link farm.
And Mike: it was more irrelevant than outright "bad" content.
I think I'll build links over the next few weeks and then evaluate where we are; hopefully rankings will start to improve.
-
I think that the better tactic would be to create new content for those broken links. Unless these links are located on a very bad domain (link farm, etc.), I would just create a new page.
Be careful before you start messing with the disavow tool. The only time I would use the disavow tool is if the link is obviously bad. Like obviously obviously bad (if that makes sense). Many people assume that their ranking tanked because of some algo update and start disavowing links without really checking into it. Just be careful before using that tool and research the hell out of the link before you throw it away.
Here is a good article that gives you the dos and don'ts of using the disavow tool:
http://www.portent.com/blog/seo/google-disavow-links-tool-best-practices.htm
Good luck!
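If you do decide to disavow after researching the links, the file Google's tool accepts is plain text: one entry per line, a "domain:" prefix to disavow an entire domain, a bare URL to disavow a single page, and "#" for comment lines. Here is a minimal Python sketch that assembles such a file; the domains and URLs are hypothetical placeholders, not recommendations.

```python
# Sketch: build the text of a disavow file in the format Google's tool expects.
# "domain:example.com" disavows every link from that domain; a bare URL
# disavows only links from that page; "#" lines are comments.

def build_disavow_file(domains, urls):
    """Return disavow-file text for the given domains and page URLs."""
    lines = ["# Disavow file generated after a manual link audit"]
    lines += [f"domain:{d}" for d in sorted(domains)]   # whole-domain entries
    lines += sorted(urls)                               # single-page entries
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    text = build_disavow_file(
        domains={"spammy-link-farm.example"},  # placeholder domain
        urls={"http://www.hotmarketable4you.com/SpecialReports/PRE30141.html"},
    )
    print(text)
```

The file is then uploaded manually through the disavow tool in Search Console; nothing here submits anything automatically.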
-
I think that if the links are broken and Google has been made aware of it, i.e. it has recrawled and cached the page (simply add "cache:" in front of the URL to see the last cached copy; if the URL itself is broken, check whether it is still indexed in Google), then it would know that the link is broken and shouldn't count it.
If that's the case, I don't think the disavow would have any benefit, unless of course if the link were to return, which could be a possibility.
If the page is cached and that cached version has got the broken version = no worries.
If the URL is broken and the page is no longer indexed = no worries.
If the URL is broken and still indexed = check whether any other links point to that URL (including the site's navigation and/or sitemap, if applicable). If not, it should deindex soon. If there are links, I'd disavow.
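The three cases above amount to a small decision tree, which can be sketched as a helper function. This is purely illustrative logic; the inputs correspond to the manual checks described (cache status, index status, other inbound links), not to anything you can query automatically.

```python
# Sketch of the decision tree above for a single broken backlink.
# Inputs are the results of manual checks (cache: operator, site: search).

def disavow_decision(url_broken: bool, still_indexed: bool,
                     other_links_point_to_it: bool = False) -> str:
    """Return 'disavow' or 'ignore' following the three cases above."""
    if not url_broken:
        # Case 1: page is live and its cached copy shows the link gone.
        return "ignore"
    if not still_indexed:
        # Case 2: broken URL that has already dropped out of the index.
        return "ignore"
    # Case 3: broken but still indexed; only worth disavowing if other
    # links keep pointing Google at that URL.
    return "disavow" if other_links_point_to_it else "ignore"
```

For example, a broken URL that is still indexed and still linked from its site's navigation would come back as "disavow"; everything else is left alone.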
Just my two pennies, hope it helps!
-
Links that don't exist, or links to pages that don't exist?
Heck, either way I'd ignore them and focus on phase two of your plan. Disavow seems to be a bit overused, in my opinion; it's more of a last-ditch effort for penalty recovery.
And if it's 404 errors you're trying to fix: Google will eventually stop following those URLs after they 404 for long enough, so don't even worry about it (unless they're links you want, in which case put a relevant redirect in place).
Hope this was helpful.
Related Questions
-
Sitewide nav linking from subdomain to main domain
I'm working on a site that was heavily impacted by the September core update. You can see in the attached image the overall downturn in organic traffic in 2019, with a larger hit in September bringing Google organic traffic down around 50%. There are many concerning incoming links, from 50-100 obviously spammy porn-related websites to just plain old unnatural links. There was no effort to purchase any links, so it's unclear how these were created. There are also thousands of incoming external links (most without nofollow and with similar/same anchor text) from yellowpages.com. I'm trying to get this fixed with them and have added it to the disavow file in the meantime. I'm focusing on internal links as well, with a more specific question: if I have a sitewide header on a blog located at blog.domain.com that has links to various sections on domain.com without nofollow tags, is this a possible source of the traffic drops and algorithm impact? The header with these links is on every page of the blog on the previously mentioned subdomain. **More generally, any advice as to how to turn this around?** The website is in the travel vertical.
White Hat / Black Hat SEO | ShawnW
Too many dofollow links = penalty?
Hi. I currently have 150 backlinks, 90% of them are dofollow, while only 10% are nofollow. I recently hit position #10 for my main keyword, but now it is dropped to #16 and a lot of related keywords are gone. So I have a few questions: 1. Was my website penalized for having an unnatural backlink profile (too many dofollow links), or maybe this drop in positions is just a temporary, natural thing? 2. Isn’t it too late for making the backlink profile look more natural by building more nofollow backlinks and making it 50%/50%? Thank you!
White Hat / Black Hat SEO | NathalieBr
How many links can you have on sitemap.html
We have a lot of pages that we want to create crawlable paths to. How many links can be crawled on one page of sitemap.html?
White Hat / Black Hat SEO | imjonny
Dealing with links to your domain that the previous owner set up
Hey everyone, I rebranded my company at the end of last year from a name that was fairly unique but sounded like I cleaned headstones instead of building websites. I opted for a name that I liked and that reflected my heritage; however, it also seems to be quite common. Anyway, I registered the domain name as it was available, since the previous owner's company had been wound up. It's only been in the last week or two that I've had a website on that domain, and I've been tracking its progress through Moz, Google & Bing Webmaster Tools. Both webmaster tools are reporting that my site triggers 404 errors for some specific links. However, I don't have and have never used those links. I think the previous owner might have created them before he went bust. My question is in two parts: first, how do I find out what websites are linking to me with these broken URLs, and second, will these 404ing links affect my SEO? Thanks!
White Hat / Black Hat SEO | mickburkesnr
Pages with spam links 301 redirected to a 404 page. Is it OK?
Please advise: some pages with spam links pointing to them have been redirected to a 404 error page (through a 301 redirect). Removing them manually was not possible, as they are part of a core CMS component, among other coding issues, so the only option our developer advised was a 301 redirect to a 404 page. Will redirecting these pages to a 404 page via a 301 nullify the negative or spam links pointing to them and eventually remove the resulting spam impact on the site? Many thanks.
White Hat / Black Hat SEO | Modi
Footer Link in International Parent Company Websites Causing Penalty?
Still waiting to look at the analytics for the timeframe, but we do know that the top keyword dropped on or about April 23, 2012 from the #1 ranking in Google, something they had held for years, and traffic dropped over 15% that month, with further slips since. Just looked at Google Webmaster Tools and see over 2.3MM backlinks from "sister" companies from their footers. One has over 700,000, the rest about 50,000 on average, all going to the home page, and all using the same anchor text, which is both a branded keyword and a generic keyword, the same one they ranked #1 for. They are all nofollows, but we are trying to confirm whether the nofollow was added before or after they got hit; regardless, Google has found them. To add, most of the sites are international sites, so .de, .pl, .es, .nl and other European country extensions. Based on this, I would assume the footer links, and the timing, were a result of the Penguin update and spam. The one issue is that the other US "sister" companies listed in the same footer did not see a drop; in fact, some had increased traffic. And one of them has the same issue where the name is both a brand name and a generic keyword. The only note I will make about the other domains is that they do not drive the traffic this one used to; there is at least a 100,000+ visitor difference between the main site and the sister sites also listed in the footer. I think I'm on the right track with the footer links, even though the other sites that have the same footer links do not seem to be suffering as much, but wanted to see if anyone else had a different opinion or theory. Thanks!
White Hat / Black Hat SEO | LeverSEO (Jen Davis)
Link Building using Badges
In light of the Penguin update, is link building using badges (like the "I love SEOmoz" badge) still considered a white-hat tactic? I have read old posts on the SEOmoz blog about this topic and am wondering whether this method is still effective. I look forward to feedback from Mozzers.
White Hat / Black Hat SEO | Amjath
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why. Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links be removed. After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file. My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link. I think there are two parts to the conversation... Would this work? Google would still see the link on the offending domain, but by blocking that domain are you preventing any strength or penalty associated with that domain from impacting your site? If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? This would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time and allow us to focus on what Google wants in creating high-quality sites. Thoughts?
White Hat / Black Hat SEO | highlyrelevant