Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will still be viewable), we have locked both new posts and new replies.
How can I find all broken links pointing to my site?
-
I help manage a large website with over 20M backlinks, and I want to find all of the broken ones. What would be the most efficient way to go about this besides exporting every backlink and checking each one's response code?
Thank you in advance!
-
To find broken links pointing to your site, you can use tools such as Google Search Console, Ahrefs, or SEMrush. These let you analyze your backlink profile and identify links whose target pages return 404s or other error status codes. You can also catch broken links manually by reviewing your referral traffic for visits landing on error pages, monitoring mentions of your site, and running periodic audits of your content and backlinks.
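If the raw export is too large to check by hand, a short script can do the status checking for you. Below is a minimal Python sketch, assuming a CSV backlink export with a "Target URL" column (the column name and file names are placeholders for whatever your tool actually produces); it de-duplicates the target URLs and writes out anything that errors or returns a 4xx/5xx:

```python
# Minimal sketch: flag broken target URLs from a backlink export.
# Assumes a CSV with a "Target URL" column (adjust to your export's header).
import csv
import requests

def check_backlink_targets(export_path, report_path, timeout=10):
    # De-duplicate first: millions of backlinks usually point at far fewer unique URLs.
    targets = set()
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            targets.add(row["Target URL"].strip())

    with open(report_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(["url", "status"])
        for url in sorted(targets):
            try:
                # HEAD keeps requests cheap; fall back to GET for servers that reject it.
                resp = requests.head(url, allow_redirects=True, timeout=timeout)
                if resp.status_code in (403, 405):
                    resp = requests.get(url, allow_redirects=True, timeout=timeout)
                status = resp.status_code
            except requests.RequestException:
                status = "error"
            if status == "error" or status >= 400:
                writer.writerow([url, status])

if __name__ == "__main__":
    check_backlink_targets("backlink_export.csv", "broken_targets.csv")
```

For a site with 20M backlinks you would want to run the checks concurrently and respect rate limits, but the de-duplication step alone usually shrinks the problem dramatically.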
-
You can also use Google Search Console's links report to see which external pages link to your site, then cross-check the linked URLs against your crawl data or the 404s reported under indexing/coverage. Ahrefs' Site Explorer goes a step further with a dedicated broken backlinks report that lists external links pointing at pages on your site that no longer resolve. Regularly monitoring and fixing broken links helps maintain site health, improves user experience, and supports SEO performance.
-
You can find broken links pointing to your website by crawling it with a tool like Screaming Frog, checking crawl/404 errors in Google Search Console, and monitoring your backlink profile with Ahrefs or SEMrush. Regularly reviewing your referral traffic and running an online broken link checker can also surface them.
-
We often use Moz Pro, which is a fantastic SEO tool, and we also use Screaming Frog to find any broken internal links. This has helped improve the on-page SEO for our garden office company.
-
Ha, I feel silly. I do use Ahrefs, but somehow the broken backlinks tool escaped me. This is perfect, thank you!
-
Hi Steven,
I assume many of these backlinks are broken because pages were removed from your site without being properly redirected. If that is the case, Open Site Explorer's Link Opportunities (Link Reclamation) tool should be a big help. It will show all 404 URLs with inbound links that you can recapture by 301 redirecting them. Additionally, you can look up the backlinks to each of these 404 pages and reach out to each webmaster, asking them to update the URL of their link.
I've also had success exporting Top Pages reports (Moz or Majestic are my preferred tools for this), running any URL with a backlink to it through Screaming Frog and pulling 404 pages/broken links (or even 302 redirects) that way. I usually find additional opportunities that do not show up in the Link Reclamation report.
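If you end up with a long reclamation list, a small script can turn the old-to-new URL mapping into redirect rules. This is only a rough Python sketch, assuming an nginx setup and a hypothetical "old_url,new_url" CSV; adapt the output format to whatever server or CMS actually handles your redirects:

```python
# Minimal sketch: turn a reclamation map (old 404 URL -> replacement URL)
# into nginx 301 redirect rules. The CSV columns "old_url" and "new_url"
# are assumptions; build the mapping however you match dead pages to new ones.
import csv
from urllib.parse import urlparse

def build_redirect_rules(mapping_path, rules_path):
    with open(mapping_path, newline="", encoding="utf-8") as f, \
         open(rules_path, "w", encoding="utf-8") as out:
        for row in csv.DictReader(f):
            old_path = urlparse(row["old_url"]).path
            new_url = row["new_url"].strip()
            # Exact-match location blocks avoid accidental prefix matches.
            out.write(f"location = {old_path} {{ return 301 {new_url}; }}\n")

if __name__ == "__main__":
    build_redirect_rules("reclamation_map.csv", "redirects.conf")
```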
Hope this helps!
-
Use Ahrefs and split the crawls across the site's main folders. Think about priorities, too, so you don't have to work through all 20M at once: start with the most important sections and move through them step by step until you've covered the majority.
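As a rough way to see those priorities, here is a small Python sketch that buckets an exported backlink list by top-level folder (again assuming a CSV with a "Target URL" column, which is just a placeholder for your export's header):

```python
# Minimal sketch: count backlink targets per top-level folder so the
# biggest sections can be checked first.
import csv
from collections import Counter
from urllib.parse import urlparse

def targets_by_folder(export_path):
    counts = Counter()
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            path = urlparse(row["Target URL"]).path
            # "/blog/some-post" -> "/blog/"; bare "/" falls into the root bucket.
            segments = [s for s in path.split("/") if s]
            folder = f"/{segments[0]}/" if segments else "/"
            counts[folder] += 1
    return counts

if __name__ == "__main__":
    for folder, count in targets_by_folder("backlink_export.csv").most_common(20):
        print(f"{count:>10}  {folder}")
```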
-
I agree with Kevin. Ahrefs has that capability, assuming you don't run into size constraints. Here's a quick post that explains where to find it: https://ahrefs.com/blog/turning-broken-links-site-powerful-links-ahrefs-broken-link-checker/
-
Have you looked into Ahrefs? I know there's a ton of horsepower behind it, but I don't know if it can handle checking 20M. Good luck!