Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How can I find all broken links pointing to my site?
-
I help manage a large website with over 20M backlinks and I want to find all of the broken ones. What would be the most efficient way to go about this besides exporting and checking each backlink's response code?
Thank you in advance!
-
To find all broken links pointing to your site, you can use various online tools such as Google Search Console, Ahrefs, or SEMrush. These tools allow you to analyze your website's backlink profile and identify any links that lead to pages returning 404 errors or other status codes indicating broken or inaccessible content. Additionally, you can manually check for broken links by reviewing your website's referral traffic, monitoring social media mentions, and conducting periodic audits of your site's content and backlinks.
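If you do end up working from an export, the key efficiency trick is that millions of backlinks usually point at a much smaller set of unique target pages on your own site, so you can dedupe the targets first and check each one once. A minimal sketch of that idea (the `target_url` column name is an assumption; exports vary by tool, and `fetch_status` stands in for whatever HTTP checker you use):

```python
import csv
from urllib.parse import urldefrag

def unique_targets(csv_file, column="target_url"):
    """Collect the unique target URLs from a backlink export.

    Fragments (#section) are stripped, since they don't change the
    HTTP response. Millions of backlinks usually collapse to a much
    smaller set of pages to actually check.
    """
    seen = set()
    for row in csv.DictReader(csv_file):
        url, _fragment = urldefrag(row[column].strip())
        seen.add(url)
    return sorted(seen)

def find_broken(targets, fetch_status):
    """Return the targets whose status code indicates a broken page.

    `fetch_status` is any callable mapping URL -> int status code,
    e.g. a wrapper around a HEAD request with retries and rate limiting.
    """
    return [url for url in targets if fetch_status(url) >= 400]
```

Feed the deduped list to your checker of choice; at 20M rows the dedupe step alone can cut the work by orders of magnitude.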
-
To find all broken links pointing to your site, you can use online tools like Google Search Console's "Links to Your Site" report, which lists external pages linking to your site. Additionally, you can utilize website crawling tools such as Screaming Frog or Ahrefs' Site Explorer to identify broken links from external sources. Regularly monitoring and fixing broken links helps maintain website health, improves user experience, and enhances SEO performance.
-
You can find broken links pointing to your website by using website crawl tools like Screaming Frog or Ahrefs, checking crawl errors in Google Search Console, and monitoring your backlinks with tools like Ahrefs or SEMrush. Regularly checking your referral traffic and using online broken link checkers can also help you identify broken links.
-
We often use Moz Pro - it's a fantastic SEO tool - and we also use Screaming Frog to find any broken internal links.
This has helped improve our on-page SEO for our garden office company.
-
-
Ha, I feel silly. I do use ahrefs, but somehow the broken backlinks tool escaped me. This is perfect, thank you!
-
Hi Steven,
I assume many of these backlinks will be broken because pages were removed from your site without being properly redirected. If that is the case, Open Site Explorer's Link Opportunities (Link Reclamation) tool should be a big help. This will show all 404 URLs with inbound links that you can recapture by 301 redirecting. Additionally, you can look up the backlinks to each of these 404 pages and reach out to each webmaster requesting they update the URL of their link.
I've also had success exporting Top Pages reports (Moz or Majestic are my preferred tools for this), running any URL with a backlink to it through Screaming Frog and pulling 404 pages/broken links (or even 302 redirects) that way. I usually find additional opportunities that do not show up in the Link Reclamation report.
Hope this helps!
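The reclamation workflow described above - group the referrers pointing at each 404 URL, then 301 the old paths - can be sketched in a few lines. This is an illustrative sketch, not any tool's actual output format: the `(referring_page, broken_target)` pairs stand in for rows from a Link Reclamation or Top Pages export, and the redirect rules use standard nginx `rewrite` syntax:

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_referrers(backlinks):
    """Group referring pages by broken target.

    `backlinks` is an iterable of (referring_page, broken_target)
    pairs. The result shows, per 404 URL, which webmasters to contact
    and how many links are at stake.
    """
    grouped = defaultdict(list)
    for referrer, target in backlinks:
        grouped[target].append(referrer)
    return dict(grouped)

def redirect_rules(redirect_map):
    """Render 301 rules (nginx-style) from broken URL -> new path."""
    lines = []
    for broken, new_path in sorted(redirect_map.items()):
        lines.append(f"rewrite ^{urlparse(broken).path}$ {new_path} permanent;")
    return "\n".join(lines)
```

Grouping first also tells you which broken pages carry the most inbound links, so you can prioritize those redirects and outreach emails.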
-
Use Ahrefs and split the crawls across the main folders of the website. Also, consider your priorities - then you don't have to process all 20M at once. Start with the main folders and work step by step to cover the majority.
-
I agree with Kevin. Ahrefs has that capability, assuming you don't run into size constraints. Here's a quick post that explains where to find it. (See https://ahrefs.com/blog/turning-broken-links-site-powerful-links-ahrefs-broken-link-checker/.)
-
Have you looked into Ahrefs? I know there's a ton of horsepower behind it, but I don't know if it can handle checking 20M. Good luck!