Broken Inner Links - Tool Recommendations?
-
Do you have any recommendations for tools that scan an entire website and report broken inner links?
I run several UGC-centered websites, and broken links, both internal and external, are an issue.
Since these sites are several hundred thousand pages each, I'm not too keen on running desktop software (Xenu's Link Sleuth, for example). Any online solutions you could recommend would be great!
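For context, the core job of any such tool is crawling pages, pulling out the links, and re-checking them; the extraction step itself is simple, as this standard-library Python sketch shows. The expensive part is doing it across hundreds of thousands of pages.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

# Pull href targets out of a page and sort them into internal vs.
# external links - the first step any link checker performs.

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def classify_links(page_url, html):
    """Return (internal, external) lists of absolute URLs found in html."""
    parser = LinkExtractor()
    parser.feed(html)
    site = urlparse(page_url).netloc
    internal, external = [], []
    for href in parser.links:
        absolute = urljoin(page_url, href)  # resolve relative hrefs
        (internal if urlparse(absolute).netloc == site else external).append(absolute)
    return internal, external

html = '<a href="/about">About</a> <a href="http://other.com/">Other</a>'
print(classify_links("http://example.com/index.html", html))
# -> (['http://example.com/about'], ['http://other.com/'])
```

A real checker would then fetch each internal URL and repeat, which is exactly the work the desktop tools are doing for hours on a large site.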
-
If it happens to be a WordPress site, there is a plugin called something like "Broken Link Checker." If I recall correctly, it checks internal and outbound links. Otherwise, I'm not too sure.
-
Ideric, did any of these suggestions answer your question, or have you otherwise found a tool for this? I know others would find the information useful.
At a previous company, we had a custom-written solution to check external links: it followed response headers until it got a 200 OK or reached five redirects deep. What we'd often find was a 301 on an external link going from non-www to www. We wouldn't necessarily worry about fixing that, but then we'd later realize that the www URL was a 404, or that it went to a 200 OK category landing page that said "we've reorganized our site, search here for that individual resource."
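For anyone curious, the logic was roughly this. A minimal Python sketch, not our actual code; the `responses` dict here is a stand-in for live HTTP HEAD requests so the redirect-following is easy to see:

```python
# Follow a chain of redirects up to a maximum depth, reporting the
# final URL and status. In the real checker each lookup was an HTTP
# HEAD request; a dict simulates the responses here.

MAX_DEPTH = 5

def resolve(url, responses, max_depth=MAX_DEPTH):
    """Return (final_url, status, hops) after following redirects."""
    hops = 0
    while hops < max_depth:
        status, location = responses.get(url, (None, None))
        if status is None:
            return url, "unreachable", hops
        if status in (301, 302) and location:
            url = location  # follow the redirect and keep checking
            hops += 1
            continue
        return url, status, hops
    return url, "too_many_redirects", hops

# The pattern described above: a 301 from non-www to www that
# turns out to be a 404 on the other side.
responses = {
    "http://example.com/page": (301, "http://www.example.com/page"),
    "http://www.example.com/page": (404, None),
}
print(resolve("http://example.com/page", responses))
# -> ('http://www.example.com/page', 404, 1)
```

The depth cap matters: without it, a redirect loop between two URLs would hang the checker.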
-
Well, you've found the best solution right here at SEOmoz! Instead of wasting time learning new systems to find out whether they'll work, just solve your problem: sign up for PRO Elite and you can crawl 100,000 pages.
-
I have used this in the past: http://www.auditmypc.com/free-sitemap-generator.asp (click the image at the top right of the instructions). It's a free sitemap-generation tool that will show broken internal links in the process. I don't think it has any limits, although I have not tried it on a site as large as yours. Just make sure you are not logged into your site when you run it. Google Webmaster Tools is OK, but you can't verify changes you've made very quickly.
-
I think Xenu is your best option here. The size of the site all but rules out a web-based tool.
Just recently, for a site review, I had to run Xenu on a site with 160,000 pages. Running at 30 threads, it took only 4 hours to complete. Any modern PC should handle it fine.
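The multi-threading is what makes that feasible. The same fan-out idea in a few lines of Python; `check_status` here is a stub standing in for a real HTTP HEAD request:

```python
from concurrent.futures import ThreadPoolExecutor

# Check many URLs concurrently, the way Xenu fans out link checks
# across worker threads. Real checks are I/O-bound (waiting on the
# network), so threads give a near-linear speedup.

def check_status(url):
    # Stub: pretend anything ending in "/missing" is broken.
    # In practice this would issue an HTTP HEAD and return the code.
    return url, (404 if url.endswith("/missing") else 200)

def check_all(urls, threads=30):
    """Return {url: status} using a pool of worker threads."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        return dict(pool.map(check_status, urls))

urls = ["http://example.com/a", "http://example.com/missing"]
print(check_all(urls))
# -> {'http://example.com/a': 200, 'http://example.com/missing': 404}
```

At 30 threads, 160,000 pages in 4 hours works out to roughly 11 checks per second overall.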
-
WMT is alright, apart from the fact that you can't force Google to crawl all your pages. I doubt that even a majority of the pages have been crawled and indexed by Google (though I don't know what the site is).
Plus, as you say, it only covers internal links and inbound 404s.
Do you know what the upper limit is on how many crawl errors WMT will display?
-
I might be wrong, but I think Google WMT can accomplish this with ease; I'm looking at 1,000 crawl errors right now. For external links you'll probably have to use Xenu. =/
-
You might be out of luck on a site that size.
I think WebCEO can do this with its online version, but to get 100,000 URLs crawled it'll probably cost you a bomb (the sort of money where it'd be cheaper to buy a second PC just to run Xenu, lol).
Anyway: http://www.webceo.com/ - I think it may also be possible to install the download version on a server and run it that way.
-
I use Google Webmaster Tools. Go to Diagnostics, then Crawl Errors.