Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
How do I find and fix 404 errors and broken links?
-
Hi, My campaign is showing me many 404 problems, and other tools are also showing me broken links, but the links they show me do work, and I can't seem to find the broken links or the cause of the 404s.
Can you help?
-
Hi Yoseph,
If you liked the broken-link checker Chrome plugin, you could check out another Chrome plugin that the company I work for created. It's called Redirect Path, and I use it all the time. It's a handy header and redirect checker that flags any 301, 302, 404, and 500 responses on any page you visit.
Hope that helps!
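Under the hood, a checker like this just inspects each response's status code and `Location` header, hop by hop. Here is a minimal sketch in Python (standard library only; this is an illustration of the idea, not how Redirect Path itself is implemented):

```python
import urllib.error
import urllib.parse
import urllib.request

def classify(status):
    """Bucket an HTTP status code the way a redirect checker flags it."""
    if status in (301, 302, 307, 308):
        return "redirect"
    if 200 <= status < 300:
        return "ok"
    if 400 <= status < 500:
        return "client error"   # e.g. 404
    return "server error"       # e.g. 500

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        # Surface each redirect as an HTTPError instead of silently following it,
        # so the caller can see every hop in the chain.
        return None

def redirect_chain(url, max_hops=10):
    """Return [(url, status), ...] for every hop, following redirects manually."""
    opener = urllib.request.build_opener(_NoRedirect())
    chain = []
    for _ in range(max_hops):
        req = urllib.request.Request(url, method="HEAD")
        try:
            resp = opener.open(req)
            chain.append((url, resp.status))
            return chain
        except urllib.error.HTTPError as err:
            chain.append((url, err.code))
            location = err.headers.get("Location")
            if classify(err.code) == "redirect" and location:
                url = urllib.parse.urljoin(url, location)
            else:
                return chain
    return chain
```

Running `redirect_chain` on a URL that 301s to a 404 would show both hops, which is exactly the kind of chain you want to spot.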
-
Just to add to Adam's response: remember to design a custom 404 page. It's a good idea to include links to pages you want to rank, a search box, or a list of featured pages (posts, categories, etc.). At the very least, it may help keep users on your website.
Hope it helps.
Sergio.
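If you serve the site yourself, the idea can be sketched with Python's built-in `http.server`. The page content and link targets below are made-up examples; the key point is that the response keeps its 404 status code while the body becomes useful to the visitor:

```python
from http.server import BaseHTTPRequestHandler

# A friendly 404 body with links back into the site (links are placeholders).
CUSTOM_404 = """\
<html><body>
  <h1>Page not found</h1>
  <p>Try one of these instead:</p>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/categories/">Featured categories</a></li>
    <li><a href="/search/">Search</a></li>
  </ul>
</body></html>
"""

class Custom404Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A real handler would look up self.path first; for this sketch,
        # every request gets the custom not-found page.
        body = CUSTOM_404.encode("utf-8")
        self.send_response(404)  # keep the 404 status; only the body is custom
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
```

Keeping the 404 status matters: returning 200 with a "not found" body creates soft-404s, which cause exactly the kind of confusing crawl reports described in the question.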
-
Adam, any more good links and tools that you can share with me? I see you know "something" about SEO and web-building...
-
No problem. Happy to help.
-
Thanks Adam, Great help!!!!
You made my life much easier!
-
Hi Yoseph,
To find broken links I like to use the Check My Links plugin for Chrome.
As for the 404 errors: I believe there is a way to view which pages are linking to them using the SEOmoz tools, but I prefer to just use the Screaming Frog spider. Once you have crawled the site with this tool, all you need to do is select the Response Codes tab at the top, filter by Client Error (4xx), then click one of the URLs and select the In Links tab at the bottom to see the linking pages.
Hope that helps,
Adam.
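That find-the-referrer bookkeeping can be sketched in a few lines of Python (standard library only). Fetching is stubbed out behind a `status_of` callable, so this illustrates the "in links" lookup rather than a full crawler:

```python
from collections import defaultdict
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect absolute link targets from one page's HTML."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def build_in_links(pages):
    """pages: {url: html}. Returns {linked_url: [pages linking to it]}."""
    in_links = defaultdict(list)
    for page_url, html in pages.items():
        parser = LinkExtractor(page_url)
        parser.feed(html)
        for link in parser.links:
            in_links[link].append(page_url)
    return in_links

def broken_link_report(pages, status_of):
    """status_of: callable url -> HTTP status (e.g. a real HEAD request)."""
    in_links = build_in_links(pages)
    return {url: sources for url, sources in in_links.items()
            if 400 <= status_of(url) < 500}
```

The report maps each 4xx URL to the pages that link to it, which is the same information as Screaming Frog's In Links tab: it tells you where to go to fix the broken reference.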
-
Thanks for your time.
I just can't find any broken links... I don't understand your answer.
-
I think you need to post the URL! Have you tried the links from every page? Personally, I would use an include file for shared links to reduce errors and maintenance time, but I'm not sure if you are...
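The include-file idea can be sketched with Python's `string.Template` standing in for whatever templating the site actually uses (file names and links below are made up): the navigation lives in one shared snippet, so a broken link only ever needs fixing in one place.

```python
from string import Template

# Shared navigation snippet: edit a link here once instead of on every page.
NAV_INCLUDE = '<nav><a href="/">Home</a> <a href="/contact/">Contact</a></nav>'

PAGE = Template("""\
<html><body>
$nav
<main>$content</main>
</body></html>
""")

def render(content):
    """Render one page with the shared navigation included."""
    return PAGE.substitute(nav=NAV_INCLUDE, content=content)
```

Server-side includes, PHP `include`, or any templating engine achieves the same thing; the point is simply that duplicated hand-edited navigation is where stale links tend to creep in.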