Can we retrieve all 404 pages of my site?
-
Hi,
Can we retrieve all 404 pages of my site?
Is there any syntax I can use in Google search to list just the pages that return a 404?
Is there a tool or site that can scan all the pages in the Google index and give me this report?
Thanks
-
The 404s in Webmaster Tools relate to crawl errors, so they will only appear if the pages are internally linked. The report is also limited to the top 1,000 error URLs.
-
Set up a Webmaster Tools account for your site. You should be able to see all of the 404 error URLs there.
-
I wouldn't try to manually remove that number of URLs. Mass individual removals can cause their own problems.
If the pages are 404ing correctly, then they will be removed, but it is a slow process. For the number you are looking at it will most likely take months. Google has to recrawl all of the URLs before it even knows that they are returning a 404 status, and it will then likely wait a while and crawl them again before removing them. That's a painful truth, and there really isn't much you can do about it.
It might (and this is very arguable) be worth ensuring that there is a crawl path to the 404 content. For example, a link from a high-authority page to a "recently removed content" list that contains links to a rotating selection of them. This will help that content get recrawled more quickly, but it also means you are deliberately linking to 404 pages, which might raise quality-signal concerns. Something to weigh up.
What would work more quickly is to mass remove particular directories (if you are lucky enough that some of your content fits that pattern). If you have a lot of URLs in mysite.com/olddirectory and there is definitely nothing you want to keep in that directory, then you can lose big swathes of URLs in one hit - see here: https://support.google.com/webmasters/answer/1663427?hl=en
Unfortunately that is only good for directories, not wildcards. However, it's very helpful when it is an option.
So, how to find those URLs? (Your original question!!).
Unfortunately there is no way to get them all back from Google. Even if you did a search for site:www.mysite.com and saved all of the results, Google will not return anywhere near the number of results that you are looking for.
I tend to do this by looking for patterns and removing those to find more patterns. I'll try to explain:
- Search for site:www.yoursite.com
- Scroll down the list until you start seeing a pattern (e.g. mysite.com/olddynamicpage-111.php, mysite.com/olddynamicpage-112.php, mysite.com/olddynamicpage-185.php, etc.)
- Note that pattern (return later to check that they all return a 404)
- Now search again with that pattern removed, site:www.mysite.com -inurl:olddynamicpage
- Return to step 2
Do this (a lot) and you start to understand the patterns that have been picked up. There are usually a few that account for a large number of the incorrectly indexed URLs. In the recent cleanup I did, they were almost all related to "faceted search gone wrong".
Once you know the patterns, you can check that the correct headers are being returned so that the pages start dropping out of the index. If any are directory patterns then you can remove them in big hits through GWMT.
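If it helps with that checking step, here's a minimal sketch in Python (using the requests library; the urls.txt file name is just a placeholder for wherever you save the URLs collected from those pattern searches) that reports the status code each URL returns:

```python
# Minimal sketch: check which collected URLs actually return a 404.
# Assumes a plain-text file with one URL per line ("urls.txt" is a placeholder name).
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # HEAD is usually enough to read the status code without downloading the page
        response = requests.head(url, allow_redirects=False, timeout=10)
        print(f"{response.status_code}\t{url}")
    except requests.RequestException as exc:
        print(f"ERROR\t{url}\t{exc}")
```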
It's painful. It's slow, but it does work.
-
Yes, exactly. I need to know which of the Google-indexed URLs are 404s.
As Google does not remove dead 404 pages for months, I was thinking of manually adding them for removal in Webmaster Tools, but I need to find all of the URLs that are indexed but return a 404.
-
OK - that is a bit of a different problem (and a rather familiar one). So the aim is to figure out what the 330 "phantom" pages are and then how to remove them?
Let me know if I have that right. If I have, I'll give you some tips based on doing the same with a few million URLs recently. I'll check first, though, as it might get long!
-
Thank you.
I will try explaining my query again, and you can correct me if the above is still the solution:
1. My site has 70K pages
2. Google has indexed 500K pages from the site
site:mysitename shows this.
We have noindexed etc. on most of them, which has got the count down to 300K.
Now I want to find which of those 300K pages return a 404.
Webmaster Tools shows a few hundred as 404s, but I am sure there are many more.
Can we scan the index, rather than the site, to find the URLs Google has indexed that are 404?
-
As you say, on-site crawlers such as Xenu and Screaming Frog will only tell you when you are linking to 404 pages, not where other people are linking to your 404 pages.
There are a few ways you can get to this data:
Your server logs: All 404 errors will be recorded on your server. If someone links to a non-existent page and that link is ever followed by a single user or a crawler like Googlebot, the request will be recorded in your server log files. You can access those directly (or pull 404s out of them on a regular, automatic basis). Alternatively, most hosting comes with some form of log analysis built in (AWStats being one of the most common), and that will show you the 404 errors.
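If you want to pull the 404s out of the raw logs yourself, a rough sketch along these lines works (this assumes the standard Apache/Nginx combined log format and a file called access.log; adjust both to match your server):

```python
# Rough sketch: extract 404 hits from a combined-format access log.
# The regex assumes the standard combined log format; adjust for your server's format.
import re
from collections import Counter

# e.g. 1.2.3.4 - - [10/Oct/2014:13:55:36 +0000] "GET /old-page.php HTTP/1.1" 404 209 "-" "Googlebot..."
line_re = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

counts = Counter()
with open("access.log") as log:
    for line in log:
        match = line_re.search(line)
        if match and match.group("status") == "404":
            counts[match.group("path")] += 1

# Most-requested missing URLs first
for path, hits in counts.most_common():
    print(f"{hits}\t{path}")
```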
That isn't quite what you asked, as it doesn't mean that they have all been indexed; however, it will be an exhaustive list that you can then check against.
Check that backlinks resolve: Download all of your backlinks (OSE, Webmaster Tools, Ahrefs, Majestic), look at each target and see what header is returned. We use a custom-built tool called Linkwatchman to do this on an automatic, regular basis. However, as an occasional check you can download them into Excel and use the excellent SEO Tools for Excel to do this for free (http://nielsbosma.se/projects/seotools/ <- best SEO tool around).
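If Excel isn't convenient, the same check is only a few lines of Python (just a sketch; the backlinks.csv file name and the source_url / target_url column names are assumptions, so match them to whatever your backlink export actually contains):

```python
# Sketch: check the status code of every backlink target in an exported CSV.
# Assumes columns named "source_url" and "target_url"; real exports vary by tool.
import csv
import requests

with open("backlinks.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    target = row["target_url"]
    try:
        response = requests.head(target, allow_redirects=False, timeout=10)
        if response.status_code == 404:
            # The source page is where you would fix or reclaim the link
            print(f'404\t{target}\tlinked from {row["source_url"]}')
    except requests.RequestException as exc:
        print(f"ERROR\t{target}\t{exc}")
```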
Analytics: As long as your error pages trigger the Google Analytics tracking code, you can get the data from here as well. It is most helpful when the error page either sets a custom variable or uses a virtual URL (404/requestedurl.html, for instance). Isolate those pages and look at where the traffic came from.
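For example (and these are all assumptions: a CSV export with "Page" and "Full Referrer" columns, and a 404 template that records a virtual URL starting with /404/), you could summarise where the broken-link traffic is coming from like this:

```python
# Sketch: summarise referrers for 404 virtual pageviews exported from analytics.
# Column names ("Page", "Full Referrer") are assumptions; match your actual export.
import csv
from collections import Counter

referrers = Counter()
with open("ga_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Only keep hits recorded under the 404 virtual URL
        if row["Page"].startswith("/404/"):
            referrers[row["Full Referrer"]] += 1

for source, hits in referrers.most_common(20):
    print(f"{hits}\t{source}")
```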
-
It will scan and list all results for you: 301 redirects, 200s, 404 errors, 403 errors. However, Screaming Frog can only spider up to 500 URLs in the free version.
If you have more, I suggest going with Xenu Link Sleuth. Download it, get your site crawled, and get all pages, including those returning a 404, with no URL limit.
-
Thanks, but this would be scanning the pages on my site. How will I find 404 pages that are indexed in Google?
-
Hey there
Screaming Frog is a great (and free!) tool that lets you do this. You can download it here.
Simply insert your URL and it will spider all of the URLs it can find on your site. It will then serve up a ton of information about each page, including whether it returns a 200, 404, 301, and so on. You can even export this information into Excel for easy filtering.
Hope this helps.