Can we retrieve all 404 pages of my site?
-
Hi,
Can we retrieve all 404 pages of my site?
Is there any syntax I can use in Google search to list just the pages that return a 404?
Or is there a tool/site that can scan all the pages in Google's index and give me this report?
Thanks
-
The 404s in Webmaster Tools relate to crawl errors, so they will only appear if the broken URLs are linked internally. The report is also limited to the top 1,000 pages with errors.
-
Set up a Webmaster Tools account for your site. You should be able to see all the 404 error URLs there.
-
I wouldn't try to manually remove that number of URLs. Mass individual removals can cause their own problems.
If the pages are 404ing correctly, then they will be removed, but it is a slow process. For the number you are looking at it will most likely take months. Google has to recrawl all of the URLs before it even knows they are returning a 404 status, and it will then likely wait a while and crawl them again before removing them. That's a painful truth and there really isn't much you can do about it.
It might (and this is very arguable) be worth ensuring that there is a crawl path to the 404 content. For example, a link from a high-authority page to a "recently removed content" list that contains links to a rotating selection of the dead URLs. This will help that content get recrawled more quickly, but it also means you are linking to 404 pages, which might raise quality-signal issues of its own. Something to weigh up.
What would work more quickly is to mass remove particular directories (if you are lucky enough that some of your content fits that pattern). If you have a lot of URLs in mysite.com/olddirectory and there is definitely nothing you want to keep in that directory, then you can lose big swathes of URLs in one hit - see here: https://support.google.com/webmasters/answer/1663427?hl=en
Unfortunately that is only good for directories, not wildcards. However it's very helpful when it is an option.
So, how do you find those URLs? (Your original question!)
Unfortunately there is no way to get them all back out of Google. Even if you did a search for site:www.mysite.com and saved all of the results, Google will not return anywhere near the number of results you are looking for.
I tend to do this by looking for patterns and removing those to find more patterns. I'll try to explain:
- Search for site:www.yoursite.com
- Scroll down the list until you start seeing a pattern (e.g. mysite.com/olddynamicpage-111.php, mysite.com/olddynamicpage-112.php, mysite.com/olddynamicpage-185.php, etc.).
- Note that pattern (return to it later to check that those URLs all return a 404).
- Now search again with that pattern removed, site:www.mysite.com -inurl:olddynamicpage
- Return to step 2
Do this (a lot) and you start understanding the patterns that have been picked up. There are usually a few that account for a large number of the incorrectly indexed URLs. In the recent clean-up I did, they almost all related to faceted search gone wrong.
Once you know the patterns you can check that the correct headers are being returned, so that the URLs start dropping out of the index. If any are directory patterns, then you can remove them in big hits through GWMT.
It's painful. It's slow, but it does work.
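If you end up with long lists of candidate URLs from those site: searches, the header check at the end can be scripted rather than done by hand. The sketch below is only an illustration and not part of any Google tooling; it assumes the Python requests library is installed and uses placeholder URLs.

```python
# Rough sketch: confirm which candidate URLs actually return a 404
# before bothering with removals. Placeholder URLs below.
import requests

def check_statuses(urls):
    """Return a dict mapping each URL to its HTTP status code (or an error note)."""
    results = {}
    for url in urls:
        try:
            # HEAD keeps the check lightweight; some servers reject HEAD,
            # in which case switch to requests.get().
            resp = requests.head(url, allow_redirects=False, timeout=10)
            results[url] = resp.status_code
        except requests.RequestException as exc:
            results[url] = f"error: {exc}"
    return results

if __name__ == "__main__":
    candidates = [
        "http://www.mysite.com/olddynamicpage-111.php",
        "http://www.mysite.com/olddynamicpage-112.php",
    ]
    for url, status in check_statuses(candidates).items():
        print(status, url)
```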
-
Yes, exactly. I need to know which of the Google-indexed URLs return a 404.
Since Google does not remove dead 404 pages for months, I was thinking of manually submitting them for removal in Webmaster Tools, but first I need to find all of the URLs that are indexed but return a 404.
-
OK - that is a bit of a different problem (and a rather familiar one). So the aim is to figure out what those couple of hundred thousand "phantom" pages are, and then how to remove them?
Let me know if I have that right. If I have, then I'll give you some tips based on doing the same with a few million URLs recently. I'll check first though, as it might get long!
-
Thank you.
I will try explaining my query again, and you can correct me if the above is still the solution:
1. My site has 70K pages.
2. Google has indexed 500K pages from the site (site:mysitename shows this).
3. We have noindexed most of them, which has got the count down to 300K.
Now I want to find which of those 300K indexed pages return a 404. Webmaster Tools shows a few hundred as 404s, but I am sure there are many more.
Can we scan the index, rather than the site, to find the pages Google has indexed that now return a 404?
-
As you say, on-site crawlers such as Xenu and Screaming Frog will only tell you when you are linking to 404 pages, not where other people are linking to your 404 pages.
There are a few ways you can get to this data:
Your server logs: All 404 errors are recorded on your server. If someone links to a non-existent page and that link is ever followed by a single user or a crawler like Googlebot, the request will show up in your server log files. You can access those directly (or pull 404s out of them on a regular, automated basis). Alternatively, most hosting comes with some form of log analysis built in (AWStats being one of the most common), which will show you the 404 errors.
That isn't quite what you asked, as it doesn't mean they have all been indexed; however, it will give you an exhaustive list that you can then check against.
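If you want to pull the 404s out of the raw logs yourself, here is a minimal sketch (not from the original answer): it assumes a combined-format access log at a placeholder path and simply counts how often each missing URL was requested, so you can see which dead URLs are still being hit.

```python
# Rough sketch: extract 404'd paths from a combined-format access log.
import re
from collections import Counter

# Matches e.g.  "GET /some/path HTTP/1.1" 404  -> captures the path and status code
LINE_RE = re.compile(r'"[A-Z]+ (\S+) HTTP/[^"]+" (\d{3}) ')

def not_found_urls(log_path):
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = LINE_RE.search(line)
            if match and match.group(2) == "404":
                hits[match.group(1)] += 1
    return hits

if __name__ == "__main__":
    # "access.log" is a placeholder path; point it at your real log file.
    for path, count in not_found_urls("access.log").most_common(50):
        print(count, path)
```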
Check that backlinks resolve: Download all of your backlinks (OSE, Webmaster Tools, Ahrefs, Majestic), look at each target URL and see what header it returns. We use a custom-built tool called linkwatchman to do this on an automatic, regular basis. However, as an occasional check, you can download the links into Excel and use the excellent SEO Tools for Excel to do this for free (http://nielsbosma.se/projects/seotools/ <- best SEO tool around).
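As a rough, free alternative for this particular check, a short script can read a backlink export and group the target URLs by the status code they return. This is only a sketch; the CSV path and the "Target URL" column name are assumptions about whatever export you are working from, and it needs the Python requests library.

```python
# Rough sketch: group backlink target URLs by the HTTP status they return,
# so linked-to URLs that now 404 stand out.
import csv
from collections import defaultdict

import requests

def targets_by_status(csv_path, column="Target URL"):
    # `column` is an assumption; match it to the header in your own export.
    groups = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            url = row[column]
            try:
                status = requests.head(url, allow_redirects=False, timeout=10).status_code
            except requests.RequestException:
                status = "unreachable"
            groups[status].append(url)
    return groups

if __name__ == "__main__":
    # "backlinks.csv" is a placeholder filename for your exported link list.
    for status, urls in targets_by_status("backlinks.csv").items():
        print(status, len(urls))
```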
Analytics: As long as your error pages trigger the Google Analytics tracking code, you can get the data from there as well. This is most helpful when the page either sets a custom variable or records a virtual URL (404/requestedurl.html, for instance). Isolate those pages and look at where the traffic came from.
-
It will scan your site and list all results for you: 301 redirects, 200s, 404 errors, 403 errors. However, Screaming Frog can only spider up to 500 URLs in its free version.
If you have more, I'd suggest going with Xenu Link Sleuth. Download it, crawl your site, and you will get all pages, including 404 errors, with no page limit.
-
Thanks, but this would be scanning the pages on my site. How will I find 404 pages that are indexed in Google?
-
Hey there
Screaming Frog is a great (and free!) tool that lets you do this. You can download it here
Simply insert your URL and it will spider all of the URLs it can find on your site. It will then serve up a ton of information about each page, including whether it returns a 200, 404, 301 and so on. You can even export this information into Excel for easy filtering.
Hope this helps.