What tool do you use to check for URLs not indexed?
-
What is your favorite tool for getting a report of URLs that are not cached/indexed in Google & Bing for an entire site? Basically I want a list of URLs not cached in Google and a separate list for Bing.
Thanks,
Mark
-
-
I can work on building this tool if there's enough interest.
-
I generally just use Xenu's Link Sleuth (if you have thousands of pages) to list out all the URLs you have, and I might then manually check them, but I haven't come across an automated tool yet. If anyone knows of any, I'd love to know as well.
-
This post from Distilled mentions that the SEOTools for Excel plugin has an "Indexation Checker":
https://www.distilled.net/blog/seo/awesome-examples-of-how-to-use-seotools-for-excel/
Alas, after downloading and installing it, it appears this feature was removed...
-
Unless I'm missing something, there doesn't seem to be a way to get Google to show more than 100 results on a page. Our site has about 8,000 pages, and I don't relish the idea of manually exporting 80 SERPs.
-
Annie Cushing from Seer Interactive made an awesome list of all the must-have tools for SEO.
You can get it from her link: http://bit.ly/tools-galore
In the list there is a tool called ScrapeBox which is great for this. In fact there are many uses for the software; it is also useful for sourcing potential link partners.
-
I would suggest using the Website Auditor from Advanced Web Ranking. It can parse 10,000 pages and will tell you a lot more than just whether a page is indexed by Google or not.
-
Hmm... I thought there was a way to pull those SERP URLs into Google Docs using a function of some sort?
-
I don't think you need any tool for this; you can go directly to google.com and search: site:www.YourWebsiteName.com or site:www.YourWebsiteName.com/directory. I think this is the best option to check whether your website is crawled by Google or not.
-
I do something similar but use Advanced Web Ranking: use site:www.domain.com as your phrase, run it to retrieve 1,000 results, and generate a Top Site Report in Excel to get the indexed list.
Also remember that you can do it on sub-directories (or partial URL paths) as a way to get more than 1000 pages from the site. In general I run it once with site:www.domain.com, then identify the most frequent sub-directories, and add those as additional phrases to the project and run a second time, i.e.: site:www.domain.com site:www.domain.com/dir1 site:www.domain.com/dir2 etc.
Still not definitive, but I think it gives an indication of where the value is.
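If the crawl produces a long URL list, the sub-directory step above can be scripted. A minimal sketch in Python, assuming you already have the URL list from a crawler export (the helper name and sample URLs are my own inventions):

```python
# Sketch: find the busiest first-level directories in a URL list and
# build one site: query per directory, so the per-query result cap
# covers more of a large site.
from collections import Counter
from urllib.parse import urlparse

def subdirectory_queries(urls, top_n=5):
    """Return site: queries for the domain plus its busiest sub-directories."""
    counts = Counter()
    domain = None
    for url in urls:
        parts = urlparse(url)
        domain = domain or parts.netloc
        segments = [s for s in parts.path.split("/") if s]
        if segments:
            counts[segments[0]] += 1
    queries = ["site:%s" % domain]
    for directory, _ in counts.most_common(top_n):
        queries.append("site:%s/%s" % (domain, directory))
    return queries

print(subdirectory_queries([
    "http://www.domain.com/dir1/a.html",
    "http://www.domain.com/dir1/b.html",
    "http://www.domain.com/dir2/c.html",
]))
```

You'd then paste each generated query into the ranking tool as a separate phrase, as described above.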
-
David Kauzlaric has, in my opinion, the best answer. If Google hasn't indexed it and you've investigated your Google Webmaster Tools account, then there isn't anything better out there as far as I'm concerned. It's by far the simplest, quickest, and easiest way to check a SERP result.
re: David Kauzlaric
We built an internal tool to do it for us, but basically you can do this manually.
Go to Google, type in "site:YOURURLHERE" without the quotes. You can check a certain page, a site, a subdomain, etc. Of course, if you have thousands of URLs this method is not ideal, but it can be done.
Cheers!
-
I concur, Xenu is an extremely valuable tool that I use daily. Also, once you get a list of all the URLs on your site, you can compare the two lists in Excel (the two lists being the Xenu page list for your site and the list of pages that have been indexed by Google).
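For anyone who'd rather script that comparison than do it in Excel, here's a rough sketch of the same idea in Python. The sample URLs are placeholders, and the trailing-slash/case normalization is a simplifying assumption:

```python
# Sketch of the Excel comparison: take the full URL list from a crawler
# export (Xenu, Screaming Frog) and the list of indexed URLs, and report
# the pages that appear in the crawl but not in the index.
def not_indexed(crawled_urls, indexed_urls):
    """Return crawled URLs missing from the indexed list, in crawl order."""
    indexed = {u.rstrip("/").lower() for u in indexed_urls}
    return [u for u in crawled_urls if u.rstrip("/").lower() not in indexed]

crawl = ["http://example.com/", "http://example.com/about", "http://example.com/old-page"]
index = ["http://example.com/", "http://example.com/about"]
print(not_indexed(crawl, index))  # → ['http://example.com/old-page']
```

The output is the list of pages worth investigating, same as the Excel lookup would flag.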
-
Nice solution Kieran!
I use the same method to compare the URL list from the Screaming Frog output with the URL Found column from my keyword ranking tool. Of course it doesn't catch all pages that might be indexed.
The intention is not really to get a complete list, more to flush out pages that need work.
-
I agree, this is not automated, but from what we know so far it looks like a nice and clean option. Thanks.
-
Saw this and tried the following which isn't automated but is one way of doing it.
- First install SEO Quake plugin
- Go to Google
- Turn off Google Instant (http://www.google.com/preferences)
- Go to Advanced Search and set the number of results you want displayed (estimate the number of pages on your site)
- Then run your site:www.example.com search query
- Export this to CSV
- Import to Excel
- Then do a text-to-columns conversion using ";" as the delimiter (this is the CSV delimiter)
- This gives you a formatted list.
- Then import your sitemap.xml into another TAB in Excel
- Run a vlookup between the URL tabs to flag which are on sitemap or vice versa.
Not exactly automated but does the job.
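The sitemap.xml vlookup step could also be done with a short script. A hedged sketch using only the Python standard library; the inline sitemap is a toy example standing in for your real file:

```python
# Sketch of the vlookup step: parse sitemap.xml and flag which sitemap
# URLs were absent from the exported site: results.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract <loc> values from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

def flag_missing(sitemap_xml, serp_urls):
    """Return sitemap URLs that did not show up in the SERP export."""
    serp = set(serp_urls)
    return [u for u in sitemap_urls(sitemap_xml) if u not in serp]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/hidden</loc></url>
</urlset>"""
print(flag_missing(sitemap, ["http://www.example.com/"]))
```

In practice you'd read the sitemap from disk and the SERP list from the SEO Quake CSV export, but the lookup itself is just this set difference.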
-
Curious about this question also. It would be very useful to see a master list of all URLs on our site that are not indexed by Google, so that we can take action to see what aspects of those pages are lacking and what they need to get indexed.
-
I usually just use Xenu's Link Sleuth (if you have thousands of pages) to list out all the URLs you have, and I would then manually check them, but I haven't come across an automated tool yet. If anyone knows of any, I'd love to know as well.
-
Manual is a no-go for large sites. If someone knows of a tool like this, it would be cool to know which one and where to find it. Or... this would make a cool SEOmoz Pro tool.
-
My bad - you are right that it doesn't display the actual URLs. So I guess the best thing you can do is site:examplesite.com and see what comes up.
-
That will tell you the number indexed, but it still doesn't tell you which of those URLs are or are not indexed. I think we all wish it would!
-
I would use Google Webmaster Tools as you can see how many URLs are indexed based on your sitemap. Once you have that, you can compare it to your total list. The same can be done with Bing.
-
Yeah I do it manually now so was looking for something more efficient.
Related Questions
-
Migration + Change of Address Tool used - previous site de-indexed!!
OMG disaster! Recently migrated my site womencycles.com to moonrise.health. Painstakingly went through each URL manually to map out redirects, and notified Google via the Change of Address tool. Bam. My old website has disappeared from Google and my new site has thus lost all its organic (i.e. redirected) traffic. I don't get it. I think I have done everything by the book, but it seems my old site has disappeared and no authority or link juice has been passed to my new site by the 301s, as the new site isn't ranking either. Some examples: https://www.google.com/search?q=women+cycles&oq=women+cycles&aqs=chrome..69i57j69i65j69i61l2j69i60.1834j0j1&sourceid=chrome&ie=UTF-8 'women cycles' previous position 1
Technical SEO | | tikitaka
https://www.google.com/search?q=chaffed+vagina&oq=chaffed+vagina&aqs=chrome..69i57.2370j0j1&sourceid=chrome&ie=UTF-8 - chaffed vagina, previous position 1 https://www.google.com/search?q=how+long+does+it+take+turmeric+to+shrink+fibroids&oq=how+long+does+it+take+turmeric+to+shrink+fibroids&aqs=chrome..69i57.1355j0j1&sourceid=chrome&ie=UTF-8 - how long does it take turmeric to shrink fibroids, previous position 1. Biggest traffic source pages were: https://womencycles.com/blog/top-10-home-remedies-that-claim-to-tighten-vagina-do-they-work/
https://womencycles.com/blog/sore-breasts-after-period-has-finished/
https://womencycles.com/blog/what-is-vaginal-gas-queefing/
https://womencycles.com/blog/tired-during-ovulation/
https://womencycles.com/blog/how-to-get-rid-of-saggy-vag-without-surgery/
https://womencycles.com/blog/vagina-chafing-causes-treatments-to-prevent-it-from-coming-back/
https://womencycles.com/blog/vaginal-dryness-during-pregnancy/ New blog articles are on the new site, with 301 redirects in place, but not ranking. The attached screenshot shows my search traffic for the new site. Site migrated 13 June. Any ideas anyone??! -
Urls Too Long - Should I shorten?
On the crawl of our website we have had a warning that 157 pages have URLs that are too long. When I look at the URLs, they are generally from 2016 or earlier. Should I just leave them as they are, or shorten the URLs and redirect to the new ones? Thanks
Technical SEO | | DaleZon4 -
Can you use a separate URL for an interior product page on a site?
I have a friend that has a health insurance agency site. He wants to add a new page for child health care insurance to his existing site. But the issue is, he bought a new URL, insurancemykidnow.com, and he wants to use it for the new page. Now, I'm not sure I'm right on this, but I don't think that can be done. Am I wrong? Thanks in advance.
Technical SEO | | Coppell0 -
Bulk URL Removal in Webmaster Tools
One of my WordPress sites was hacked (for about 10 hours), and Google picked up 4000+ URLs in the index. The site is fixed, but I'm stuck with all those URLs in the index. All the URLs are of the form: walkerorthodontics.com/index.php?online-payday-cash-loan.html The only bulk removal option I could find was to remove an entire folder, but I can't do that, as it would only leave the homepage and kill off everything else. For some crazy reason, the removal tool doesn't support wildcards, so that obvious solution is right out. So, how do I get rid of 4000 results? And no, waiting around for them to 404 out of the index isn't an option.
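Since the removal tool takes no wildcards, one workaround is to build the explicit removal list yourself: export the indexed URLs (e.g. from a site: query) and filter them against the spam pattern. A rough Python sketch; the regex is a guess at the pattern and would need adjusting to the actual spam URLs:

```python
# Sketch: filter an exported URL list down to the hacked URLs, producing
# one line per removal request.
import re

# Assumed spam pattern based on the example URL in the question.
HACKED = re.compile(r"/index\.php\?[\w-]*(loan|payday|cash)[\w-]*", re.I)

def removal_list(indexed_urls):
    """Keep only URLs matching the spam pattern."""
    return [u for u in indexed_urls if HACKED.search(u)]

urls = [
    "http://walkerorthodontics.com/",
    "http://walkerorthodontics.com/index.php?online-payday-cash-loan.html",
]
print(removal_list(urls))
```

The legitimate pages fall through the filter, so the homepage and real content are untouched; the spam URLs come out as a flat list you can feed into removal requests.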
Technical SEO | | MichaelGregory0 -
Has Google Stopped Listing URLs with Crawl Errors in Webmaster Tools?
I went to Google Webmaster Tools this morning and found that one of my clients had 11 crawl errors. However, Webmaster Tools is not showing which URLs are experiencing the errors, which it used to do. (I checked several other clients that I manage and they list crawl errors without showing the specific URLs.) Does anyone know how I can find out which URLs are experiencing problems? (I checked with Bing Webmaster Tools and the number of errors is different.)
Technical SEO | | TopFloor0 -
How to use rel canonical?
Hi, I am having some questions about this and I think you can help me. Here is an example of my problem with pagination: suppose I have a news article with 2 pages, http://www.espectador.com/noticias/208907/fernando-pereira-encuesta-de-cifra-prendio-una-lucecita-amarilla-en-el-pit-cnt. You can access the first page in different ways: www.espectador.com/1v4_contenido.php?m=&id=250419&ipag=1 http://www.espectador.com/1v4_contenido.php?m=&id=250419 http://www.espectador.com/noticias/250419/alvaro-vega-fa-creo-que-cosmo-fue-usada-por-bqb-para-evitar-una-subasta-a-la-baja-y-asi-quedar-con-las-manos-libres Same meta description, same body, different URLs. Can I use rel canonical in the file 1v4_contenido.php that points to the friendly URL? <link rel="canonical" href="http://www.espectador.com/noticias/250419/alvaro-vega-fa-creo-que-cosmo-fue-usada-por-bqb-para-evitar-una-subasta-a-la-baja-y-asi-quedar-con-las-manos-libres"/> Do I have a loop here? Can the rel canonical go on page 1? Thanks
Technical SEO | | informatica8100 -
Best free tool to check internal broken links
Question says it all, I guess. What would you recommend as the best free tool to check internal broken links?
Technical SEO | | RikkiD225 -
Remove Deleted (but indexed) Pages Through Webmaster Tools?
I run a blog/directory site. Recently, I changed directory software and, as a result, Google is showing 404 Not Found crawl errors for about 750 non-existent pages. Some have suggested that I should implement 301 redirects, but I can't see the wisdom in this, as the pages are obscure, unlikely to appear in search, and they've been deleted. Is the best course to simply enter each 404 error page manually into the Remove Page option in Webmaster Tools? Will entering deleted pages into the Removal area hurt other healthy pages on my site?
Technical SEO | | JSOC0