What tool do you use to check for URLs not indexed?
-
What is your favorite tool for getting a report of URLs that are not cached/indexed in Google & Bing for an entire site? Basically I want a list of URLs not cached in Google and a separate list for Bing.
Thanks,
Mark
-
-
I can work on building this tool if there's enough interest.
-
This post from Distilled mentions that the SeoTools for Excel plugin has an "Indexation Checker":
https://www.distilled.net/blog/seo/awesome-examples-of-how-to-use-seotools-for-excel/
Alas, after downloading and installing it, it appears this feature was removed...
-
Unless I'm missing something, there doesn't seem to be a way to get Google to show more than 100 results on a page. Our site has about 8,000 pages, and I don't relish the idea of manually exporting 80 SERPs.
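For what it's worth, the clicking can at least be scripted: the sketch below (in Python, purely illustrative) builds the 80 query URLs you would otherwise page through by hand, using the classic num and start search parameters. Note these are assumptions about Google's query string, and Google may cap, change, or ignore them at any time.

```python
# Sketch: build one Google search URL per SERP page for a site: query,
# instead of manually exporting 80 pages of results.
from urllib.parse import urlencode

def serp_page_urls(domain, total_results, per_page=100):
    """Return one Google search URL per SERP page for a site: query."""
    urls = []
    for start in range(0, total_results, per_page):
        # "num" and "start" are the classic pagination parameters;
        # treat them as assumptions, not a guaranteed API.
        params = {"q": f"site:{domain}", "num": per_page, "start": start}
        urls.append("https://www.google.com/search?" + urlencode(params))
    return urls

# For a site with roughly 8,000 pages this yields 80 URLs.
pages = serp_page_urls("www.example.com", 8000)
```

Scraping those URLs automatically is against Google's terms of service, so this is best thought of as a way to speed up a manual review.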
-
Annie Cushing from Seer Interactive made an awesome list of all the must-have tools for SEO.
You can get it from her link: http://bit.ly/tools-galore
In the list there is a tool called ScrapeBox, which is great for this. In fact there are many uses for the software; it's also useful for sourcing potential link partners.
-
I would suggest using the Website Auditor from Advanced Web Ranking. It can parse 10,000 pages and it will tell you a lot more than just whether a page is indexed by Google.
-
Hmm... I thought there was a way to pull those SERP URLs into Google Docs using a function of some sort?
-
I don't think you need any tool for this; you can go directly to google.com and search: site:www.YourWebsiteName.com or site:www.YourWebsiteName.com/directory. I think this is the best option to check whether your website has been crawled by Google or not.
-
I do something similar but use Advanced Web Ranking: use site:www.domain.com as your phrase, run it to retrieve 1,000 results, and generate a Top Site Report in Excel to get the indexed list.
Also remember that you can do it on sub-directories (or partial URL paths) as a way to get more than 1,000 pages from the site. In general I run it once with site:www.domain.com, then identify the most frequent sub-directories and add those as additional phrases to the project for a second run, e.g.: site:www.domain.com, site:www.domain.com/dir1, site:www.domain.com/dir2, etc.
Still not definitive, but I think it gives an indication of where the value is.
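The "identify the most frequent sub-directories" step can be sketched in a few lines of Python. The URL list and the top_n cutoff below are illustrative assumptions; in practice you would feed in your crawler's export.

```python
# Sketch: count first path segments across a crawled URL list, then emit
# one site: query per frequent directory to work around the ~1,000-result
# ceiling of a single site: search.
from collections import Counter
from urllib.parse import urlparse

def subdirectory_queries(urls, domain, top_n=2):
    """Return a site: query for the domain plus its most common sub-directories."""
    segments = Counter()
    for u in urls:
        parts = urlparse(u).path.strip("/").split("/")
        if parts[0]:  # skip the bare root URL
            segments[parts[0]] += 1
    queries = [f"site:{domain}"]
    queries += [f"site:{domain}/{seg}" for seg, _ in segments.most_common(top_n)]
    return queries

urls = ["http://www.example.com/dir1/a", "http://www.example.com/dir1/b",
        "http://www.example.com/dir2/c"]
print(subdirectory_queries(urls, "www.example.com"))
# → ['site:www.example.com', 'site:www.example.com/dir1', 'site:www.example.com/dir2']
```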
-
David Kauzlaric has, in my opinion, the best answer. If Google hasn't indexed it and you've checked your Google Webmaster Tools account, then there isn't anything better out there as far as I'm concerned. It's by far the simplest, quickest and easiest way to check a SERP result.
re: David Kauzlaric
We built an internal tool to do it for us, but basically you can do this manually.
Go to Google, type in "site:YOURURLHERE" without the quotes. You can check a certain page, a site, a subdomain, etc. Of course, if you have thousands of URLs this method is not ideal, but it can be done.
Cheers!
-
I concur, Xenu is an extremely valuable tool for me that I use daily. Also, once you get a list of all the URLs on your site, you can compare the two lists in Excel (the two lists being the Xenu page list for your site and the list of pages that have been indexed by Google).
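That Excel comparison step can also be sketched in plain Python: crawled URLs (from Xenu or a similar crawler) minus URLs seen in the site: results gives the candidate "not indexed" list. The sample lists and the normalization rules (trailing slash, case) below are illustrative assumptions; adjust them to how your exports are formatted.

```python
# Sketch: flag crawled URLs that never appeared in the indexed list.
def not_indexed(crawled_urls, indexed_urls):
    """Return crawled URLs absent from the indexed list, after light normalization."""
    # Normalization (strip whitespace, trailing slash, lowercase) is a
    # simplifying assumption; URL paths can be case-sensitive in reality.
    norm = lambda u: u.strip().rstrip("/").lower()
    indexed = {norm(u) for u in indexed_urls}
    return [u for u in crawled_urls if norm(u) not in indexed]

crawled = ["http://example.com/a", "http://example.com/b/", "http://example.com/c"]
indexed = ["http://example.com/a", "http://example.com/B"]
print(not_indexed(crawled, indexed))  # → ['http://example.com/c']
```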
-
Nice solution Kieran!
I use the same method to compare the URL list from the Screaming Frog output with the URL Found column from my keyword ranking tool - of course, it doesn't catch all pages that might be indexed.
The intention is not really to get a complete list, more to flush out pages that need work.
-
I agree - it's not automated, but so far, from what we know, it looks like a nice and clean option. Thanks.
-
Saw this and tried the following, which isn't automated but is one way of doing it.
- First install the SEO Quake plugin
- Go to Google
- Turn off Google Instant (http://www.google.com/preferences)
- Go to Advanced Search and set the number of results you want displayed (estimate the number of pages on your site)
- Then run your site:www.example.com search query
- Export this to CSV
- Import to Excel
- Then do a Text to Columns conversion using ; as the delimiter (this is the CSV delimiter)
- This gives you a formatted list.
- Then import your sitemap.xml into another tab in Excel
- Run a VLOOKUP between the URL tabs to flag which URLs are in the sitemap, or vice versa.
Not exactly automated, but it does the job.
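The sitemap-versus-SERP VLOOKUP above can be sketched with the Python standard library: pull the <loc> values out of sitemap.xml, then flag which ones are missing from the exported SERP list. The inline sitemap and the serp_export set below are stand-in samples; a real run would read both from files.

```python
# Sketch: extract <loc> URLs from a sitemap and diff them against the
# URLs exported from the site: search results.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_locs(xml_text):
    """Return all <loc> URLs from a sitemaps.org-format sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/about</loc></url>
</urlset>"""

serp_export = {"http://www.example.com/"}  # stand-in for the CSV export
missing = [u for u in sitemap_locs(sample) if u not in serp_export]
print(missing)  # → ['http://www.example.com/about']
```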
-
Curious about this question also. It would be very useful to see a master list of all URLs on our site that are not indexed by Google, so that we can see which aspects of those pages are lacking and what they need in order to get indexed.
-
I usually just use Xenu's Link Sleuth (if you have thousands of pages) to list out all the URLs you have, and I would then manually check them, but I haven't come across an automated tool yet. If anyone knows of one, I'd love to know as well.
-
Manual is a no-go for large sites. If someone knows of a tool like this, it would be cool to know which one and where to find it. Or... this would make a cool SEOmoz Pro tool.
-
My bad - you are right that it doesn't display the actual URLs. So I guess the best thing you can do is site:examplesite.com and see what comes up.
-
That will tell you the number indexed, but it still doesn't tell you which of those URLs are or are not indexed. I think we all wish it would!
-
I would use Google Webmaster Tools as you can see how many URLs are indexed based on your sitemap. Once you have that, you can compare it to your total list. The same can be done with Bing.
-
Yeah I do it manually now so was looking for something more efficient.