How can I make a list of all URLs indexed by Google?
-
I started working for this eCommerce site 2 months ago, and my SEO site audit revealed a massive spider trap.
The site should have been 3500-ish pages, but Google has over 30K pages in its index. I'm trying to find an effective way of making a list of all URLs indexed by Google.
Anyone?
(I basically want to build a sitemap with all the indexed spider-trap URLs, then set up 301s on those, then ping Google with the "defective" sitemap so they can see what the site really looks like and remove those URLs, shrinking the site back to around 3500 pages)
-
If you can get a developer to create a list of all the pages Google has crawled within a date range, you can then use this Python script to check whether each page is indexed:
http://searchengineland.com/check-urls-indexed-google-using-python-259773
The script uses the info: search operator to check the URLs.
You will have to install Python, Tor and Polipo for this to work. It is quite technical, so if you aren't a technical person you may need help.
Depending on how many URLs you have and how long you wait between checking each URL, it can take a few hours.
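For reference, the core of such a check is a loop like the sketch below. The helper names are hypothetical stand-ins for what the linked script does, and the Tor/Polipo proxying that keeps Google from blocking the queries is omitted here:

```python
import time
import urllib.parse


def build_info_query(url):
    """Build the Google search URL for an info: query on the given page."""
    return "https://www.google.com/search?q=" + urllib.parse.quote("info:" + url)


def is_indexed(html, url):
    """Crude check: the result page mentions the URL when Google has it indexed."""
    return url in html


def check_urls(urls, fetch, delay=5):
    """Check each URL via `fetch`, waiting `delay` seconds between requests."""
    results = {}
    for url in urls:
        html = fetch(build_info_query(url))
        results[url] = is_indexed(html, url)
        time.sleep(delay)  # pause between queries so Google doesn't block you
    return results
```

The `fetch` parameter is whatever HTTP-through-Tor function you wire in; the delay is the main reason a long URL list takes hours.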
-
Thanks for your input, guys! I've almost landed on the following approach:
- Use this http://www.chrisains.com/seo-tools/extract-urls-from-web-serps/ to collect a number (3-600) of URLs based on the various problem-URL footprints.
- Make XML "problem sitemaps" based on above URLs
- Implement 301s
- Ping the search engines with the XML "problem sitemaps", so that these may discover changes and see what the site really looks like (ideally reducing the # of indexed pages by about 85%)
- Track SE traffic as well as the indexed count for each URL footprint once a week for 6-8 weeks and follow progress
- If progress is not satisfactory, then go the URL Profiler route.
Any thoughts before I go ahead?
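One note on the plan above: the "problem sitemaps" in step 2 are easy to generate from the collected URL lists with a few lines of script. A minimal sketch, where the file name and example trap URLs are placeholders:

```python
from xml.sax.saxutils import escape


def build_sitemap(urls):
    """Build a sitemap XML document from a list of URLs."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url in urls:
        # Escape &, <, > so query-string URLs stay valid XML
        lines.append("  <url><loc>%s</loc></url>" % escape(url))
    lines.append("</urlset>")
    return "\n".join(lines)


# Hypothetical spider-trap URLs for one problem footprint
problem_urls = [
    "http://www.example.com/category?session=123",
    "http://www.example.com/category?session=456",
]
with open("problem-sitemap.xml", "w") as f:
    f.write(build_sitemap(problem_urls))
```

Submitting the resulting file in Search Console then gives Google a direct list of the 301'd URLs to recrawl.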
-
URL Profiler will do this, as will the other recommended scraper sites.
-
URL Profiler might be worth checking out:
It does require that you use a proxy, since Google does not like you scraping their search results.
-
I'm sorry to confirm that Google does not want everyone to know what it has in its index. We as SEOs complain about that.
It's hard to believe that you couldn't get all your pages with a scraper (because it just runs the searches and collects the SERPs).
-
I tried this one and a few others: http://www.chrisains.com/seo-tools/extract-urls-from-web-serps/. It gave me about 500-1000 URLs at a time, but involved a lot of cutting and pasting back and forth.
I imagine there must be a much easier way of doing this...
-
Well, there are some scrapers that can do that job.
To do it properly you will need proxies and a scraper.
My recommendation is GScraper or ScrapeBox and a list of at least 10 proxies. Then just run a scrape with "site:mydomain.com" and see what you get.
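ScrapeBox and GScraper wrap this in a GUI, but the underlying loop looks roughly like the sketch below. The proxy list is a placeholder, the `fetch` function is whatever HTTP client you plug in, and Google actively blocks automated queries, which is exactly why the proxies and delays are needed:

```python
import itertools
import time
import urllib.parse

# Placeholder proxy list; in practice you would use 10+ paid proxies
PROXIES = ["http://proxy1:8080", "http://proxy2:8080"]


def serp_url(domain, start):
    """Build the Google results URL for one page of a site: query."""
    q = urllib.parse.quote("site:" + domain)
    return "https://www.google.com/search?q=%s&num=100&start=%d" % (q, start)


def scrape_site_operator(domain, fetch, pages=10, delay=10):
    """Fetch successive SERP pages for site:domain, rotating through proxies."""
    proxy_cycle = itertools.cycle(PROXIES)
    html_pages = []
    for page in range(pages):
        proxy = next(proxy_cycle)
        html_pages.append(fetch(serp_url(domain, page * 100), proxy))
        time.sleep(delay)  # aggressive scraping gets proxies banned quickly
    return html_pages
```

You would then extract the result URLs from each SERP page; the GUI tools do that parsing and deduplication for you.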
(Before buying proxies or any scraper, check whether the free tools get you what you want.)
-
I used Screaming Frog to discover the spider trap (and more), but as far as I know, I cannot use Screaming Frog to import all the URLs that Google actually has in its index (or can I?).
A list of the URLs actually in Google's index is what I'm after.
-
Hi Sverre,
Have you tried Screaming Frog SEO Spider? Here's a link to it: https://www.screamingfrog.co.uk/seo-spider/
It's really helpful for crawling all the pages you have accessible to spiders. You might need the premium version to crawl more than 500 pages.
Also, have you checked for the common duplicate-content issues? Here's a Moz tutorial: https://moz.com/learn/seo/duplicate-content
Hope it helps.
GR.