
The Moz Q&A Forum

Mass Removal Request from Google Index

Intermediate & Advanced SEO

• ioannisa · Apr 12, 2016, 7:00 PM

Hi,

I am trying to clean up a news website. When the site was first built, the people who set it up imported all kinds of articles they had as a newspaper, including test pages, internal communications, and drafts. The site has a lot of junk, but all of it came from that initial import, i.e. it is dated before 1st June 2012. So, by removing all mixed content prior to that date, we are left with pure articles from 1st June 2012 onwards!

Therefore:

1. My dynamic sitemap now contains only articles with a release date between 1st June 2012 and now.
2. Any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag, instead of the actual content of the article (a sketch of this gating logic follows below).
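
For illustration, here is a minimal sketch of that gating logic as a Flask-style handler. Everything here is hypothetical: the real site is ASP.NET, and the Article model and get_article() lookup are stand-ins for its database.

```python
from dataclasses import dataclass
from datetime import date

from flask import Flask, render_template, request

app = Flask(__name__)
CUTOFF = date(2012, 6, 1)  # junk from the initial import predates this

@dataclass
class Article:
    doc_id: int
    release_date: date
    body: str

ARTICLES = {}  # doc_id -> Article; stands in for the CMS database

def get_article(doc_id):
    # Hypothetical stand-in for the site's real database lookup.
    return ARTICLES.get(doc_id)

@app.route("/")
def article_page():
    doc_id = request.args.get("DocID", type=int)
    article = get_article(doc_id)
    if article is None or article.release_date < CUTOFF:
        # Custom error page whose HTML carries
        # <meta name="robots" content="noindex">
        return render_template("not_found.html"), 404
    return render_template("article.html", article=article)
```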

The question is how I can remove all this junk from the Google index as fast as possible, given that it is no longer on the site but still appears in Google results.

I know that for individual URLs I need to request removal from this link:
https://www.google.com/webmasters/tools/removals

The problem is doing this in bulk, as there are tens of thousands of URLs I want to remove. Should I put the articles back in the sitemap so the search engines crawl it and see all the 404s? I believe this is very wrong: as far as I know, it will cause problems, because the search engines will try to access non-existent content that the sitemap declares as existing, and will report errors in Webmaster Tools.

Should I submit a DELETED ITEMS sitemap using the <expires> tag? I think this is for custom search engines only, and not for the generic Google search engine:
https://developers.google.com/custom-search/docs/indexing#on-demand-indexing

The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but instead the ugly GET params, and a folder-based pattern is impossible since all articles (removed junk and actual articles alike) are of the form:
http://www.example.com/?docid=123456

So, how can I bulk remove all the junk from the Google index... relatively fast?

• KristinaKledzik (replying to @ioannisa) · Apr 15, 2016, 5:45 PM

Hi Ioannis,

What about the first suggestion? Can you create a page linking to all of the pages that you'd like to remove, then have Google crawl that page?

Best,

Kristina

• ioannisa · Apr 15, 2016, 4:05 AM (edited 4:18 AM)

Thank you, Kristina,

I know about the URL structure; I have spent the past few months trying to clean up this site, whose creation I was not involved in. It has several more SEO problems, some fixed and some not yet; I've found more than 50 so far, most of them critical.

The junk pages do not exist in the sitemap I built, and because I wrote the sitemap myself, I can easily generate another one containing the articles I removed (just by reversing part of the SELECT query behind the sitemap):

http://www.neakriti.gr/webservices/sitemap-index.aspx

So far I have implemented the last of your suggestions; here are some examples (a quick status-check sketch follows below):

This is a valid article page (status code 200):
http://www.neakriti.gr/?page=newsdetail&DocID=1314221

This is a non-existent article page that never existed in the first place (status code 404):
http://www.neakriti.gr/?page=newsdetail&DocID=12345678

This is one of the articles that I removed from the sitemap and the site (status code 410):
http://www.neakriti.gr/?page=newsdetail&DocID=894052
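
A quick way to spot-check status codes like these in bulk is a small script; a sketch using the third-party requests library, with the three DocIDs above as examples:

```python
import requests

BASE = "http://www.neakriti.gr/?page=newsdetail&DocID={}"
doc_ids = [1314221, 12345678, 894052]  # valid, never existed, removed

for doc_id in doc_ids:
    resp = requests.get(BASE.format(doc_id), allow_redirects=False, timeout=10)
    print(doc_id, resp.status_code)  # expect 200, 404, 410 respectively
```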

Also, I would like you to take a look at another question about the same site; it relates to this question about garbage articles too:
https://moz.com/community/q/multiple-instances-of-the-same-article

Thank you so much!

• KristinaKledzik · Apr 14, 2016, 2:32 PM

Hi Ioannis,

You're in quite a bind here without a good URL structure! I don't think there's any one perfect option, but I think all of these will work:

• Create a page on your site that links to every article you would like to delete, keeping those articles 404/410ed. Then use the Fetch as Google tool and ask Google to crawl the page plus all of its links. This will get Google to quickly crawl all of those pages, see that they're gone, and remove them from its index. Keep in mind that if you just use a 404, Google may keep the page around for a bit to make sure you didn't just mess up. As Eric said, a 410 is more of a sure thing.
• Create an XML sitemap of those deleted articles and have Google crawl it (a sketch of such a generator follows this list). Yes, this will create errors in GSC, but errors in GSC mean that Google is concerned you've made a mistake, not that it's necessarily penalizing you. Just mark those errors as fixed and take the sitemap down once Google has crawled it.
• 410 these pages, remove all internal links to them (use a tool like Screaming Frog to make sure you didn't miss any links!), and remove them from your sitemap. That'll distance you from that old, crappy content, and Google will slowly realize that it's been removed as it checks in on its old pages. This is probably the least satisfying option, but it's an option that'll get the job done eventually.
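
A sketch of the second option: generating a temporary XML sitemap of the removed URLs. The list of DocIDs would come from the reversed SELECT query Ioannis mentioned; the rest is standard sitemap XML (note the & in these URLs must be escaped):

```python
from xml.sax.saxutils import escape

BASE = "http://www.neakriti.gr/?page=newsdetail&DocID={}"

def deleted_urls_sitemap(doc_ids):
    urls = "\n".join(
        "  <url><loc>{}</loc></url>".format(escape(BASE.format(d)))
        for d in doc_ids
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + urls
        + "\n</urlset>"
    )

# doc_ids would come from the reversed sitemap query; 894052 is the
# removed-article example from earlier in the thread.
with open("deleted-articles-sitemap.xml", "w", encoding="utf-8") as f:
    f.write(deleted_urls_sitemap([894052]))
```

Submit the file in Search Console, let Google recrawl the 410s, then take it down.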

Hope this helps! Let us know what you decide to do.

Best,

Kristina

• ioannisa · Apr 13, 2016, 2:13 PM

Thank you,

So you suggest that, based on my date query, instead of blindly blocking everything before that date with a 404, I should block it with a 410, while anything that never existed in the first place keeps returning a 404.

Another question: for the blocked articles that return 410, should I put their URLs back in the XML sitemap or not?

• GlobeRunner · Apr 13, 2016, 12:29 PM

Any article that has a release date prior to 1st June 2012 should return a custom 410 page with a "noindex" meta tag, instead of the actual content of the article.

The error returned should be a "410 Gone" and not just a 404. That way Google will treat it differently and may remove it from the index faster than it would with a plain 404. You can use the Google removal tool as well. And don't forget the robots.txt file; there may be directories with content that you need to disallow.

But overall, using a 410 is going to be better and most likely faster.
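
In handler terms, this changes one branch of the earlier gating sketch: DocIDs that never existed keep the 404, while removed-but-once-real articles get a 410. Again a hypothetical Flask-style sketch, reusing get_article() and CUTOFF from before:

```python
@app.route("/")
def article_page():
    doc_id = request.args.get("DocID", type=int)
    article = get_article(doc_id)  # same hypothetical lookup as before
    if article is None:
        # Never existed in the first place: a plain 404.
        return render_template("not_found.html"), 404
    if article.release_date < CUTOFF:
        # Removed junk that once existed: "410 Gone".
        return render_template("gone.html"), 410
    return render_template("article.html", article=article)
```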

• ioannisa · Apr 13, 2016, 5:44 AM

Thank you for your response.

I definitely cannot use noindex because, as I explained, I changed all articles prior to the minimum date to return a 404, so that content is no longer visible on the web and cannot carry a noindex directive. Unless you mean on my custom 404 page, where, yes, it is there.

Also, there is no folder to target in robots.txt, since the URLs are in the ugly form of GET params like DocID=12345. Given that, there are thousands of DocIDs that are junk and removed, and thousands that are actual articles.

So I assumed that creating a "deleted articles" sitemap, where each <url> entry contains an <expires>2016-06-01</expires> tag, seemed the most logical thing, but I am afraid it is for "custom search engines" rather than for normal de-index requests, as described below:

https://developers.google.com/custom-search/docs/indexing#on-demand-indexing

• Martijn_Scheijbeler · Apr 13, 2016, 5:02 AM

Sitemaps are definitely not the way to go for this; you can't just put an <expires> tag in there and have pages go away. The best option is the meta robots tag, set to either noindex, nofollow or noindex, follow. With this approach, and hopefully a relatively high crawl rate, you can make sure the data from these pages is removed from the Google index as soon as possible.

If you still want these pages indexed but just not crawled anymore (which, based on your explanation, I don't think you want), then go with robots.txt and exclude the pages you'd like there.
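
One practical wrinkle, since the junk pages here already return 404/410 rather than a normal article page: Google documents the X-Robots-Tag HTTP response header as equivalent to the meta robots tag, and a header can be sent on any response. A sketch, continuing the earlier hypothetical Flask handler:

```python
from flask import make_response, render_template

@app.route("/gone")
def gone_with_header():
    # Send the noindex directive as a response header; unlike a
    # <meta name="robots"> tag, this does not depend on the HTML body.
    resp = make_response(render_template("gone.html"), 410)
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp
```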
