The Moz Q&A Forum



Robots.txt: How to block a specific file type in several subdirectories?

Technical SEO
  • LabeliumUSA | Oct 31, 2017, 1:07 AM

    Hello everyone!

    I need help setting up a robots.txt file.

    I'm trying to block all PDF files in particular directories, so I'm using this directive. In the example below, the line blocks all .gif files across the entire site.

    Block files of a specific file type (for example, .gif) | Disallow: /*.gif$

    Two questions:

    • Can I use this directive to target one particular directory in which I want to block PDF files? Will this line be recognized by Googlebot?

    Disallow: /fileadmin/xxxxxxx/xxx/xxxxxxx/*.pdf$

    • Then I realized that I would have to write as many lines as there are directories in which I want to block PDF files.

    Let's say I want to block PDF files in all three of these directories:

    /fileadmin/directory1

    /fileadmin/directory1/sub1

    /fileadmin/directory1/sub1/pdf

    Is there a pattern-matching rule I could use to block access to PDF files in all subdirectories, instead of writing the line above three times, once per subdirectory? For example:

    Disallow: /fileadmin/directory1*/

    Many thanks in advance for any insight you may have.

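    As a point of reference for the pattern question above, here is a minimal robots.txt sketch. The directory names are the placeholders from the question, and the * and $ wildcards shown are the extensions honoured by Googlebot and Bing, not part of the original robots.txt standard, so smaller crawlers may ignore them:

    User-agent: *
    # Block every .pdf URL under /fileadmin/directory1/, including all subdirectories,
    # because * matches any sequence of characters, slashes included, and $ anchors the end of the URL.
    Disallow: /fileadmin/directory1/*.pdf$

    # Equivalent explicit rules, one per subdirectory, if you prefer to spell them out:
    # Disallow: /fileadmin/directory1/sub1/*.pdf$
    # Disallow: /fileadmin/directory1/sub1/pdf/*.pdf$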
    • LabeliumUSA @Rajesh.Prajapati | Nov 2, 2017, 8:20 PM

      Hey, thank you for your answer. Really appreciate it.

      • Rajesh.Prajapati | Oct 31, 2017, 10:04 AM (edited Oct 31, 2017, 10:10 AM)

        Use this code:
        Disallow: /*f$
        If you want to block only one folder, then use this:
        Disallow: /folder1/*f$
        This rule will block both .pdf and .gif files, since both extensions end in "f".

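        To sanity-check what a given Disallow pattern would actually block, a short self-contained Python sketch can translate the Google-style wildcards (* for any sequence of characters, a trailing $ for end of URL) into a regular expression and test sample paths against it. The helper name and the test paths below are illustrative, not taken from the thread:

        import re

        def disallow_to_regex(pattern: str) -> re.Pattern:
            """Convert a Google-style Disallow value into an anchored regular expression:
            '*' matches any sequence of characters, a trailing '$' anchors the end of the URL."""
            anchored = pattern.endswith("$")
            if anchored:
                pattern = pattern[:-1]
            # Escape regex metacharacters, then restore '*' as '.*'
            body = re.escape(pattern).replace(r"\*", ".*")
            return re.compile("^" + body + ("$" if anchored else ""))

        rule = disallow_to_regex("/fileadmin/directory1/*.pdf$")
        for path in [
            "/fileadmin/directory1/report.pdf",       # blocked
            "/fileadmin/directory1/sub1/report.pdf",  # blocked: '*' spans '/' as well
            "/fileadmin/directory1/report.pdf?x=1",   # allowed: '$' anchors the end of the URL
            "/fileadmin/other/report.pdf",            # allowed: different directory
        ]:
            print(path, "->", "blocked" if rule.match(path) else "allowed")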


        Related Questions

        • ramb

          I have two robots.txt files for the www and non-www versions. Will that be a problem?

          There are two robots.txt files: one for the www version and another for the non-www version, though I have moved to the non-www version.

          Technical SEO | May 27, 2019, 10:29 PM | ramb
        • rijwielcashencarry040

          Good robots.txt for Magento

          Dear Community, I am trying to improve the SEO ratings for my website www.rijwielcashencarry.nl (Magento). My next step will be implementing a robots.txt file to exclude some pages from crawling. Does anybody have a good Magento robots.txt for me? And what exactly do I need to copy? Thanks everybody! Greetings, Bob

          Technical SEO | Jan 13, 2017, 3:29 PM | rijwielcashencarry040
        • zeepartner

          Robots.txt on HTTP vs. HTTPS

          We recently changed our domain from HTTP to HTTPS. When a user enters any URL on HTTP, there is a global 301 redirect to the same page on HTTPS. I cannot find instructions about what to do with robots.txt. Now that HTTPS is the canonical version, should I block the HTTP version with robots.txt? Strangely, I cannot find a single resource about this...

          Technical SEO | Feb 5, 2015, 3:18 PM | zeepartner
        • MickEdwards

          Adding multi-language sitemaps to robots.txt

          I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core code, so there are no sub-directories per language. The developer has created sitemaps, which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as /sitemap/uk/sitemap.xml and /sitemap/de/sitemap.xml. I want to add the sitemaps to robots.txt but can't figure out how to do it. Also, should they have placed the sitemaps in a single location, with the filename identifying each language (/sitemap/uk-sitemap.xml, /sitemap/de-sitemap.xml)? What is the cleanest way of handling these sitemaps, and can/should I reference them in robots.txt?

          Technical SEO | Sep 17, 2013, 11:23 AM | MickEdwards
        • Dan-Lawrence

          Creating a CSV file for uploading a 301 redirect URL map

          Hi, if I'm bulk uploading 301 redirects, what's needed to create the CSV file? Is it just a case of creating an Excel spreadsheet with the old URLs in column A and the new URLs in column B, then converting to CSV and uploading? Or do I need to include other details or parameters? Cheers, Dan

          Technical SEO | Apr 30, 2013, 11:24 AM | Dan-Lawrence
        • lzhao

          Temporarily suspend Googlebot without blocking users

          We'll soon be launching a redesign, on a new platform, migrating millions of pages to new URLs. How can I tell Google (and other crawlers) to temporarily (a day or two) ignore my site?  We're hoping to buy ourselves a small bit of time to verify redirects and live functionality before allowing Google to crawl and index the new architecture. GWT's recommendation is to 503 all pages - including robots.txt, but that also makes the site invisible to real site visitors, resulting in significant business loss.  Bad answer. I've heard some recommendations to disallow all user agents in robots.txt.   Any answer that puts the millions of pages we already have indexed at risk is also a bad answer. Thanks

          Technical SEO | Aug 21, 2012, 11:52 AM | lzhao
        • aethereal

          Blocking URLs with specific parameters from Googlebot

          Hi, I've discovered that Googlebots are voting on products listed on our website and, as a result, are creating negative ratings by placing votes from 1 to 5 for every product. The voting function is handled using JavaScript, as shown below, and the script prevents multiple votes, so most products end up with a vote of 1, which translates to "poor". How do I go about using robots.txt to block a URL with specific parameters only? I'm worried that I might end up blocking the whole product listing, which would result in de-listing from Google and the loss of many highly ranked pages. DON'T want to block: http://www.mysite.com/product.php?productid=1234 WANT to block: http://www.mysite.com/product.php?mode=vote&productid=1234&vote=2 JavaScript button code: onclick="javascript: document.voteform.submit();" Thanks in advance for any advice given. Regards, Asim

          Technical SEO | Oct 3, 2011, 11:57 PM | aethereal
        • nicole.healthline

          Is blocking RSS feeds with robots.txt necessary?

          Is it necessary to block an RSS feed with robots.txt? It seems they are automatically not indexed (http://googlewebmastercentral.blogspot.com/2007/12/taking-feeds-out-of-our-web-search.html). And Google says here that it's important not to block RSS feeds (http://googlewebmastercentral.blogspot.com/2009/10/using-rssatom-feeds-to-discover-new.html). I'm just checking!

          Technical SEO | Jul 9, 2011, 12:15 PM | nicole.healthline
