
The Moz Q&A Forum


Moz Q&A is closed.

After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.

Block an entire subdomain with robots.txt?

Intermediate & Advanced SEO · 16 posts · 111.6k views
• kylesuss · Aug 18, 2011, 2:00 PM

  Is it possible to block an entire subdomain with robots.txt?

  I write for a blog that has its root domain as well as a subdomain pointing to the exact same IP. Getting rid of the subdomain is not an option, so I'd like to explore other options to avoid duplicate content. Any ideas?
• kylesuss @kylesuss · Sep 2, 2011, 12:28 PM

  Awesome! That did the trick -- thanks for your help. The site is no longer listed 🙂
• sprynewmedia @kylesuss · Aug 27, 2011, 3:11 PM

  Fact is, the robots.txt file alone will never work (the link has a good explanation why; short form: all it does is stop the bots from crawling again, it won't remove pages already in the index).

  Best to request removal, then wait a few days.
• kylesuss @kylesuss · Aug 27, 2011, 2:04 PM

  Yeah. As of yet, the site has not been de-indexed. We placed the conditional rule in htaccess and are getting different robots.txt files for the domain and subdomain, so that works. But I've never done this before, so I don't know how long it's supposed to take.

  I'll try to verify via Webmaster Tools to speed up the process. Thanks
• sprynewmedia @kylesuss · Aug 27, 2011, 2:00 PM

  You should do a removal request in Google Webmaster Tools. You have to verify the sub-domain first, then request the removal.

  See this post on why the robots file alone won't work:

  http://www.seomoz.org/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts
• kylesuss @kylesuss · Aug 22, 2011, 12:33 PM

  Awesome. We used your second idea and so far it looks like it is working exactly how we want. Thanks for the idea.

  Will report back to confirm that the subdomain has been de-indexed.
• sprynewmedia @kylesuss · Aug 18, 2011, 8:31 PM

  Option 1 could come with a small performance hit if you have a lot of .txt files being served on the server.

  There shouldn't be any negative side effects to option 2 as long as the rewrite is clean (i.e. not accidentally a redirect) and the content of the two files is robots-compliant.

  Good luck
• kylesuss @sprynewmedia · Aug 18, 2011, 8:14 PM

  Thanks for the suggestion. I'll definitely have to do a bit more research into this one to make sure that it doesn't have any negative side effects before implementation.
• kylesuss @john4math · Aug 18, 2011, 8:12 PM

  We have a plugin right now that places canonical tags, but unfortunately, the canonical for the subdomain points to the subdomain. I'll look around to see if I can tweak the settings.
• sprynewmedia · Aug 18, 2011, 3:44 PM

  Sounds like (from other discussions) you may be stuck requiring a dynamic robots.txt file which detects which domain the bot is on and changes the content accordingly. This means the server has to run all .txt files as (I presume) PHP.

  Or, you could conditionally rewrite the /robots.txt URL to a different file according to the sub-domain:

  RewriteEngine on
  RewriteCond %{HTTP_HOST} ^subdomain\.website\.com$ [NC]
  RewriteRule ^robots\.txt$ robots-subdomain.txt [L]

  Then add:

  User-agent: *
  Disallow: /

  to the robots-subdomain.txt file.

  (untested)
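The first option the post mentions (a dynamic robots.txt that varies by host) could be sketched roughly like this. This is a hypothetical Python/WSGI stand-in for the PHP approach described, with placeholder hostnames; the server would still need to route /robots.txt to this handler instead of a static file.

```python
# Rough sketch of a dynamic robots.txt: serve a blocking body on the
# subdomain and a permissive one everywhere else, based on the Host header.
# Hostnames and the WSGI setup are illustrative assumptions.

BLOCKED_HOSTS = {"subdomain.website.com"}

def robots_app(environ, start_response):
    """WSGI app meant to be mapped to the /robots.txt URL."""
    host = environ.get("HTTP_HOST", "").split(":")[0].lower()
    if host in BLOCKED_HOSTS:
        body = b"User-agent: *\nDisallow: /\n"   # block the whole subdomain
    else:
        body = b"User-agent: *\nDisallow:\n"     # allow everything elsewhere
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```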

• john4math · Aug 18, 2011, 3:17 PM

  Placing canonical tags isn't an option? Detect that the page is being viewed through the subdomain, and if so, write the canonical tag on the page back to the root domain.

  Or, just place a canonical tag on every page pointing back to the root domain (so the subdomain and root domain pages would both have them). Apparently, it's OK to have a canonical tag on a page pointing to itself. I haven't tried this, but if Matt Cutts says it's OK...
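The host-detection idea described in this post could be sketched like this. A hypothetical Python example with placeholder hostnames and helper names, not anything from the thread itself:

```python
# Sketch of the canonical-tag idea: rewrite any subdomain URL to its
# root-domain equivalent, then emit the matching canonical <link> tag.
from urllib.parse import urlsplit, urlunsplit

SUBDOMAIN = "subdomain.website.com"    # placeholder hostnames
ROOT_DOMAIN = "www.website.com"

def canonical_url(url):
    """Map a subdomain URL to the root domain; leave other URLs alone."""
    parts = urlsplit(url)
    if parts.netloc.lower() == SUBDOMAIN:
        parts = parts._replace(netloc=ROOT_DOMAIN)
    return urlunsplit(parts)

def canonical_tag(url):
    """The <link> element that would go in the page's <head>."""
    return '<link rel="canonical" href="%s">' % canonical_url(url)
```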

• kylesuss @AdoptionHelp · Aug 18, 2011, 3:06 PM

  Hey Ryan,

  I wasn't directly involved with the decision to create the subdomain, but I'm told it was necessary in order to bypass certain elements that were affecting the root domain.

  Nevertheless, it is a blog, and the users now need to log in to the subdomain in order to access the WordPress backend to bypass those elements. Traffic for the site still goes to the root domain.
• AdoptionHelp · Aug 18, 2011, 2:34 PM

  They both point to the same location on the server? So there's not a different folder for the subdomain?

  If that's the case, then I suggest adding a rule to your htaccess file to 301 the subdomain back to the main domain, in exactly the same way people redirect from non-www to www or vice versa. However, you should ask why the server is configured to have a duplicate subdomain; you might just edit your Apache settings to get rid of that subdomain (usually done through a cPanel interface).

  Here is what your htaccess might look like:

  <IfModule mod_rewrite.c>
  RewriteEngine on
  # Redirect non-www to www
  RewriteCond %{HTTP_HOST} !^www\.mydomain\.org [NC]
  RewriteRule ^(.*)$ http://www.mydomain.org/$1 [R=301,L]
  </IfModule>
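Adapting that non-www-to-www pattern to send the subdomain itself back to the main domain might look like this (hostnames are placeholders and, like the snippet in the post, this is untested):

```apache
<IfModule mod_rewrite.c>
RewriteEngine on
# 301 every request on the subdomain to the same path on the main domain
RewriteCond %{HTTP_HOST} ^subdomain\.mydomain\.org$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.org/$1 [R=301,L]
</IfModule>
```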

• AndyKuiper · Aug 18, 2011, 2:24 PM

  Not to me LOL 🙂 I think you'll need someone with a bit more expertise in this area than I to assist in this case. Kyle, I'm sorry I couldn't offer more assistance... but I don't want to tell you something if I'm not 100% sure. I suspect one of the many bright SEOmozers will quickly come to the rescue on this one.

  Andy 🙂
• kylesuss @AndyKuiper · Aug 18, 2011, 2:15 PM

  Hey Andy,

  Herein lies the problem. Since the domain and subdomain point to the exact same place, they both utilize the same robots.txt file.

  Does that make sense?
• AndyKuiper · Aug 18, 2011, 2:14 PM

  Hi Kyle 🙂 Yes, you can block an entire subdomain via robots.txt. However, you'll need to create a separate robots.txt file, place it in the root of the subdomain, and then add the directives telling the bots to stay away from the entire subdomain's content:

  User-agent: *
  Disallow: /

  Hope this helps 🙂




                                  © 2021 - 2025 SEOMoz, Inc., a Ziff Davis company. All rights reserved. Moz is a registered trademark of SEOMoz, Inc.