Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Do I need a separate robots.txt file for my shop subdomain?
-
Hello Mozzers!
Apologies if this question has been asked before, but I couldn't find an answer so here goes...
Currently I have one robots.txt file hosted at https://www.mysitename.org.uk/robots.txt
We host our shop on a separate subdomain https://shop.mysitename.org.uk
Do I need a separate robots.txt file for my subdomain? (Some Google searches are telling me yes and some no, and I've become awfully confused!)
-
Thank you. I want to disallow specific URLs on the subdomain and add the shop sitemap to the robots.txt file. So I'll go ahead and create another!
-
You'd be fine without one. You only need one if you want to manage that subdomain: add specific XML sitemap links in robots.txt, or cut off access to specific folders on that subdomain.
If you don't need any of that, just move forward without one.
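For illustration, here's a minimal sketch of what a shop robots.txt might contain (the disallowed paths and sitemap URL below are placeholders, not the poster's real ones). Note that crawlers request robots.txt separately for each hostname, so this file would live at https://shop.mysitename.org.uk/robots.txt:

User-agent: *
# hypothetical paths to keep crawlers out of
Disallow: /checkout/
Disallow: /basket/

Sitemap: https://shop.mysitename.org.uk/sitemap.xml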
-
Currently we just have: User-agent: *
I'm in the process of optimising.
-
It depends on what's currently in your robots.txt. Usually it would be useful to have a separate one for your subdomain.
-
Yes, I would have a separate robots.txt file.
Related Questions
-
When should a variant be a variant and when should it be a separate product from an SEO POV?
Hi all, We are looking at changing our current e-commerce store to a new platform, and in doing so are thinking of making some changes to how we list products in sub-categories. We have seen related questions about splitting a single product into multiple products to rank for different terms, but we are wondering about combining multiple products into a single product page. The examples we have seen have been about fashion items with variants of colour and size. However, the products we sell have variances that change the appearance, dimensions and technical specification, so we would like to ask the Moz community whether combining products with these variances would still be deemed good practice.

We sell wood burning stoves, and a good example of a product we are considering combining is the Scan 85 stove, which is available in eight different configurations: 85-1, 85-2, 85-3 etc. Scan themselves refer to each version as a separate product, and they are bought, stocked and sold as separate products. Wood burning stoves like this typically have a firebox in the centre and then design options that can change the top, side, base, door, colour and fuel. In this example, the firebox is the Scan 85 and the variation is the last number, each of which corresponds to a different design option changing both the appearance and dimensions (see attached image).

We have them listed as eight different products on our current site, one for each version, primarily because each option has its own name (albeit a 1-digit difference), and when we created the pages we thought that more pages would present us with more ranking opportunity. However, we have since learnt that these eight pages are all so similar that it is difficult to write unique content about each product (between the 85-1 and the 85-2, the only difference is the black trim on the 85-1 and the silver trim on the 85-2), especially as anything about the firebox itself - how well the fire burns, how controllable it is etc. - will be the same for all versions. Likewise, earning backlinks to eight separate pages is also very difficult.

Exploring this led us to the question: when is a variant a variant, and when is it a separate product? Are there hard and fast rules for what defines variants and products? Or does it simply vary from industry to industry and product to product, and if so, should we be looking at it from a UX or an SEO POV when making that decision?

Our hope is that if we combine these eight products into a single high-quality page, it will present us with a greater ranking opportunity for that one page than for eight individual pages. We also hope that doing so will allow us to create a more intuitive UX on a single page, with a unique description, more reviews focused on one page and an explanation of the options available, all of which should lead to more conversions. Finally, by creating a better UX and a unique, detailed description, we hope there is a higher chance of earning product-level backlinks than we have with eight lower-quality pages.

One of the issues in creating a single product page for all the variants is the sub-category/results pages, as we would be removing eight simple products and replacing them with one complex product. We have questions over how this would work at a filter/facet level: when you apply a filter, there is an expectation that the image shown will match the criteria, so if we filter for stoves with a silver trim, for example, there is an expectation to only see stoves with a silver trim in the results. Separate product pages give you separate listings, which makes it easier to bring back only the models matching the criteria. With a single page this is more complex, as you need a default image for non-filtered results and then the ability to assign an image to lots of different attributes so that the correct image is always shown for the criteria selected. All of which we have been assured is do-able, but it adds an extra level of complexity from an admin side.

The alternative would be to create eight simple/child products and link them to one configurable/parent product. We could then list the simple products on the results pages and have them all linking back to the main configurable product, which could load with the options of the simple product that was selected. From an SEO POV this brings in some more work, redirecting each page to the parent, but ultimately it could provide a better UX and might be the better solution. Has anyone got any experience of doing either of these options before?

Both options above will affect the number of products we have available, so does the number of products in a sub-category affect the ability of that category page to rank? We currently have around 500 products in our wood burning stoves category, with perhaps an additional 300 to add. If we go down the route of combining into a single product page, this will reduce the number of products by around a third. If we keep all the simple/child products, then this will stay around the same.

So, have we missed something obvious? Is there a glaring issue that we have overlooked, from an SEO point of view as well as from the customer experience? We would appreciate your thoughts on this. Thanks, Reece

[Attachment: scan85-1.jpg]
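On the redirect point raised above, a hedged sketch of how the eight variant pages might be consolidated with 301s in an Apache .htaccess file - the URL paths here are invented for illustration, as the real ones depend on the platform:

# 301 each variant page to the combined parent product page
Redirect 301 /stoves/scan-85-1 /stoves/scan-85
Redirect 301 /stoves/scan-85-2 /stoves/scan-85
Redirect 301 /stoves/scan-85-3 /stoves/scan-85
# ...and so on for the remaining variants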
Technical SEO | fireproductsuk
-
Is there any benefit in using a subdomain redirected to a single page?
For example, if we have a domain www.bobshardware.com.au and we set up a subdomain sydneysupplies.bobshardware.com.au and then brisbanescrewdrivers.bobshardware.com.au, and used those in ad campaigns, each subdomain being redirected back to a single page such as bobshardware.com.au/brisbane-screw-drivers etc. Is there a benefit? Cheers
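For reference, one assumed way to set up such a redirect in an Apache virtual host, using the hostnames from the example above (a sketch only - the same effect can also be achieved at the DNS/CDN level or in other servers):

<VirtualHost *:80>
    ServerName brisbanescrewdrivers.bobshardware.com.au
    # send every URL on the campaign subdomain to its single landing page
    RedirectMatch 301 ^/.*$ https://www.bobshardware.com.au/brisbane-screw-drivers
</VirtualHost>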
Technical SEO | techdesign
-
Google indexing despite robots.txt block
Hi
This subdomain has about 4'000 URLs indexed in Google, although it's blocked via robots.txt: https://www.google.com/search?safe=off&q=site%3Awww1.swisscom.ch&oq=site%3Awww1.swisscom.ch
This has been the case for almost a year now, and it does not look like Google tends to respect the blocking in http://www1.swisscom.ch/robots.txt
Any clues why this is or what I could do to resolve it? Thanks!
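A likely explanation: robots.txt blocks crawling, not indexing, so Google can still index blocked URLs that it discovers through links (typically showing them without a snippet). The usual fix, sketched here as an assumption about the setup, is to lift the robots.txt block and serve a noindex header instead, e.g. via Apache's mod_headers:

# tell Google to drop these URLs from the index
# (the robots.txt Disallow must be removed first, otherwise Googlebot
# never fetches the pages and never sees this header)
Header set X-Robots-Tag "noindex"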
Technical SEO | zeepartner
-
Are robots.txt wildcards still valid? If so, what is the proper syntax for setting this up?
I've got several URLs that I need to disallow in my robots.txt file. For example, I've got several documents that I don't want indexed, and filters that are getting flagged as duplicate content. Rather than typing in thousands of URLs, I was hoping that wildcards are still valid.
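For what it's worth, wildcards are still honoured by the major crawlers (Googlebot and Bingbot at least): * matches any sequence of characters, and $ anchors a rule to the end of the URL. A sketch with hypothetical paths:

User-agent: *
# block any URL containing a filter parameter
Disallow: /*?filter=
# block PDFs under /documents/; the $ stops the rule matching longer URLs
Disallow: /documents/*.pdf$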
Technical SEO | mkhGT
-
Best Way To Clean Up Unruly SubDomain?
Hi, I have several subdomains that present no real SEO value, but are being indexed. They don't earn any backlinks either. What's the best way of cleaning them up? I was thinking the following:
1. Verify them all in Webmaster Tools.
2. Remove all URLs from the index via the Removal Tool in WMT.
3. Add a site-wide noindex, follow directive.
Also, to remove the URLs in WMT, you usually have to block the URLs via /robots.txt. If I'd like to keep Google crawling through the subdomains and remove their URLs, is there a way to do so?
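On that last point: the usual approach is a noindex, follow meta tag rather than a robots.txt block, since a blocked page is never crawled and Google never sees the noindex. A minimal sketch of the tag to serve on every subdomain page (step 3 above), leaving robots.txt open so the URLs can drop out of the index naturally:

<meta name="robots" content="noindex, follow">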
Technical SEO | RocketZando
-
Empty Meta Robots Directive - Harmful?
Hi,
We had a coding update, and a side-effect of that was that our meta robots directive was emptied; in other words, it now reads as an empty tag on all of the site. I've since noticed that Google's cache date on all of the pages - at least, the ones I tested - is no later than 17 December '12, the Monday after the directive was emptied en masse.
So, A: does anyone have solid evidence of an empty directive causing problems? Past experience, a Matt Cutts or Fishkin quote, etc.
And then B: it seems fairly well correlated, but does my entire site's homogenous cache date point to this tag removal? Or is it fairly normal to have a particular cache date across a large site (we're a large ecommerce site)? Our site: http://www.zando.co.za/ I'm having the directive reinstated as soon as Dev permits.
And then, for extra credit, is there a way, with Google's API or perhaps some other tool, to run through an arbitrary list of URLs and retrieve cached dates? I'd want to do this for diagnosis purposes, and preferably in a way that's OK with Google - I'd avoid CURLing for the cached URL and scraping out the dates with Bash, or any such thing. Cheers,
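For clarity, the emptied directive presumably looked something like the tag below (an assumption based on the description); an empty content attribute is generally treated the same as having no robots meta tag at all, i.e. the defaults of index, follow:

<meta name="robots" content="">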
Technical SEO | RocketZando
-
How can I find my Webmaster Tools HTML file?
So, totally amateur hour here, but I can't for the life of me find our HTML verification file for Webmaster Tools. I can't see anywhere to view it in the Google Webmaster Tools console, I tried a site: search, I googled it, and all the info out there is about how to verify a site. Ours is verified, but I need the verification file code to sync up with the Google API, and no one seems to have it. Any thoughts?
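For reference, the verification file is normally a one-line plain-text file served from the site root, named "google" followed by a long token - the token below is made up for illustration. It can usually be re-downloaded from the verification details page in Webmaster Tools:

google-site-verification: google1234567890abcdef.html

That single line is the "code", and the file itself would live at https://www.example.com/google1234567890abcdef.html.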
Technical SEO | healthgrades
-
Allow or Disallow First in Robots.txt
If I want to override a Disallow directive in robots.txt with an Allow command, do I have the Allow command before or after the Disallow command? Example:
Allow: /models/ford///page*
Disallow: /models////page
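For Googlebot at least, the order of the lines doesn't matter: the most specific (longest) matching rule wins, and when Allow and Disallow rules are equally specific, Allow takes precedence. A sketch with simplified paths:

User-agent: *
# the longer Allow rule overrides the shorter Disallow for anything
# under /models/ford/, regardless of which line comes first
Disallow: /models/
Allow: /models/ford/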
Technical SEO | irvingw