Help with Robots.txt On a Shared Root
-
Hi,
I posted a similar question last week asking about subdomains but a couple of complications have arisen.
Two different websites I am looking after share the same root domain, which means they have to share the same robots.txt. Does anybody have suggestions for separating the two within the same file without complications? It's a tricky one.
Thank you in advance.
-
Okay, so if you have one root domain, you can only have one robots.txt file.
The reason I asked for an example is that there might be something you could put in the robots.txt to differentiate the two.
For example if you have
thisdomain.com and thatdomain.com
However, if "thatdomain.com" uses a folder called shop ("thatdomain.com/shop"), then you could prefix all of your robots.txt entries with /shop, provided that "thisdomain.com" doesn't use a shop folder. That way, all of the /shop entries would only apply to "thatdomain.com". Does this make sense?
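To illustrate, here's a minimal sketch (using Python's standard urllib.robotparser; the domain names, folder names, and paths are just placeholders) showing how /shop-prefixed rules in a shared robots.txt would only restrict the domain that actually uses the /shop folder:

```python
from urllib.robotparser import RobotFileParser

# Shared robots.txt served at the root of both domains. Every rule is
# prefixed with /shop, which only exists on thatdomain.com, so the rules
# never match any path that thisdomain.com actually serves.
shared_rules = """\
User-agent: *
Disallow: /shop/checkout/
Disallow: /shop/cart/
""".splitlines()

rp = RobotFileParser()
rp.parse(shared_rules)

# Paths under /shop (thatdomain.com's content) are blocked...
print(rp.can_fetch("Googlebot", "https://thatdomain.com/shop/checkout/"))  # False
# ...while the same paths without the /shop prefix stay crawlable.
print(rp.can_fetch("Googlebot", "https://thisdomain.com/checkout/"))       # True
```

The catch, as noted above, is that this only works if thisdomain.com never uses a /shop folder itself.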
Don
-
It's not so much that one is a subdomain; it's that they are as different as Google and Yahoo, yet they share the same root. I wish I could show you, but I can't because of confidentiality.
The 303 wasn't put in place by me; I would have strongly suggested another method. I think it was set up so that both websites could be controlled from the same login, but it's opened a can of worms for SEO.
It's not that I don't want two separate robots.txt files; the developer insists it has to be this way.
-
Can you provide an example of how the domains are structured, specifically where the root pages are?
Additionally, if you are 303-redirecting one of the domains to the other, why do you want two different robots.txt files? The domain being 303-redirected will always forward to the other one.
Depending on the structure, you can create one robots.txt file that handles two different domains, provided there is something unique about their root folders.
-
Thanks for your help so far.
The two different websites are different named domains, but they share the same root because that's how it's been built on Typo3. I don't know the developer's justification for the 303; it's something I wish we could change.
I'm not sure if there are specific directives you can put in the single robots.txt to differentiate the two; I've read a few conflicting arguments about how to do it.
-
Okay, so if you're using a 303, then you're saying the content you want for site X is actually located at site Y. That means you do not have two different subdomains, so there is no need for two robots.txt files, and your developer is correct that you can't use two. Since one site points to the other, you effectively only have one domain.
However, a 303 is generally a poor choice of redirect and should likely be a 301, though I would have to understand why the 303 is being used to say that with 100% certainty. See a quick article about 303 redirects here.
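For example, here's a hypothetical sketch of swapping the 303 for a permanent 301 in an Apache .htaccess file (this assumes mod_alias is enabled, and the domain name is a placeholder for yours):

```apache
# Hypothetical: permanently redirect every path on this domain
# to the same path on thatdomain.com with a 301 instead of a 303.
Redirect 301 / https://thatdomain.com/
```

A 301 tells search engines the move is permanent so ranking signals consolidate on the target, whereas a 303 ("See Other") carries no such signal.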
Hope this answers the question,
Don
-
It's Fasthosts. The developer is certain that we can't use the two separate robots files. The second website has been set up on a 303.
-
What host are you using?
-
The developer of the website insists that they have to share the same robots.txt; I am really not sure why he's set it up this way. I am beyond befuddled with this!
-
The subdomain has to be separated from the root in some fashion. I would assume, depending on your host, that there is a separate folder for the subdomain's files. Otherwise it would be chaos: say you installed forums on your forum subdomain and an e-commerce store on your shop subdomain... which index.php page would be served?
There has to be some separation. Review your file manager and look for the subdomain folders. Once found, you simply put a robots.txt file into each of those folders.
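To picture that separation, a typical shared-host layout might look something like this (the folder names are hypothetical; yours will depend on your host):

```
public_html/
    robots.txt        <- answers example.com/robots.txt
    forum/            <- document root for forum.example.com
        robots.txt    <- answers forum.example.com/robots.txt
    shop/             <- document root for shop.example.com
        robots.txt    <- answers shop.example.com/robots.txt
```

Each subdomain's document root gets its own robots.txt, and each is served independently at that subdomain's /robots.txt URL.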
Hope this helps,
Don