Using one robots.txt for two websites
-
I have two websites that are hosted in the same CMS. Rather than having two separate robots.txt files (one for each domain), my web agency has created one which lists the sitemaps for both websites, like this:
User-agent: *
Disallow:

Sitemap: https://www.siteA.org/sitemap
Sitemap: https://www.siteB.com/sitemap
Is this OK? I thought you needed one robots.txt per website, each providing the URL for its own sitemap. Will having both sitemap URLs listed in one robots.txt confuse the search engines?
-
Hi @gpainter,
Thanks for your help. I can't see anything in that link that specifically says you can't have two sitemaps in one robots.txt. Where it mentions sitemaps it does say "You can specify multiple sitemap fields", although I'm not sure whether this means having multiple sitemap URLs under a single 'Sitemap' field?
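For what it's worth, parsers do treat each "Sitemap:" line as its own field. A quick way to see this is Python's standard-library urllib.robotparser; this is only a sketch using the URLs from the question, not a statement about how any particular search engine handles them:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt from the question: an empty Disallow (which blocks
# nothing) plus two Sitemap fields, one per domain.
lines = """\
User-agent: *
Disallow:

Sitemap: https://www.siteA.org/sitemap
Sitemap: https://www.siteB.com/sitemap
""".splitlines()

parser = RobotFileParser()
parser.parse(lines)

# Each "Sitemap:" line is collected as a separate field; the Sitemap
# field is independent of the User-agent groups.
print(parser.site_maps())
# -> ['https://www.siteA.org/sitemap', 'https://www.siteB.com/sitemap']

# The empty Disallow leaves everything crawlable.
print(parser.can_fetch("*", "https://www.siteA.org/about"))  # -> True
```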
-
@ciehmoz Hey I've replied to the other thread too.
The best option here is to use a separate robots.txt file for each website.
You could only share one robots.txt file if the other site lived on the same host/subdomain.
Don't forget to include the corresponding sitemap in each new robots.txt file. Hope this works out, cheers.
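If both domains are served from one CMS codebase, one way to give each host its own robots.txt is to branch on the request's Host header. Below is a minimal sketch using Python's standard-library http.server; the hostnames and robots.txt bodies are hypothetical, and a real CMS would plug this logic into its own routing layer instead:

```python
from http.server import BaseHTTPRequestHandler

# Hypothetical robots.txt bodies, one per hostname served by the CMS.
ROBOTS = {
    "www.sitea.org": (
        "User-agent: *\nDisallow:\n\n"
        "Sitemap: https://www.siteA.org/sitemap\n"
    ),
    "www.siteb.com": (
        "User-agent: *\nDisallow:\n\n"
        "Sitemap: https://www.siteB.com/sitemap\n"
    ),
}

def robots_for_host(host):
    """Pick the robots.txt body for a request's Host header (or None)."""
    # Strip any :port suffix and normalise case before the lookup.
    return ROBOTS.get(host.split(":")[0].lower())

class RobotsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = None
        if self.path == "/robots.txt":
            body = robots_for_host(self.headers.get("Host", ""))
        if body is None:
            self.send_response(404)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))
```

This keeps a single deployment while each domain still sees only its own rules and its own sitemap URL at /robots.txt.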
-
Hey @ciehmoz
Just replied to your other thread. You will need one robots.txt per site; referring to two sitemaps in one robots.txt will confuse Google.
Info here - https://developers.google.com/search/docs/advanced/robots/robots_txt
Good Luck
Related Questions
-
How do I optimize a website for SEO for a client that is using a subdirectory as a seperate website?
We launched a subdirectory site for our client about two months ago. What's happening is that searches for the topic covered by the subdirectory are yielding results for the old site and not the new site. We'd like to change this. Are there best practices for the subdirectory site? Specifically, we're looking for things we can do with sitemaps and Webmaster Tools. Are there other technical things we can do? Thank you.
Technical SEO | IVSeoTeam12
Sitemap international websites
Hey Mozzers, here is the case I would appreciate your reply on: I will build a sitemap for a .com domain which has multiple domains for other countries (Italy, Germany, etc.). The first question is whether I can put the hreflang annotations in sitemap 1 only, have a sitemap 2 with all URLs for the EN/default version of the .com website, and then put both sitemaps in a sitemap index. The issue is that there are localised pages that go away quickly (within 1-2 days); I prefer not to give annotations for them, as I want to keep the lang annotations in sitemap 1 clean. That way I can replace only sitemap 2 and keep sitemap 1 intact. Would it work, or am I better off putting everything in one sitemap? The second question is whether you recommend doing the same exercise for all subdomains and other domains; I have read a lot on the topic, but I'm not sure whether it's worth the effort. The third question: if I have www.example.it and it.example.com, should I include both in my sitemap with hreflang annotations (the sitemap on www.example.com), using "it" for the subdomain and "it-it" for the .it domain (to specify lang and lang + country)? Thanks a lot for your time and have a great day, Ani
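For reference, hreflang annotations in a sitemap take the shape below. This is only a sketch using the question's placeholder hostnames; each alternate URL would need its own reciprocal <url> entry listing the same set of alternates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/page</loc>
    <!-- EN/default version on the .com domain -->
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/page"/>
    <!-- lang only, on the subdomain -->
    <xhtml:link rel="alternate" hreflang="it" href="https://it.example.com/page"/>
    <!-- lang + country, on the country-code domain -->
    <xhtml:link rel="alternate" hreflang="it-IT" href="https://www.example.it/page"/>
  </url>
</urlset>
```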
Technical SEO | SBTech
Google not using redirect
We have a GEO-IP redirect in place for our domain, so that users are pointed to the subfolder relevant to their region, e.g. visit example.com from the UK and you will be redirected to example.com/uk. This works fine when you manually type the domain into your browser; however, if you reach the site via a search result for example.com, you stay at example.com. I didn't think this was too much of an issue, but our subfolders /uk and /au are not getting ranked at all in Google, even for branded keywords. I'm wondering if the fact that Google isn't picking up the redirect means the pages aren't being indexed properly? Conversely, our US region (example.com/us) is being ranked well. Has anyone encountered a similar issue?
Technical SEO | ahyde
Why is my website embedded inside this Iranian one?
I just did a search for something and my own website turned up at #1, but this other website turned up at #2 with what looked like my content: http://sport.aliexirs.ir/ASTM-A-Plate-Grades-HIC.html. So I clicked on it, and it is my website running inside a frame. What is this all about? Do I need to worry? And what should I do?
Technical SEO | Zippy-Bungle
Site blocked by robots.txt and 301 redirected still in SERPs
I have a vanity URL domain that 301 redirects to my main site. That domain also has a robots.txt that disallows the entire site. However, for a branded enough search, that vanity domain still shows up in SERPs with the new Google message: "A description for this result is not available because of this site's robots.txt". I get why the message is there; that's not my issue. My question is: shouldn't a 301 redirect trump this domain showing in SERPs, ever? The client isn't happy about it showing at all. How can I get the vanity domain out of the SERPs? THANKS in advance!
Technical SEO | VMLYRDiscoverability
How do I remove Links to my website???
Hi Guys, please can anyone help!! Can anyone tell me how on earth I can remove links to my website? My website has been hit by the new Penguin update, and the company that was doing my SEO seems to have built a lot of spammy links!! How can I remove these links??? Thanks, Gareth
Technical SEO | GAZ09
How long does it take for traffic to bounce back from an accidental robots.txt disallow of root?
We accidentally uploaded a robots.txt that disallowed root for all agents last Tuesday and did not catch the error until yesterday, so six days of exposure in total. Organic traffic is down 20%. Google has since indexed the correct version of the robots.txt file; however, we're still seeing awful titles/descriptions in the SERPs, and traffic is not coming back. GWT shows that not many pages were actually removed from the index, but we're still seeing drastic ranking decreases. Anyone been through this? Any sort of timeline for a recovery? Much appreciated!
Technical SEO | bheard
Robots.txt
Hi everyone, I just want to check something. If you have this entered into your robots.txt file:

User-agent: *
Disallow: /fred/

This wouldn't block /fred-review/ from being crawled, would it? Thanks
Technical SEO | PeterM22
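The prefix-matching behaviour in question can be checked directly with Python's standard-library urllib.robotparser; a quick sketch (example.com stands in for the asker's site):

```python
from urllib.robotparser import RobotFileParser

# The rules from the question: only paths under /fred/ are disallowed.
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /fred/",
])

# Robots rules are path-prefix matches: "/fred/" matches "/fred/anything"
# but not "/fred-review/", because the trailing slash is part of the prefix.
print(parser.can_fetch("*", "https://example.com/fred/page"))     # -> False (blocked)
print(parser.can_fetch("*", "https://example.com/fred-review/"))  # -> True (allowed)
```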