Adding multi-language sitemaps to robots.txt
-
I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core code base, so there are no sub-directories per language.
The developer has created sitemaps which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as:
- /sitemap/uk/sitemap.xml
- /sitemap/de/sitemap.xml
I want to add the sitemaps to robots.txt but can't figure out how to do it. Also, should they have placed the sitemaps in a single location, with the file name identifying each language:
- /sitemap/uk-sitemap.xml
- /sitemap/de-sitemap.xml
What is the cleanest way of handling these sitemaps, and can/should I reference them in robots.txt?
-
Adding the following lines to the bottom of your robots.txt should do it:
Sitemap: http://www.example.com/sitemap/uk/sitemap.xml
Sitemap: http://www.example.com/sitemap/de/sitemap.xml
If you wanted to change the file names it wouldn't hurt, but I don't think you'll have any problems with how they are currently set up. If you have submitted them in WMT and they are being picked up OK, I think you are fine.
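For context, here is a minimal sketch of how the finished file might look. The Disallow lines are purely illustrative placeholders and not taken from your setup; the Sitemap lines use the example domain and the paths you listed:
User-agent: *
Disallow: /checkout/
Disallow: /customer/account/
Sitemap: http://www.example.com/sitemap/uk/sitemap.xml
Sitemap: http://www.example.com/sitemap/de/sitemap.xml
The Sitemap directive is independent of any User-agent group, so the lines can sit anywhere in the file, although keeping them together at the bottom keeps things tidy.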
Related Questions
-
ccTLD + Subdirectory for languages
Hey, a client has a .de domain with subdirectories for different languages, so domain.de/de, domain.de/en, domain.de/fr etc. hreflang tags are implemented, so each language subdirectory references the other language versions (for domain.de/en, something like the sketch below). My question is about the combination of ccTLD + language subdirectory. Do you think this is problematic for Google and should be replaced with .com + language subdirectory? We have lots of high-quality domains (from countries with corresponding languages) linking to .de/de and .de/en, some links to .de/fr & .de/es, and 0 links pointing to .de/cn. Thanks in advance!
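A rough sketch of the kind of hreflang annotations being described for domain.de/en; the exact URLs, trailing slashes and the set of languages shown are assumptions, not the client's actual markup:
<link rel="alternate" hreflang="de" href="https://domain.de/de/" />
<link rel="alternate" hreflang="en" href="https://domain.de/en/" />
<link rel="alternate" hreflang="fr" href="https://domain.de/fr/" />
<link rel="alternate" hreflang="es" href="https://domain.de/es/" />
The same set would be repeated on every language version, with each page pointing at all of its alternates, including itself.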
Technical SEO | Julisn Julian
-
Robots.txt: How to block a specific file type in several subdirectories?
Hello everyone! I need help setting up a robots.txt file. I'm trying to block all PDF files in particular directories. The reference example I'm working from blocks files of a specific file type (for example, .gif) across the entire site: Disallow: /*.gif$ Two questions: can I use this command to target one particular directory in which I want to block PDF files, and will the following line be recognized by Googlebot? Disallow: /fileadmin/xxxxxxx/xxx/xxxxxxx/*.pdf$ Then I realized I would have to write as many lines as there are directories in which I want to block PDF files. Say I want to block PDF files in all three of these directories: /fileadmin/directory1, /fileadmin/directory1/sub1, /fileadmin/directory1/sub1/pdf. Is there a pattern-matching rule I could use to block access to PDF files in all subdirectories, instead of writing the line above once per subdirectory? For example: Disallow: /fileadmin/directory1*/ Many thanks in advance for any insight you may have.
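For what it's worth, a sketch of the single-rule approach the poster is asking about, assuming the /fileadmin/ paths above; Google and Bing both document support for the * and $ wildcards used here:
User-agent: *
# * matches any sequence of characters (including further slashes)
# and $ anchors the match to the end of the URL, so this one rule
# covers PDFs in /fileadmin/ and every subdirectory beneath it
Disallow: /fileadmin/*.pdf$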
Technical SEO | LabeliumUSA
-
Robots.txt for pages with a 301 redirect
We currently have a series of help pages that we would like to disallow in our robots.txt. The thing is that these help pages are located on our old website, which now has a 301 redirect to the current site. What is the proper way to go about it? 1) Add the pages we want to disallow to the robots.txt of the new website, or 2) break the redirect momentarily and add the pages to the robots.txt of the old one? Thanks
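Purely as an illustration of option 1, the rule on the new site's robots.txt might look like this; the /help/ path is hypothetical, since the question doesn't give the actual URLs, and a robots.txt file only applies to URLs on the host it is served from:
User-agent: *
# hypothetical path standing in for the help-page URLs
Disallow: /help/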
Technical SEO | Kilgray
-
Sitelinks Issue - Different Languages
Hey folks, we run different ccTLDs for revolveclothing.com (revolveclothing.es, revolveclothing.com.br, etc.) and they all have their own WMT/Google Search Console property, with their own hreflang tags, etc. The problem is this: https://www.google.fr/#q=revolve+clothing When you look at the sitelinks, you'll see that one of them (the sales page) happens to be in Portuguese on the French site. Can anyone investigate and see why?
Technical SEO | ggpaul562
-
Are robots.txt wildcards still valid? If so, what is the proper syntax for setting this up?
I've got several URLs that I need to disallow in my robots.txt file. For example, I've got several documents that I don't want indexed, and filters that are getting flagged as duplicate content. Rather than typing in thousands of URLs, I was hoping that wildcards are still valid.
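Wildcards of the kind being asked about look roughly like the sketch below; the .pdf and ?filter= patterns are made-up examples rather than the poster's URLs, and both Google and Bing document support for * and $ in robots.txt:
User-agent: *
# block any URL ending in .pdf
Disallow: /*.pdf$
# block any URL containing a filter query parameter
Disallow: /*?filter=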
Technical SEO | mkhGT
-
Can you have a /sitemap.xml and /sitemap.html on the same site?
Thanks in advance for any responses; we really appreciate the expertise of the SEOmoz community! My question: since the file extensions are different, can a site have both a /sitemap.xml and a /sitemap.html sitting at the root domain? For example, we've already put the HTML sitemap in place here: https://www.pioneermilitaryloans.com/sitemap Now we're considering adding an XML sitemap. I know standard practice is to load it at the root (www.example.com/sitemap.xml), but I'm wondering if this will cause conflicts. I've been unable to find this topic addressed anywhere, or any real-life examples of sites currently doing this. What do you think?
Technical SEO | PioneerServices
-
Robots.txt File Redirects to Home Page
I've been doing some site analysis for a new SEO client, and it has been brought to my attention that their robots.txt file redirects to their home page. I was wondering: is there a benefit to setting up your robots.txt file to do this? Will this affect how their site gets indexed? Thanks for your response! Kyle Site URL: http://www.radisphere.net/
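A quick way to confirm what is happening is to check the response headers, sketched here with curl (any tool that shows response headers works equally well):
curl -I http://www.radisphere.net/robots.txt
# a 200 response here is the normal case; a 301/302 with a Location
# header pointing at the home page means crawlers are handed an HTML
# page instead of any robots directives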
Technical SEO | kchandler