How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we swap the texts returned to the client depending on which domain he/she requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is used by multiple domains? If I, for instance, put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
-
Thanks for your help, René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change the configuration.
Did I understand that Google link correctly, in that if we have verified ownership in GWT for all the involved domains, cross-site submission in robots.txt is okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even if something else might work just as well, just to be on the safe side.
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put them in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from the TLD (top-level domain, e.g. .dk, .com) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a web within a web: a folder for each site on the server, with the .htaccess redirecting to the right folder. In each folder you keep a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy; it will be just like managing 3 different sites, even though it isn't.
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope it made sense.
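To sketch the folder-per-site idea, something along these lines could go in the .htaccess at the web root. This is only a rough sketch, assuming Apache with mod_rewrite enabled; the folder names /net/, /se/ and /fi/ are made up for the example and would have to match whatever your own setup uses:

RewriteEngine On
# Requests arriving on the .se domain are served from the /se/ folder
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]
# Requests arriving on the .fi domain are served from the /fi/ folder
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteCond %{REQUEST_URI} !^/fi/
RewriteRule ^(.*)$ /fi/$1 [L]
# Everything else falls back to the .net copy in /net/
RewriteCond %{REQUEST_URI} !^/(net|se|fi)/
RewriteRule ^(.*)$ /net/$1 [L]

Each folder then holds its own robots.txt and sitemap, so a crawler requesting www.mysite.se/robots.txt is quietly handed the Swedish file, which only points at http://www.mysite.se/sitemapSe.xml, and nothing is cross-submitted at all.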
-
Thanks for your response, René!
Thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we are making some cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit a sitemap for each domain.
If you want to make sure that your sitemaps are not crawled by a bot for the wrong language, I would set it up in the .htaccess to test for the entrance domain and redirect to the right file. Any bot enters a site just like a browser, so it has to obey the server; if the server tells it to go somewhere, it will.
The robots.txt can't, by itself, do what you want; the server can, however. But in my opinion, using Bing and Google webmaster tools should do the trick.
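As an illustration of testing for the entrance domain, the .htaccess could map the robots.txt request to a per-domain file. Again, this is just a sketch, assuming Apache with mod_rewrite; the file names robots-se.txt and robots-fi.txt are placeholders, not anything that exists on your server:

RewriteEngine On
# A crawler asking for robots.txt on the .se domain gets the Swedish file
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]
# Same idea for the .fi domain
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]
# The robots.txt sitting in the web root is then the default .net version

Each per-domain robots file would then list only its own sitemap, for example robots-se.txt containing the single line "Sitemap: http://www.mysite.se/sitemapSe.xml", so every sitemap is submitted on the host it actually belongs to.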