How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we replace the text returned to the client depending on which domain he/she requested.
My problem is this: how do I configure sitemaps in robots.txt when robots.txt is used by multiple domains? If I, for instance, put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
-
Thanks for your help, René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change its configuration.
Did I understand that Google link correctly: if we have verified ownership in GWT for all the involved domains, cross-site submission in robots.txt is okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well, just to be on the safe side.
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from the TLD (top-level domain, e.g. .dk, .com, etc.) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a "web in web": a folder for each site on the server, with .htaccess redirecting each domain to the right folder. In that folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy. It will be just like managing 3 different sites, even though it isn't.
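The folder-per-site idea could be sketched roughly like this in .htaccess (the domain names match the question; the folder names /net/ and /se/ are assumptions for illustration, and the same pattern would repeat for each additional domain):

```apache
RewriteEngine On

# Internally route each domain to its own folder, which holds
# that site's pages, robots.txt, and sitemap.
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.net$ [NC]
RewriteCond %{REQUEST_URI} !^/net/
RewriteRule ^(.*)$ /net/$1 [L]

RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]
```

With internal rewrites like these (no R flag), the visitor still sees www.mysite.se in the address bar while the server reads the files from /se/.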
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope it made sense.
-
Thanks for your response, René!
The thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or anyone else doesn't think we are making some cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit sitemaps for each domain.
If you want to make sure that your sitemaps are not crawled by a bot for the wrong language, I would set up the .htaccess to test for the entrance domain and make sure to redirect to the right file. Any bot enters a site just like a browser, so it needs to obey the server; if the server tells it to go somewhere, it will.
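A minimal sketch of that entrance-domain test in .htaccess, assuming the per-domain sitemap files sitemapNet.xml and sitemapSe.xml from the question sit in the shared document root (the shared /sitemap.xml request path is an assumption for illustration):

```apache
RewriteEngine On

# Serve the Swedish sitemap whenever sitemap.xml is requested
# on the .se domain, and the .net sitemap on the .net domain.
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteRule ^sitemap\.xml$ /sitemapSe.xml [L]

RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.net$ [NC]
RewriteRule ^sitemap\.xml$ /sitemapNet.xml [L]
```

The same pattern can serve a different robots.txt per domain (e.g. rewrite robots.txt to a robots-se.txt file on the .se host), so each domain's robots.txt lists only its own sitemap.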
robots.txt can't, by itself, do what you want; the server can, however. But in my opinion, using Bing and Google webmaster tools should do the trick.
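If you do end up serving a separate robots.txt per domain via host-based rewrites on the server, each file only needs to reference its own domain's sitemap, which sidesteps the cross-submission question entirely. A hypothetical file for the Swedish domain, assuming the Swedish sitemap is served from the .se host (the sitemap name is taken from the question):

```text
User-agent: *
Disallow:

Sitemap: http://www.mysite.se/sitemapSe.xml
```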