How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi, and so on. All of these domains point to the same physical location on our web server, and we swap the text returned to the client depending on which domain was requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is served for multiple domains? If I, for instance, put the rows
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
-
Thanks for your help René!
-
Yup.
-
Yes, I meant GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change configurations.
Did I understand that Google link correctly: if we have verified ownership in GWT for all the involved domains, cross-site submission in robots.txt is okay? I guess Google will consider it okay anyway.
-
Actually, Google has the answer, right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well, just to be on the safe side.
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from each TLD (top-level domain, e.g. .dk, .com, etc.) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a web within a web: a folder for each site on the server, with .htaccess redirecting to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy. It will be just like managing 3 different sites, even though it isn't.
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope that made sense.
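For the record, here is a rough sketch of what that folder-per-site .htaccess routing could look like. This assumes mod_rewrite is enabled, and the folder names (/se/, /fi/) and domains are placeholders for the example, not a known setup:

```apache
# Hypothetical layout: one folder per site, e.g. /net/, /se/, /fi/,
# each holding its own robots.txt and sitemap.
# Requires mod_rewrite; domain and folder names are placeholders.
RewriteEngine On

# Internally rewrite .se requests into the /se/ folder
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]

# Same idea for the .fi domain
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.fi$ [NC]
RewriteCond %{REQUEST_URI} !^/fi/
RewriteRule ^(.*)$ /fi/$1 [L]

# .net requests fall through to the default content in the web root
```

With internal rewrites like these (no external redirect), a crawler arriving on www.mysite.se that asks for /robots.txt is silently served /se/robots.txt, so each domain only ever sees its own robots.txt and sitemap.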
-
Thanks for your response René!
Thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we are making a cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit sitemaps for each domain separately.
If you want to make sure your sitemaps are not crawled by a bot for the wrong language, I would set up .htaccess to test for the entrance domain and redirect to the right file. Any bot enters a site just like a browser does, so it has to obey the server; if the server tells it to go somewhere, it will.
robots.txt can't, by itself, do what you want; the server can, however. But in my opinion, using the Bing and Google webmaster tools should do the trick.
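To illustrate the "test for the entrance domain" idea without restructuring the site into folders, here is one possible .htaccess sketch. mod_rewrite is assumed, and the robots-se.txt / robots-fi.txt file names are made up for this example:

```apache
# Hypothetical: serve a per-domain robots file instead of one shared robots.txt.
# Requires mod_rewrite; domains and file names are placeholders.
RewriteEngine On

# The .se domain gets a Swedish-specific robots file
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]

# The .fi domain gets a Finnish-specific robots file
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]

# Anything else (e.g. the .net domain) falls through to the default robots.txt
```

Each per-domain robots file would then list only that domain's own Sitemap line (e.g. robots-se.txt pointing at the .se sitemap), which sidesteps the cross-submission question entirely.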