Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
How do we handle sitemaps in robots.txt when multiple domains point to same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we swap the text returned to the client depending on which domain was requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is served for multiple domains? If I, for instance, put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
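For concreteness, the shared robots.txt being described would carry one Sitemap line per language version, all hosted on the .net domain - the Finnish filename below is assumed, purely for illustration:
# Shared robots.txt, served identically on www.mysite.net, www.mysite.se and www.mysite.fi
# (sitemapFi.xml is an assumed filename)
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
Sitemap: http://www.mysite.net/sitemapFi.xml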
-
Thanks for your help René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change configurations.
Did I understand that Google link correctly: if we have verified ownership in GWT for all of the involved domains, cross-site submission in robots.txt is okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even if something else might work just as well - just to be on the safe side.
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
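For reference, a minimal but complete sitemap file wrapping one such entry looks roughly like this (the URL is just a placeholder):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://yoursite.com/somepage.html</loc>
  </url>
</urlset>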
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to route traffic from each TLD (top-level domain, e.g. .dk, .com, etc.) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a "web within a web": a folder for each site on the server, with .htaccess sending each domain to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy - it will be just like managing 3 different sites, even though it isn't. A rough sketch of this is shown after this post.
I'm no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google - that's what I always do when I need to goof around in .htaccess. I hope that made sense.
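A minimal, untested sketch of that folder-per-domain idea, assuming folders named /se/ and /fi/ and leaving the .net version in the web root (the folder names and rules are hypothetical, not taken from the thread):
# Hypothetical .htaccess sketch: internally route each domain to its own folder.
RewriteEngine On

# Swedish site lives in /se/
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]

# Finnish site lives in /fi/
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteCond %{REQUEST_URI} !^/fi/
RewriteRule ^(.*)$ /fi/$1 [L]

# Requests for www.mysite.net fall through and are served from the root,
# where that site's robots.txt and sitemap live.
Each folder would then carry its own robots.txt and sitemap, which is what makes it feel like three separate sites.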
-
Thanks for your response René!
Thing is, we already submit the sitemaps in Google Analytics, but the SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or someone else doesn't think we are making a cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit a sitemap for each domain.
If you want to make sure that your sitemaps are not crawled by a bot for the wrong language, I would set up .htaccess to test for the entrance domain and redirect to the right file. A bot enters a site just like a browser, so it has to obey the server; if the server tells it to go somewhere, it will. A rough sketch of this is shown below.
robots.txt can't do what you want by itself; the server can, however. But in my opinion, using Bing and Google webmaster tools should do the trick.
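If you do want the server itself to hand out a different robots.txt per entrance domain, a rough sketch of that kind of rule could look like the following; the robots-*.txt filenames are assumed, and none of this is taken from the thread or tested against this setup:
# Hypothetical .htaccess sketch: serve a language-specific robots.txt per requested domain.
RewriteEngine On

RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]

RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]

# www.mysite.net keeps the default robots.txt; each robots-*.txt can then point
# its Sitemap: line at the sitemap for that language.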
Related Questions
-
Multiple H1 tags on Squarespace blog page?
Hi All, I use Squarespace and while running my site (https://www.growmassagebusiness.com) through programs I am seeing that my blog posts are being seen as one page with multiple H1 tags. I read through the SS help desk and found that back in 2015 someone wrote that it's not a big deal b/c of HTML5 and that the search engines will read each blog post as a sub-page. I'm not so sure about that and am wondering what the experts think. If that is screwy, then I'm considering possibly making each blog post its own page rather than using their blog posting format.
On-Page Optimization | rajam0
-
URL Domain Used in Meta Description
Today I was asked if using a domain URL in your meta description can have a negative impact on your website. The description in question lists the homepage URL, but the snippet directs visitors to a different internal page of the website. My concern was with directing visitors to a different page of the site while promoting the homepage in the description/snippet. With the Penguin 2.1 release on the 4th, I'm very cautious about my links/URLs. What are your thoughts on this? What negative impacts, if any, could this have on a site? The URL does contain a brand name, as does the Title.
On-Page Optimization | flcity150
-
How to exclude URL filter searches in robots.txt
When I look through my Moz reports I can see they have included 'pages' which they shouldn't have, i.e. URLs with filtering rules such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505 How can I exclude all of these filters in robots.txt? I think it'll be: Disallow: /*?color=$ Is that the correct syntax with the $ sign in it? Thanks!
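For what it's worth, a hedged sketch of how such rules are often written for Googlebot; note that a trailing $ anchors the pattern to the very end of the URL, so "Disallow: /*?color=$" would only match URLs ending exactly in "?color=" (the parameter names below come from the example URL above, everything else is illustrative):
# Illustrative robots.txt sketch, not a definitive answer:
# dropping the "?" catches the parameter whether it appears first or after an "&"
User-agent: *
Disallow: /*color=
Disallow: /*manufacturer=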
On-Page Optimization | neenor0
-
Solve duplicate content issues by using robots.txt
Hi, I have a primary website, and besides that I also have some secondary websites that have the same content as the primary website. This leads to duplicate content errors. Because there are many duplicate-content URLs, I want to use the robots.txt file to prevent Google from indexing the secondary websites, to fix the duplicate content issue. Is that OK? Thanks for any help!
On-Page Optimization | JohnHuynh0
-
Right way to block google robots from ppc landing pages
What is the right way to completely block SEO robots from my AdWords landing pages? Robots.txt does not work really well for that, as far as I know. Adding noindex/nofollow meta tags, on the other hand, will block the AdWords robot as well, right? Thank you very much, Serge
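As a hedged illustration only (the /ppc-landing/ path is an assumption, not taken from the question), one common pattern is to keep general crawlers out of the landing pages in robots.txt while explicitly addressing Google's ads crawler:
# Hypothetical robots.txt sketch for PPC landing pages
User-agent: *
Disallow: /ppc-landing/

# AdsBot-Google ignores the global * group and only follows rules addressed to it by name
User-agent: AdsBot-Google
Allow: /ppc-landing/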
On-Page Optimization | Kotkov0
-
Is it ok to use the H1 tag for bullet points?
Our search results page doesn't have a typical H1 tag because adding a true header would take up space unnecessarily. Therefore, we've assigned the H1 tag to the breadcrumb. As filters are applied, the breadcrumb grows to include these filters. This breadcrumb is coded as bullet points, even though they're not the typical style of bullet points. Here's a screenshot: http://screencast.com/t/AjGC9iAYR3 For example, the breadcrumb Home >> NYC Social Media Classes >> Adult >> Manhattan is currently coded as an unordered list whose items hold the Home and search-results links plus checkbox/label pairs for 'NYC Social Media Classes', 'Adults' and 'Manhattan' (the original markup sample did not survive this archive). Right now the H1 tag just relates to 'NYC Social Media Classes', but we'd like to expand it to include both 'Manhattan' & 'Adults' - would that be ok? And if so, would it be better to put the tag before and after the tag?
On-Page Optimization | mevseo
-
Transferring authority from one domain to another
My dilemma, for example: if I have a website ranking at number 11 for (Keyword), and there is a site named www.(Keyword).com ranking at number 12 for (Keyword), would it be at all beneficial to buy that site and redirect it to my own site? Any advice would be much appreciated!
On-Page Optimization | CMoore850
-
How long is too long for domain URL length?
I noticed one of the negatively correlated ranking factors was length of URL. I'm building a page from scratch; we are trying to rank for 'Minneapolis Fitness' and 'Minneapolis Massage'. Is www.minnnepolismassageandfitness.com just ridiculously long? Or does the exact match outweigh the penalty for URL length?
On-Page Optimization | JesseCWalker2