Multiple Sitemaps
-
Hello everyone!
I am in the process of updating the sitemap of an ecommerce website, and I was thinking of uploading three different sitemaps for different parts of the site (general; categories and subcategories; product groups and products) in order to keep them easy to update in the future.
Am I allowed to do so? Would that be a good idea?
Open to suggestions.
-
Right! I think now I have the complete picture and I can crack on working on it!
Thank you very much indeed!
Best Regards
Oscar
-
If you are talking about the sitemap for visitors to your website: if you think the newly added pages are going to be helpful to them, you can update your visitor sitemap accordingly. The sitemap.xml file, however, is a supplemental indexing tool meant to help search engines find the pages on your website easily, and it needs to be updated and resubmitted to the search engines through your webmaster tools accounts whenever new pages are added to your website.
Hope that helps.
Best,
Devanur Rafi
-
Thanks a lot guys!
I really appreciate your help, although all this information made me realize I have tons of work to do to update the sitemaps, and I have to start creating new ones.
Just one more question: after I create the new sitemaps, I will also have to update the sitemap on the website, is that right?
-
They should be added to the end of your robots.txt file, each preceded by 'Sitemap:', like:
Sitemap: http://www.example.com/sitemap1.xml
Sitemap: http://www.example.com/sitemap2.xml
-
No problem my friend. You are most welcome. Yes, you just need to give the location of your sitemap.xml file as given below:
Sitemap: http://example.com/sitemap_location.xml
Here you go for more: https://support.google.com/webmasters/answer/183669?hl=en
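For reference, a complete robots.txt that advertises multiple sitemaps might look like the sketch below (the User-agent and Disallow lines are placeholders for whatever crawl rules your site already has; the Sitemap lines use absolute URLs and need nothing else around them):

```text
# Placeholder crawl rules -- adjust for your own site
User-agent: *
Disallow: /checkout/

# One Sitemap line per file, each on its own line
Sitemap: http://www.example.com/sitemap1.xml
Sitemap: http://www.example.com/sitemap2.xml
Sitemap: http://www.example.com/sitemap3.xml
```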
-
Oh I see, thank you very much for your help, I haven't got much experience dealing with sitemaps.
So, in order to put them in the robots.txt, I will just have to put the links in it without anything else, is that right?
-
Hi there, the robots.txt file is one of the first things search engine spiders look at when they visit your website, and a reference to the sitemap.xml file there will help the spiders quickly access the important URLs on your website.
Best,
Devanur Rafi
-
Why should I put the sitemaps in the robots.txt?
I've been looking around and some sites do and some don't; what's the reason for it?
-
Thanks for the response my friend. The problem without an index sitemap file is that when you have to resubmit multiple sitemap.xml files in your webmaster tools account, you have to resubmit each of them one at a time. With an index sitemap file, you just need to submit the index file and it takes care of the job.
Here you go for more: https://support.google.com/webmasters/answer/71453?hl=en
Best,
Devanur Rafi
-
You don't actually need to use a sitemap index file to use multiple sitemaps. You can list and submit them separately in your robots.txt file and in Google Webmaster Tools.
-
Yes this is fine, from Google:
Whether you list all URLs in a single Sitemap or in multiple Sitemaps (in the same directory or different directories) is simply based on what's easiest for you to maintain. We treat the URLs equally for each of these methods of organization. More info can be found in Google's thread on multiple sitemaps in the same directory.
-
Hi there, though a single sitemap.xml file can accommodate up to 50,000 URLs, it is not uncommon to use multiple sitemap.xml files for various reasons, even with just a few hundred URLs in each.
You would need a total of four sitemap files, one of which would be an index sitemap that lists the URLs of the other three sitemap.xml files.
Here you go for more: http://www.sitemaps.org/protocol.html
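The index sitemap described above might look like the following minimal sketch (the filenames and lastmod dates are placeholders; point the `<loc>` entries at wherever your three sitemaps actually live):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per child sitemap file -->
  <sitemap>
    <loc>http://www.example.com/sitemap-general.xml</loc>
    <lastmod>2014-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-categories.xml</loc>
    <lastmod>2014-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
    <lastmod>2014-01-01</lastmod>
  </sitemap>
</sitemapindex>
```

Submitting just this one index file in your webmaster tools account covers all three child sitemaps.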
Best,
Devanur Rafi