Multiple Sitemaps
-
Hello everyone!
I am in the process of updating the sitemap of an ecommerce website, and I was thinking of uploading three different sitemaps for different parts of the site (general, categories and subcategories, product groups and products) in order to keep them easy to update in the future.
Am I allowed to do so? Would that be a good idea?
Open to suggestions.
-
Right! I think I now have the complete picture and can crack on with it!
Thank you very much indeed!
Best Regards
Oscar
-
If you are talking about the HTML sitemap for visitors to your website, and you think the newly added pages will be helpful to them, you can update that sitemap accordingly. The Sitemap.xml file, however, is a supplemental indexing tool meant to help search engines find the pages on your website easily; it needs to be updated and resubmitted to search engines through your webmaster tools accounts whenever new pages are added to your website.
Hope that helps.
Best,
Devanur Rafi
-
Thanks a lot guys!
I really appreciate your help, although all this information made me realize I have tons of work to do to update the sitemaps, and I have to start creating new ones.
Just another question: after I create the new sitemaps, I will also have to update the sitemap on the website, is that right?
-
They should be added to the end of your robots.txt file, each preceded by 'Sitemap:', like so:
Sitemap: http://www.example.com/sitemap1.xml
Sitemap: http://www.example.com/sitemap2.xml
-
No problem, my friend. You are most welcome. Yes, you just need to give the location of your sitemap.xml file, as shown below:
Sitemap: http://example.com/sitemap_location.xml
Here you go for more: https://support.google.com/webmasters/answer/183669?hl=en
-
Oh, I see. Thank you very much for your help; I haven't got much experience dealing with sitemaps.
So in order to put them in the robots.txt, I just have to add the links without anything else, is that right?
-
Hi there, the robots.txt file is one of the first things search engine spiders look at when they visit your website, and a reference to the Sitemap.xml file in there will help the spiders quickly access the important URLs on your website.
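For reference, here is a minimal sketch of what a robots.txt with sitemap references could look like (the Disallow path and sitemap file names are just placeholders for your own):

```
User-agent: *
Disallow: /admin/

Sitemap: http://www.example.com/sitemap1.xml
Sitemap: http://www.example.com/sitemap2.xml
Sitemap: http://www.example.com/sitemap3.xml
```

The Sitemap lines are independent of any User-agent block, so they can simply sit at the end of the file.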
Best,
Devanur Rafi
-
Why should I put the sitemaps in the robots.txt?
I've been looking around, and some sites do while some don't. What's the reason for it?
-
Thanks for the response, my friend. The problem without an index sitemap file is that when you have to resubmit multiple sitemap.xml files in your webmaster tools account, you have to resubmit each of them one at a time. With an index sitemap file, you just need to submit the index file, and it takes care of the job.
Here you go for more: https://support.google.com/webmasters/answer/71453?hl=en
Best,
Devanur Rafi
-
You don't actually need to use a sitemap index file to use multiple sitemaps. You can list and submit them separately in your robots.txt file and in Google Webmaster Tools.
-
Yes, this is fine. From Google:
"Whether you list all URLs in a single Sitemap or in multiple Sitemaps (in the same directory or different directories) is simply based on what's easiest for you to maintain. We treat the URLs equally for each of these methods of organization." More info can be found in Google's answer on multiple sitemaps in the same directory.
-
Hi there, though a single Sitemap.xml file can accommodate up to 50,000 URLs, it is not uncommon to use multiple Sitemap.xml files for various reasons, even with just a few hundred URLs in each.
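If you ever generate sitemaps programmatically, the 50,000-URL limit is easy to respect by batching. Here is a quick sketch (the product URLs are hypothetical placeholders):

```python
SITEMAP_URL_LIMIT = 50000  # maximum URLs per sitemap file in the protocol

def chunk_urls(urls, limit=SITEMAP_URL_LIMIT):
    """Split a flat list of URLs into sitemap-sized batches."""
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]

# Example: 120,001 URLs need three sitemap files.
urls = [f"http://example.com/product/{i}" for i in range(120001)]
chunks = chunk_urls(urls)
print(len(chunks))                        # → 3
print(len(chunks[0]), len(chunks[-1]))    # → 50000 20001
```

Each batch would then be written out as its own sitemap file and listed in the index.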
You would end up with a total of 4 sitemap files: the 3 Sitemap.xml files containing your URLs, plus one index sitemap that lists the other 3 files.
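An index sitemap follows the same protocol format as a regular sitemap, but lists sitemap files instead of page URLs. A minimal sketch, with hypothetical file names for your three sitemaps:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-general.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
  </sitemap>
</sitemapindex>
```

You would then submit only this index file in your webmaster tools account.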
Here you go for more: http://www.sitemaps.org/protocol.html
Best,
Devanur Rafi