Multiple Sitemaps
-
Hello everyone!
I am in the process of updating the sitemap of an ecommerce website, and I am thinking of uploading three different sitemaps for different parts of the site (general pages, categories and subcategories, product groups and products) in order to keep them easy to update in the future.
Am I allowed to do so? Would that be a good idea?
Open to suggestions.
-
Right! I think now I have the complete picture and I can crack on working on it!
Thank you very much indeed!
Best Regards
Oscar
-
If you are talking about the sitemap for the visitors on your website, and you think the newly added pages will be helpful to them, you can update your visitors' sitemap accordingly. The Sitemap.xml file, on the other hand, is a supplemental indexing tool meant to help search engines find the pages on your website easily; it needs to be updated and resubmitted to the search engines through your webmaster tools accounts whenever new pages are added to your website.
Hope that helps.
Best,
Devanur Rafi
-
Thanks a lot guys!
I really appreciate your help, although all this information has made me realize I have tons of work to do to update the sitemaps, and I have to start creating new ones.
Just another question: after I create the new sitemaps, I will also have to update the sitemap reference on the website, is that right?
-
They should be added to the end of your robots.txt file, each preceded by 'Sitemap:', like so:
Sitemap: http://www.example.com/sitemap1.xml
Sitemap: http://www.example.com/sitemap2.xml
-
No problem my friend. You are most welcome. Yes, you just need to give the location of your sitemap.xml file as given below:
Sitemap: http://example.com/sitemap_location.xml
Here you go for more: https://support.google.com/webmasters/answer/183669?hl=en
-
Oh I see, thank you very much for your help, I haven't got much experience dealing with sitemaps.
So, in order to put them in the robots.txt, I just have to add the links without anything else, is that right?
-
Hi there, the robots.txt file is one of the first things search engine spiders look at when they visit your website, and a reference to the Sitemap.xml file in there helps the spiders quickly access the important URLs on your website right away.
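For illustration (the domain and filenames here are just placeholders, not your actual files), a minimal robots.txt that allows crawling and points spiders at multiple sitemaps could look like this:

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap1.xml
Sitemap: http://www.example.com/sitemap2.xml
```

The blank Disallow line means nothing is blocked; the Sitemap lines simply advertise where your sitemap files live, and you can list as many as you have.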
Best,
Devanur Rafi
-
Why should I put the sitemaps in the robots.txt?
I've been looking around, and some sites do while others don't; what's the reason for it?
-
Thanks for the response my friend. The problem without an index sitemap file is that when you have to resubmit multiple sitemap.xml files in your webmaster tools account, you have to resubmit each of them one at a time. With an index sitemap file, you just need to submit the index file, and it takes care of the rest.
Here you go for more: https://support.google.com/webmasters/answer/71453?hl=en
Best,
Devanur Rafi
-
You don't actually need to use a sitemap index file to use multiple sitemaps. You can list and submit them separately in your robots.txt file and in Google Webmaster Tools.
-
Yes, this is fine. From Google:
"Whether you list all URLs in a single Sitemap or in multiple Sitemaps (in the same directory or different directories) is simply based on what's easiest for you to maintain. We treat the URLs equally for each of these methods of organization." More info can be found in Google's answer on multiple sitemaps in the same directory.
-
Hi there, though a single Sitemap.xml file can accommodate up to 50,000 URLs, it is not uncommon to use multiple Sitemap.xml files for various purposes, even with only a few hundred URLs in each.
You would need to come up with a total of 4 sitemap files: three Sitemap.xml files containing the URLs, plus one index sitemap that lists those three files.
Here you go for more: http://www.sitemaps.org/protocol.html
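As a rough sketch (the domain, filenames, and dates are placeholders, not your actual files), an index sitemap following the sitemaps.org protocol that lists three child sitemaps would look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points to one regular Sitemap.xml file -->
  <sitemap>
    <loc>http://www.example.com/sitemap-general.xml</loc>
    <lastmod>2014-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-categories.xml</loc>
    <lastmod>2014-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
    <lastmod>2014-01-15</lastmod>
  </sitemap>
</sitemapindex>
```

Each child file listed in `<loc>` is an ordinary urlset sitemap containing the actual page URLs; you then submit only the index file in your webmaster tools account.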
Best,
Devanur Rafi