Sitemaps: HTML and/or XML?
-
Can someone explain sitemaps, and whether you need HTML and/or XML?
I have a site with a few HTML sitemaps, one for products and one for categories. I have another site with just one XML sitemap for the entire site (which has a massive number of pages, 600k+).
Should I be dividing the site with the massive page count into HTML sitemaps like my other site?
-
If you have a large website with hundreds or thousands of pages, you can use your XML sitemap to prioritise which pages Google should see first. Your HTML sitemap should sit in the footer of your website and is important to have because it should increase the speed at which Google discovers all the pages on your site. I always recommend having both XML and HTML sitemaps.
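As a sketch of what that prioritisation looks like in the sitemap protocol, each `<url>` entry can carry an optional `<priority>` hint between 0.0 and 1.0. The URLs below are placeholders, not from either site in the question:

```python
# Minimal sketch of a sitemap <urlset> with <priority> hints.
# URLs are placeholders; priority is an optional hint from 0.0 to 1.0.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_urlset(pages):
    """pages: list of (loc, priority) tuples -> sitemap XML string."""
    urlset = ET.Element("urlset", xmlns=NS)
    for loc, priority in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "priority").text = f"{priority:.1f}"
    return ET.tostring(urlset, encoding="unicode")

xml = build_urlset([
    ("https://www.example.com/", 1.0),
    ("https://www.example.com/category/", 0.8),
    ("https://www.example.com/category/product", 0.5),
])
print(xml)
```

Note that `<priority>` is only a hint; search engines are free to ignore it.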
-
You mention XML sitemaps. Each sitemap file needs to contain fewer than 50K URLs and be less than 50 MB in size.
What you do is set up a main XML sitemap index and have it list the URLs of your individual sitemaps, each with up to 50K URLs. BFYO has a great article on this: http://www.blindfiveyearold.com/optimize-your-sitemap-index
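A sketch of that index pattern, assuming illustrative filenames and a placeholder domain: split the full URL list into chunks of at most 50,000, write one sitemap per chunk, and list every sitemap file in a single `<sitemapindex>` that you submit:

```python
# Sketch: split a large URL list into sitemaps of <= 50,000 URLs each
# and build a sitemap index that points at them. Filenames and the
# example.com domain are placeholders, not from the original post.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_URLS = 50_000  # protocol limit per sitemap file

def chunk(urls, size=MAX_URLS):
    """Split urls into consecutive chunks of at most `size` entries."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def build_index(base, n_chunks):
    """Return a <sitemapindex> listing sitemap-1.xml .. sitemap-N.xml."""
    index = ET.Element("sitemapindex", xmlns=NS)
    for i in range(1, n_chunks + 1):
        sm = ET.SubElement(index, "sitemap")
        ET.SubElement(sm, "loc").text = f"{base}/sitemap-{i}.xml"
    return ET.tostring(index, encoding="unicode")

# A 600k-page site like the one in the question needs 12 sitemap files.
urls = [f"https://www.example.com/page-{i}" for i in range(600_000)]
chunks = chunk(urls)
index_xml = build_index("https://www.example.com", len(chunks))
print(len(chunks))
```

You would then submit only the index file; the individual sitemaps are discovered through it.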
Main support doc on sitemaps:
https://support.google.com/webmasters/answer/183668?hl=en&ref_topic=8476
Reference for the sitemap index format:
https://support.google.com/webmasters/answer/71453
As Moosa mentioned, the XML sitemap really helps Google find all your important links and crawl the site. You need to have one set up and submitted to Google Webmaster Tools. Note that if you have an index sitemap pointing to the others, you can just submit the index and Google will find the rest.
As for an HTML sitemap, that is an HTML page users can browse to find your pages. It also helps the bots. You can have an HTML sitemap, but I would limit it to your main pages and the category pages that then lead on to all of your product pages. I would not bother with an extensive HTML sitemap linking to every product on your website when your paginated category pages already do this and act as an extension of your main HTML sitemap.
-
An XML sitemap helps Google while crawling the site, whereas an HTML sitemap is usually there to give visitors a better and easier site experience.
In my opinion, having an XML sitemap is great, as it helps Google crawl and index the site, but there is little technical need for an HTML sitemap. If you think your visitors need one, then go for it; otherwise, having an XML sitemap for the website is enough!
Hope this helps!