Big site SEO: maintain HTML sitemaps, or scrap them in the era of XML?
-
We have dynamically updated XML sitemaps which we feed to Google et al.
Our XML sitemap updates constantly and takes minimal hands-on management to maintain.
However, we still have an HTML version (which we link to from our homepage), a legacy from the pre-XML days. Because this HTML version is static, we're finding it contains a lot of broken links and isn't of much use to anyone.
So my question is this: does Google (or any other search engine) still need both, or are XML sitemaps enough?
-
From a search engine's point of view, XML sitemaps are enough. If you have a large site, you may want to consider having more than one sitemap for different categories, tied together with a sitemap index file.
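For illustration, a minimal sitemap index in the standard sitemaps.org format might look like the sketch below; the example.com URLs, file names, and dates are placeholders, and each referenced file is just an ordinary XML sitemap for one category:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One child sitemap per category; submit only this index to the engines. -->
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
    <lastmod>2011-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-articles.xml</loc>
    <lastmod>2011-06-01</lastmod>
  </sitemap>
</sitemapindex>
```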
As Kieron suggested, HTML sitemaps are useful for people navigating your site. It might be worthwhile writing some PHP to convert the XML into HTML, making your HTML sitemap a little more dynamic.
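A rough PHP sketch of that idea: read the existing XML sitemap and render it as an HTML list. It assumes the sitemap sits at sitemap.xml next to the script and follows the standard <urlset>/<url>/<loc> schema; the link labels here are just URL paths, where a real page would pull titles from your CMS.

```php
<?php
// Sketch: generate an HTML sitemap page from an existing XML sitemap.
// Assumes sitemap.xml lives alongside this script and uses the standard schema.
$ns  = 'http://www.sitemaps.org/schemas/sitemap/0.9';
$xml = simplexml_load_file(__DIR__ . '/sitemap.xml');
if ($xml === false) {
    http_response_code(500);
    exit('Could not load sitemap.xml');
}

echo "<ul>\n";
// The <urlset> children sit in the sitemaps.org default namespace,
// so we have to ask SimpleXML for them explicitly.
foreach ($xml->children($ns)->url as $url) {
    $loc   = (string) $url->children($ns)->loc;
    $href  = htmlspecialchars($loc, ENT_QUOTES, 'UTF-8');
    // Fall back to the URL path as link text.
    $label = htmlspecialchars(parse_url($loc, PHP_URL_PATH) ?: '/', ENT_QUOTES, 'UTF-8');
    echo "  <li><a href=\"{$href}\">{$label}</a></li>\n";
}
echo "</ul>\n";
```

Because this reads the same file you feed the search engines, the HTML version can never drift out of sync the way a static legacy page does.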
-
Although users might not visit the sitemap very often, it is usually an easy way to make sure some link juice is passed to every page. Especially if the sitemap is linked to from a lot of pages, it usually has a fair amount of juice to pass on. However, the sitemap should never be the only way you link to deeper pages.
-
The HTML sitemap shouldn't be there for Google or Bing but for real people. If no one is using the HTML sitemap and it's causing lots of problems to maintain, then drop it.
But if visitors are using it, look to replace it with a dynamically generated HTML sitemap that helps visitors find what they're looking for. You may wish to consider something similar to Amazon's "Shop all departments" page, which allows visitors to drill down into the categories.
Hope this helps.
K