Does anyone have suggestions for a good XML Sitemap Generator?
-
Does anyone have any suggestions on a good XML Sitemap Generator? Also interested in best practices and tips for updating the XML Sitemap.
I have typically relied on my web developers to do this; however, it seems that they have not been setting it up with SEO in mind.
-
Google will look for /sitemap.xml in the site root. That name is the standard convention.
Some people don't name it that way, which isn't advised. But if you're using other sitemaps for images, videos, etc., you can use the robots.txt file to tell search engines explicitly where they are.
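For reference, a basic sitemap.xml follows the sitemaps.org protocol and looks like this. The URLs, date, and frequency below are placeholders, not a recommendation for any particular site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <!-- one <url> entry per page you want crawled; only <loc> is required -->
</urlset>
```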
Also, be sure to set up an account at http://www.google.com/webmasters/ and submit your sitemaps there.
-
Thanks for the info! I'm working on loading up the xml-sitemaps software now.
What is the purpose of a link to all the sitemaps in a robots.txt file?
-
If you have FTP access to the root, you can simply overwrite the current sitemap with a new one. Google will pick it up.
Also, you should include a link to all your sitemaps in a robots.txt file, also in the root. If you look at SEOMoz's robots.txt file, you can see how to do that: http://www.seomoz.org/robots.txt
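Pointing crawlers at your sitemaps from robots.txt is just a matter of adding `Sitemap:` directives with absolute URLs. A hypothetical sketch (the example.com file names are placeholders, not SEOMoz's actual setup):

```text
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
Sitemap: http://www.example.com/sitemap-images.xml
Sitemap: http://www.example.com/sitemap-video.xml
```

You can list as many `Sitemap:` lines as you have sitemap files, and they don't have to live in the root as long as the URLs are absolute.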
-
There is already a sitemap on my server that was generated by my web developers. What is the best practice for replacing an existing sitemap?
-
Thank you Cyril. Based on my research as well this seems to be a good option. Google webmaster tools also recommends it.
-
Thanks Jbrock!
That really helps to see how yours is set up. My website developer currently has a sitemap set up; however, I don't think it's taking advantage of all the features possible.
-
You can probably set it to ignore parameters you don't want added to the sitemap. Here's a screenshot from my standalone version to give you an idea of the settings.
-
I bought the standalone version of http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html in 2007 or 2008 and have really liked it. They've kept the product updated, but the $19.99 price hasn't changed since.
It gives you a high degree of control, supports very large websites, generates multiple sitemap types (XML, image, video, news, mobile, ROR, etc.), and more. Plus, updates are free for life once you purchase.
Use the free generator on the homepage to see how it works.
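When a generator splits a large site across multiple sitemap files, it usually also writes a sitemap index that references them, and that index is what you submit to search engines. A hypothetical sketch (file names are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-images.xml</loc>
  </sitemap>
</sitemapindex>
```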
-
Hey Nathan,
I've used this one in the past: http://www.xml-sitemaps.com/
It works pretty decently for me, but you have to make sure you edit the sitemap after it's created. Look for pages that no longer exist or are redirected somewhere else. Also make sure you aren't listing any pages with query strings in them (www.url.com/index.php?xxxx), if you use query strings that way.
Updating is just what it sounds like: every time you create a new page on your site, you need to update the sitemap.
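That regenerate-on-every-new-page routine can be scripted. Here's a minimal sketch of what a sitemap generator produces, assuming you maintain the URL list yourself; the page list and domain are hypothetical, and a real tool would crawl the site to discover pages:

```python
# Rebuild sitemap XML from a list of known URLs.
# Assumption: you track live page URLs yourself; a real generator crawls for them.
from datetime import date
from xml.sax.saxutils import escape


def build_sitemap(urls):
    """Return sitemaps.org-format XML for the given absolute URLs."""
    today = date.today().isoformat()
    entries = "\n".join(
        f"  <url>\n    <loc>{escape(u)}</loc>\n"
        f"    <lastmod>{today}</lastmod>\n  </url>"
        for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )


if __name__ == "__main__":
    # Hypothetical page list; drop query-string URLs, per the advice above.
    pages = ["https://www.example.com/", "https://www.example.com/about"]
    pages = [p for p in pages if "?" not in p]
    print(build_sitemap(pages))
```

The `"?" not in p` filter implements the advice above about keeping query-string pages out of the sitemap; the rest is just writing one `<url>` entry per page.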
Related Questions
-
Sitemap Best Practices
My question is regarding the URL structure best practices of a sitemap. My website allows search any number of ways, i.e.:
1. http://www.website.com/category/subcategory/product
2. http://www.website.com/subcategory/product
3. http://www.website.com/product
However, I am not sure which structure to use in the sitemap (which is being written manually). I know that for SEO purposes the 3rd option is best as the link is more relevant to that individual product, but the Moz tool states that the home page should have fewer than 100 links (although Google doesn't penalise for having more), and by writing my entire site in the 3rd way it would result in a lot more links adjoining the home page. It is either the 2nd or 3rd option, I think, as the 1st category is not keyword specific (rather a generic term, i.e. novelties). Does anyone have experience with this?
Moz Pro | moon-boots
Whether this is a good informational keyword (for ads)
Hello, I'm learning about informational keywords. Is Nikola Tesla (300K-1M) a good informational keyword considering I'd only be (at the highest) on the 2nd page? There are no ads and I'm competing against big players, but the 2nd page has opportunities for ranking. I've been studying this guy a long time. I don't know if it will bring ad revenue. What about chemtrails (118k-300k)?
Moz Pro | BobGW
I need to get a page in the top 3 Google results for my keyword "teaching jobs" but am struggling to do so! Can anyone help?
I'm trying to get this page http://www.eteach.com/teaching-jobs to rank as the top search result on Google with the keyword "teaching jobs" but it seems to be number 5 in the results! My competitors are totally kicking my arse on getting this page to be above my website. I've got the keywords in there, I have the right content and I have links, what more can I do to make it rank as number 1! Help please!! If anyone has an SEO check list of things I need to make sure I do on my pages for them to rank in the top 3 results then that would be really handy!
Moz Pro | Eteach_Marketing
Is Anyone Else Having Problems With The Ranking On Pro Tools?
After checking them against the report I was emailed, some of them seem to be incorrect, or is it something on my end? To be fair, the majority of them are correct; I'm just querying it.
Moz Pro | JonathanRolande
I'm looking for good outbound link profiling tool...
Hello all, Just a brief question... does anybody know of a good link profiling tool which will give me a list of URLs that are linked to from a specific website? Essentially, I'm looking for the polar opposite of OpenSiteExplorer as I'm not interested in the inbound links, only the outbound links. Thanks, Elias
Moz Pro | A_Q
A suggestion to help with linkscape crawling and data processing
Since you guys are understandably struggling with crawling and processing the sheer number of URLs and links, I came up with this idea: in a similar way to how SETI@Home works (is that still a thing? Google says yes: http://setiathome.ssl.berkeley.edu/), could SEOmoz use distributed computing amongst SEOmoz users to help with the data processing? Would people be happy to offer up their idle processor time and (optionally) internet connections to get more accurate, broader data? Are there enough users of the data to make distributed computing worthwhile? Perhaps those who crunched the most data each month could receive moz points or a free month of Pro. I have submitted this as a suggestion here: http://seomoz.zendesk.com/entries/20458998-crowd-source-linkscape-data-processing-and-crawling-in-a-similar-way-to-seti-home
Moz Pro | seanmccauley
Meta description tag in rss xml file?
The SEOmoz crawl diagnostic tool is complaining that I'm missing a meta description tag from a file that is an RSS XML file. In my <channel> section I do have a <description> tag. Is this a bug in the SEOmoz tool, or do I need to add another tag to satisfy the warning?
Moz Pro | scanlin
Feature Suggestions
Along with the current crawl notices (canonical links, 301s, etc.) you should add "meta refresh". Those are tricky to catch with the human eye unless you are paying very close attention, and they definitely affect your SEO if not done properly. Thoughts?
Moz Pro | kchandler