Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Best XML Sitemap generator
-
Do you guys have any suggestions for a good XML sitemap generator? Hopefully free, but if it's good I'd consider paying.
I am using a Mac, so I would prefer an online or Mac version.
-
Hi James, I saw your reply on this thread and have a quick question. I was running GSiteCrawler and, after selecting all the suitable options, it opens up a "Crawl watch" page. While I assume it is crawling the site, the online instructions say to select the "Generate" tab in the main application window (I did not opt for auto FTP).
When should I select the Generate option: immediately, or after the crawl completes?
suparno
-
The only way to find out is to shoot them an e-mail. Either way, you will discover the answer.

-
I am wondering if they are talking about the paid version, because I run it on my site, www.psbspeakers.com, and it comes up with all kinds of duplicate content:
<loc>http://www.psbspeakers.com/products/image/Image-B6-Bookshelf</loc>
<loc>http://www.psbspeakers.com/products/bookshelf-speakers/Image-B6-Bookshelf</loc>
with this code sitting on both pages:
<link rel="canonical" href="http://www.psbspeakers.com/products/image/Image-B6-Bookshelf"/>
-
I e-mailed their support and they confirmed that it does support canonical tags. Below is the response I received:

Hi,
The script will detect canonical tags. If you can provide a live example we can look into it for you.
Regards,
Philip
XML-Sitemaps.com

I would suggest ensuring your tags are valid. If they are, contact the site's support and they can provide specific feedback.
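If you want to sanity-check this before contacting support, the canonical-handling logic is easy to reproduce: a crawler that honours canonicals collapses every crawled URL onto its canonical target before writing `<loc>` entries. A minimal Python sketch (the `dedupe_by_canonical` helper and the input mapping are hypothetical illustrations, not XML-Sitemaps.com's actual code):

```python
# Collapse crawled URLs onto their canonical targets so each page
# appears only once in the generated sitemap.
def dedupe_by_canonical(pages):
    """pages: dict mapping crawled URL -> canonical URL (or None if absent)."""
    seen = set()
    result = []
    for url, canonical in pages.items():
        target = canonical or url
        if target not in seen:
            seen.add(target)
            result.append(target)
    return result

pages = {
    "http://www.psbspeakers.com/products/image/Image-B6-Bookshelf": None,
    "http://www.psbspeakers.com/products/bookshelf-speakers/Image-B6-Bookshelf":
        "http://www.psbspeakers.com/products/image/Image-B6-Bookshelf",
}
print(dedupe_by_canonical(pages))  # one entry instead of two
```

If a generator still emits both URLs, the canonical tag was either not parsed or not valid on the duplicate page.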
-
Thanks Ryan.
That's the one I already use, but it does not take canonicals into account, so I end up with 2-3 links for the same page.
-
A popular sitemap generator: http://www.xml-sitemaps.com/
I cannot say it is the best, but it works fine. The free online version will scan 500 pages; for $20, you can have an unlimited number of pages.
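For context, the output these tools produce is tiny, and the 500-page cap is a product limit rather than a technical one. A rough Python sketch of what a generator writes (`write_sitemap` and its `cap` parameter are illustrative assumptions, not xml-sitemaps.com's code):

```python
# Write a minimal sitemap.xml body from a list of URLs, truncating
# at a page cap (mimicking the free tier's 500-page limit).
from xml.sax.saxutils import escape

def write_sitemap(urls, cap=500):
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls[:cap]
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

print(write_sitemap(["http://example.com/", "http://example.com/about"]))
```

For very small sites you could generate the file this way yourself and skip the tool entirely.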
-
Sorry, I should have said... I am on a Mac ;(
Are there any online ones around that don't have a cap of 500 pages? -
GSiteCrawler every time. It's free and it's an awesome, awesome tool: http://gsitecrawler.com/
Related Questions
-
How does changing sitemaps affect SEO?
Hi all, I have a question regarding changing the size of my sitemaps. Currently I generate sitemaps in batches of 50k URLs. A situation has come up where I need to change that size to 15k in order to be crawled by one of our licensed services. I haven't been able to find any documentation on whether changing the size of my sitemaps (but not the pages included in them) will affect my rankings or my SEO efforts in general. If anyone has any insights or has experienced this with their site, please let me know!
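Changing the batch size is just regrouping the same URLs into more files, which is one reason it generally shouldn't affect rankings, provided every page stays listed and the sitemap index points at the new files. A quick Python sketch of the regrouping (`chunk_urls`, `index_entries` and the example.com URLs are hypothetical):

```python
# Split a URL list into fixed-size sitemap batches and build the
# matching sitemap-index entries. Changing batch_size only changes
# how the same URLs are grouped, not which URLs are listed.
def chunk_urls(urls, batch_size=15000):
    return [urls[i:i + batch_size] for i in range(0, len(urls), batch_size)]

def index_entries(base, n_batches):
    # base is a hypothetical URL prefix where the files are served
    return [f"{base}/sitemap-{i + 1}.xml" for i in range(n_batches)]

urls = [f"http://example.com/page/{i}" for i in range(40000)]
batches = chunk_urls(urls)
print(len(batches), [len(b) for b in batches])  # 3 [15000, 15000, 10000]
```

A sanity check after the change is to confirm the total URL count across all batches matches the old total.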
Technical SEO | | Jason-Reid0 -
Good alternatives to Xenu's Link Sleuth and AuditMyPc.com Sitemap Generator
I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this at this point. Sitemap Generator from AuditMyPc.com seems to be working too, but it starts hanging up when the sitemap file the tool is working on becomes too large. So basically, the second one looks like it won't be good for websites of this size. I know that Scrapebox can scrape title tags from a list of URLs, but this is not needed, since this comes with both of the above-mentioned tools. I know about DeepCrawl.com also, but this one is paid, and it would be very expensive with this amount of pages and websites (5 million URLs is $1750 per month; I could get a better deal on multiple websites, but this obviously does not make sense to me, it needs to be free, more or less). SEO Spider from Screaming Frog is not good for large websites. So, in general, what is the best way to work on something like this in a time-efficient manner? Are there any other options for this? Thanks.
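At this scale the hard part is fetching pages, not parsing them; the title extraction itself is trivial with the standard library, so a home-grown crawler is also an option. A minimal Python sketch of the parsing step (`TitleParser`/`extract_title` are illustrative names, and real pages would need error handling):

```python
# Pull the <title> out of raw HTML with the stdlib parser; a large
# crawl would feed millions of fetched pages through something like this.
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def extract_title(html):
    p = TitleParser()
    p.feed(html)
    return p.title.strip()

print(extract_title("<html><head><title>Hello SEO</title></head></html>"))
```

Paired with an async or multi-process fetcher, this keeps the per-page cost close to zero.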
Technical SEO | | blrs120 -
Why is the XML generator not detecting all my URLs?
Hi Mozzers, After adding 3 new pages to example.com, when generating the XML sitemap, I wasn't able to locate those 3 new URLs. This is the first time this has happened. I have checked the meta tags of these pages and they are fine. No meta robots setup! Any thoughts or ideas why this is happening and how to fix it? Thanks!
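One way to narrow this down is to diff the generated sitemap against the URLs you know exist; whatever is absent is what the crawler never reached, which usually points to an internal-linking or redirect issue rather than meta tags. A small Python sketch (`missing_from_sitemap` and the example.com URLs are placeholders):

```python
# Parse <loc> entries out of an existing sitemap and report which
# known URLs are missing from it.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def missing_from_sitemap(sitemap_xml, known_urls):
    root = ET.fromstring(sitemap_xml)
    listed = {loc.text for loc in root.iter(NS + "loc")}
    return sorted(set(known_urls) - listed)

sitemap = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc></url>
</urlset>"""
print(missing_from_sitemap(sitemap, ["http://example.com/", "http://example.com/new-page"]))
```

If the new pages show up as missing, check that something already in the sitemap links to them.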
Technical SEO | | Ideas-Money-Art0 -
Best Practice on 301 Redirect - Images
We have two sites that sell the same products. We have decided to retire one of the sites as we'd like to focus on one property. I know best practice is to redirect apples to apples, which in our case is easily done since the sites sold the same thing. www.SiteABC.com/ProductA can be redirected to www.SiteXYZ.com/ProductA. My question is how far does that thinking go regarding images? Each product has a main product page, of course, and then up to 6 images in some cases. Is it necessary to redirect www.SiteABC.com/ProductA-Image1.jpg to www.SiteXYZ.com/ProductA-Image1.jpg? Or can they all be redirected to just the product page?
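If you do redirect image-to-image, the mapping is mechanical: swap the host and keep the path. A hypothetical Python sketch of that rule (the siteabc/sitexyz hostnames mirror the placeholders in the question; a real deployment would express this as a server rewrite rule instead):

```python
# One-to-one 301 mapping for assets when retiring a site: rewrite
# the host, keep the path, so each image redirects to its exact
# counterpart rather than falling back to the product page.
from urllib.parse import urlsplit, urlunsplit

def redirect_target(old_url, old_host="www.siteabc.com", new_host="www.sitexyz.com"):
    parts = urlsplit(old_url)
    if parts.netloc.lower() != old_host:
        return old_url  # not our host, leave untouched
    return urlunsplit((parts.scheme, new_host, parts.path, parts.query, parts.fragment))

print(redirect_target("http://www.siteabc.com/ProductA-Image1.jpg"))
```

Since the rule is purely host-for-host, it costs nothing extra to cover the images as well as the product pages.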
Technical SEO | | Natitude0 -
Is it bad to have the same page listed twice in a sitemap?
Hello, I have found that in an HTML (not XML) sitemap of a website, a page has been listed twice. Is this okay, or will it be considered duplicate content? Both links use the same anchor text, but different URLs that redirect to another (final) page. I thought the ideal way is to use the final page in the sitemap (and in all internal linking), not the intermediate pages. Am I right?
Technical SEO | | StickyRiceSEO1 -
Ror.xml vs sitemap.xml
Hey Mozzers, So I've been reading some things lately, and some are saying that the top search engines do not use a ror.xml sitemap but focus just on sitemap.xml. Is that true? Do you use ror? If so, for what purpose: products, "special articles", other uses? Can sitemap.xml be sufficient for all of those? Thank you, Vadim
Technical SEO | | vijayvasu0 -
What is the best website structure for SEO?
I've been on SEOmoz for about 1 month now, and everyone says that, depending on the type of business, building up your website structure for SEO should be the first step. I have a new client click here (www version doesn't work)... some bugs we are fixing now. We are almost finished with the design & layout. Two questions have been running through my head. 1. What would the best URL category for the shop be? /products/ - current URL category example: /products/door-handles.html 2. What would you use for the main menu sections to get the most out of SEO? Personally, I am thinking of making 2-3 main categories on the left, each a section where I can add content (3-4 paragraphs, images, maybe a video). So the main page focuses on the domain name more, and the rest of the sections would focus on specific keywords; this way I avoid cannibalization. The main keyword target is "door handles". Any suggestions would be appreciated.
Technical SEO | | mosaicpro0 -
Double byte characters in the URL - best avoided?
We are doing some optimisation on sites in the APAC region, namely China, Hong Kong, Taiwan and Japan. We have set the URL generator to automatically use the heading of the page in the URL, which works fine for countries using Latin characters, but it is causing problems, particularly in IE, when it comes to the double-byte countries. For some reason, IE struggles with double-byte characters and displays URLs in their rather ugly, coded form. Anybody got any suggestions on whether we should persist with the keyword URLs or revert to non-descriptive URLs for the double-byte countries? The reason I ask is that it's a balance of SEO benefit vs. not scaring IE users off with ugly URLs that look dreadful and spammy.
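Worth noting: whichever way you decide, double-byte URL paths are always transmitted percent-encoded; browsers differ only in whether they *display* the decoded form. A quick Python sketch of the round-trip (the Japanese slug is just an example):

```python
# Non-Latin slugs travel over the wire percent-encoded as UTF-8;
# quote/unquote round-trips them without loss, so the "ugly" form
# IE shows is a display choice, not data corruption.
from urllib.parse import quote, unquote

slug = "ドアハンドル"  # "door handle" in Japanese
encoded = quote(slug)
print(encoded)                   # pure-ASCII percent-encoded form
print(unquote(encoded) == slug)  # lossless round-trip
```

So the SEO question reduces to whether the encoded form in older browsers is acceptable, not whether the URL itself is safe.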
Technical SEO | | Red_Mud_Rookie0