Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Best XML Sitemap generator
-
Do you have any suggestions for a good XML sitemap generator? Hopefully free, but if it's good I'd consider paying.
I am using a Mac, so I would prefer an online or Mac version.
-
Hi James - I saw your reply on this thread and have a quick question. I was running GSiteCrawler and, after selecting all the suitable options, it opened a "Crawl watch" page. I assume it is crawling the site, but the online instructions say to select the "Generate" tab in the main application window (I did not opt for auto FTP).
When should I select the Generate option - immediately, or only after the crawl completes?
suparno
-
The only way to find out is to shoot them an e-mail. Either way, you will discover the answer.
-
I am wondering if they are talking about the paid version, because I run it on my site, www.psbspeakers.com, and it comes up with all kinds of duplicate content:
<loc>http://www.psbspeakers.com/products/image/Image-B6-Bookshelf</loc>
<loc>http://www.psbspeakers.com/products/bookshelf-speakers/Image-B6-Bookshelf</loc>
with this code sitting on both pages:
<link rel="canonical" href="http://www.psbspeakers.com/products/image/Image-B6-Bookshelf"/>
-
I e-mailed their support and they confirmed that it does support canonical tags. Below is the response I received:
Hi,
The script will detect canonical tags. If you can provide a live example we can look into it for you.
Regards,
Philip
XML-Sitemaps.com
-----------------------------
I would suggest ensuring your tags are valid. If they are, contact the site support and they can provide specific feedback.
-
Thanks Ryan.
That's the one I already use, but it does not take canonicals into account, so I end up with 2-3 links for the same page.
-
A popular sitemap generator: http://www.xml-sitemaps.com/
I cannot say it is the best, but it works fine. The free online version will scan up to 500 pages; for $20, you can have an unlimited number of pages.
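For reference, the output of these generators is just a standard sitemaps.org urlset file, roughly along these lines (the URL and date below are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2011-06-01</lastmod>
  </url>
</urlset>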
-
Sorry, I should have said... I am on a Mac. ;(
Are there any online ones around that don't have a cap of 500 pages?
-
GSiteCrawler every time. It's free and it's an awesome, awesome tool: http://gsitecrawler.com/
Related Questions
-
.xml sitemap showing in SERP
Our sitemap is showing in Google's SERPs. It only shows for very specific queries that don't seem to have much value (it's a healthcare website, and when a doctor who isn't with us is searched along with the brand name - e.g. 'John Smith Brand' - the sitemap appears if a first or last name in it matches the query). Is there a way to keep the sitemap from being indexed so it doesn't show in the SERPs? I've seen the "x-robots-tag: noindex" header mentioned as a possible option, but before taking any action I wanted to check whether this is still true and whether it would work.
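An X-Robots-Tag response header is the usual route here, since you can't place a meta robots tag inside an XML file. As a rough sketch - assuming the site runs Apache with mod_headers enabled - a rule like this in the site's .htaccess would keep the sitemap crawlable while removing it from the index:
# Serve a noindex header for the sitemap file only (Apache, mod_headers assumed)
<Files "sitemap.xml">
  Header set X-Robots-Tag "noindex"
</Files>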
Technical SEO | Kyleroe950 -
If I'm using a compressed sitemap (sitemap.xml.gz) that's the URL that gets submitted to webmaster tools, correct?
I just want to verify that if a compressed sitemap file is being used, then the URL that gets submitted to Google, Bing, etc., and the URL that's used in robots.txt should indicate that it's a compressed file - for example, "sitemap.xml.gz". Thanks!
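As a quick sketch (example.com is a placeholder), the robots.txt reference simply points at the compressed file, and the same full URL is what gets submitted in webmaster tools:
Sitemap: https://www.example.com/sitemap.xml.gz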
Technical SEO | jgresalfi0 -
Best practices for types of pages not to index
Trying to better understand best practices for when and when not to use content="noindex". Are there certain types of pages that we shouldn't want Google to index - contact form pages, privacy policy pages, internal search pages, archive pages (using WordPress)? Any thoughts would be appreciated.
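For reference, the directive in question is a meta robots tag placed in the page's head; "noindex, follow" keeps a page such as an internal search result out of the index while still letting crawlers follow its links:
<meta name="robots" content="noindex, follow">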
Technical SEO | RichHamilton_qcs0 -
Indexing product attributes in sitemap
Hey Mozzers! I'm battling a few questions about the sitemap for my ecommerce store. Could you help me out? Is it necessary to include your product attributes in the sitemap? I'm not sure why it would matter to have a sitemap that lists everything in the color cherry. Also, if the attributes were included in the sitemap, would that count as duplicate content, since the same products would show up under multiple attributes? Is there any benefit to submitting the sitemaps individually - for example, submitting /product-sitemap.xml and /product_brand-sitemap.xml versus just /sitemap.xml? Any other best practices for managing my ecommerce sitemap, or great resources, would be very helpful. Thank you!
Technical SEO | localwork0 -
Upgrade old sitemap to a new sitemap index. How to do it without danger?
Hi Moz users and friends. I have a website with a PHP template we developed ourselves, and a WordPress blog in the /blog/ subdirectory. Currently we have a sitemap.xml file in the root domain that contains all the subsections and the blog's posts. We update the sitemap manually, once a month, adding the new posts created on the blog. I want to automate this process, so I created a sitemap index with two sitemaps inside it: the old sitemap without the blog posts, and a new one created with the "Google XML Sitemap" WordPress plugin inside the /blog/ subdirectory. That is, in the sitemap_index.xml file I have: Domain.com/sitemap.xml (the old sitemap after removing the blog post URLs) and Domain.com/blog/sitemap.xml (the auto-updating sitemap created with the Google XML plugin). Now I have to submit this sitemap index to Google Search Console, but I want to be completely sure about how to do this. I think the only thing I have to do is delete the old sitemap in Search Console and submit the new sitemap index - is that OK?
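For reference, a sitemap index with those two entries would look roughly like this (domain.com stands in for the real domain):
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- old hand-maintained sitemap, blog posts removed -->
  <sitemap>
    <loc>https://domain.com/sitemap.xml</loc>
  </sitemap>
  <!-- auto-updating sitemap generated by the blog plugin -->
  <sitemap>
    <loc>https://domain.com/blog/sitemap.xml</loc>
  </sitemap>
</sitemapindex>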
Technical SEO | ClaudioHeilborn0 -
Auto-generated meta description tag in Drupal
I was having issues with Drupal not auto-generating a meta description tag, but I think I have figured it out. Just to verify, would this piece of code be the meta description tag (the reason I ask is because it looks a little different from the usual tag I have seen):
Technical SEO | kevgrand0 -
Best Way To Clean Up Unruly SubDomain?
Hi, I have several subdomains that present no real SEO value but are being indexed. They don't earn any backlinks either. What's the best way of cleaning them up? I was thinking the following: 1. Verify them all in Webmaster Tools. 2. Remove all URLs from the index via the Removal Tool in WMT. 3. Add a site-wide noindex, follow directive. Also, to remove the URLs in WMT, you usually have to block the URLs via /robots.txt. If I'd like to keep Google crawling the subdomains and still remove their URLs, is there a way to do so?
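One hedged way to square that last point - assuming the subdomains run on Apache with mod_headers - is to serve a site-wide X-Robots-Tag header instead of blocking in robots.txt, so crawlers can still fetch the pages and see the noindex:
# In the subdomain's server config or .htaccess (Apache, mod_headers assumed)
# Don't also block these URLs in robots.txt, or crawlers will never see this header
Header set X-Robots-Tag "noindex, follow"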
Technical SEO | RocketZando0 -
Sitemap Page - HTML and XML
Hi there, I have a domain which has a sitemap in HTML for regular users and a sitemap in XML for the spiders. I have a warning via SEOmoz saying that I have too many links on the HTML version. What do I do here? Regards, Stef
Technical SEO | stefanok0