Best XML Sitemap Generator for Mac?
-
Hi all,
Recently moved from PC to Mac when starting a new job. One of the things I'm missing from my PC is G Site Crawler, and I haven't yet found a decent equivalent for the Mac.
Can anybody recommend something as good as G Site Crawler for the Mac? That is, I need the flexibility to exclude URLs by parameter, and so on.
Cheers everyone,
Mark
-
Thanks Crimson Penguin (really hope that's your real name!)
I have used sitemapdoc.com in the past; the only problem is that it limits you to 500 URLs. I really wish G Site would do a Mac version, and the same goes for Xenu: there's just nothing out there that does the job as well as those two on the Mac.
Cheers for your feedback - hoping somebody else can come up with something golden?
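In the meantime, the core of what a tool like G Site Crawler does here (build a sitemap while excluding URLs by query parameter) is scriptable. A minimal Python sketch, with placeholder URLs and a hypothetical excluded parameter:

```python
from urllib.parse import urlparse, parse_qs
from xml.sax.saxutils import escape

def build_sitemap(urls, excluded_params):
    """Return sitemap XML, skipping any URL that carries an excluded query parameter."""
    entries = []
    for url in urls:
        params = parse_qs(urlparse(url).query)
        if any(p in params for p in excluded_params):
            continue  # e.g. drop session IDs or sort parameters
        entries.append(f"  <url><loc>{escape(url)}</loc></url>")
    body = "\n".join(entries)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{body}\n</urlset>"
    )

# Hypothetical crawl results:
urls = [
    "https://example.com/",
    "https://example.com/products?sessionid=abc123",  # will be excluded
    "https://example.com/about",
]
print(build_sitemap(urls, excluded_params={"sessionid"}))
```

In practice you would feed it the URL list from whatever crawler you have; the exclusion logic is the part most online generators lack.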
-
Of course I should also add: if you really want G Site Crawler, you could run a virtual Windows machine on your Mac and carry on using it that way. I run Windows on my Mac using VMware Fusion.
-
One of my favourites is an online generator called sitemapdoc.
I believe it has all the features you are looking for.
Adam.
Related Questions
-
Adding https version of website: how best to redirect
If I have 4 versions of my site (http://www, http://, https://www, and https://), what is the best way to redirect without losing SEO positions? I have mainly been using http://www, but I recently added my SSL certificate, so https works as well. I heard at MozCon that I should get https working, and all of my marketing and ads point to http://www. Should I 301 redirect 3 of them? Which 3? If https is becoming important, should that be my main URL, and will it hurt my SEO to switch? Thank you so much in advance!
Technical SEO | bhsiao
-
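A common way to consolidate all four versions, assuming an Apache server (example.com is a placeholder), is a single set of rewrite rules that 301s everything to the https://www host:

```apache
RewriteEngine On
# Anything that is not already https, or not already on the www host,
# gets one 301 hop to the canonical https://www version.
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```

Whichever host you pick as canonical, what matters most is that the other three variants 301 to it in a single hop, and that Webmaster Tools and your ad URLs are updated to match.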
My video sitemap is not being indexed by Google
Dear friends, I have a video portal. I created a video sitemap.xml and submitted it to Google Webmaster Tools, but after 20 days it still has not been indexed. I have verified the site in Bing Webmaster Tools as well. All videos are fetched dynamically from the server; there are no separate pages for individual videos. All of my static pages have been indexed, but not the videos. Please help me figure out where I am making a mistake. Your answers will be much appreciated. Thanks
Technical SEO | docbeans
-
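For reference, a video sitemap entry needs the video namespace and a handful of required child tags; a minimal sketch (all URLs are placeholders) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://example.com/videos/watch?v=123</loc>
    <video:video>
      <video:thumbnail_loc>https://example.com/thumbs/123.jpg</video:thumbnail_loc>
      <video:title>Sample video title</video:title>
      <video:description>Short description of the video.</video:description>
      <video:content_loc>https://example.com/media/123.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```

Note that each &lt;loc&gt; must be a real, crawlable page; if the videos exist only as dynamically fetched content with no URL of their own, that alone may explain why the sitemap is not being indexed.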
Which sitemap to keep: HTTP or HTTPS (or both)?
Hi, I just finished upgrading my site to the SSL version (like so many other webmasters, now that it may be a ranking factor). I fixed all links, the CDN links are now secure, and I 301 redirected all pages from http to https. I changed the property in Google Analytics from http to https and added the https version in Webmaster Tools. So far, so good. Now the question is: should I add the https version of the sitemap for the new HTTPS site in Webmaster Tools, or retain the existing http one? Switching over completely by adding a new https sitemap would seem to make sense, as the http version of the sitemap now redirects to https anyway. But the last thing I want is to be penalized for duplicate content. Could you please advise, as I am still a rookie in this department? If I should add the https sitemap version for the new site, should I delete the old http one, or is there no harm in retaining it?
Technical SEO | ashishb01
-
Best online tool for checking indexed pages (or just for a Mac)
Hey guys, I'm on a Mac, so I can't use the usual PC software for checking whether my links have been indexed. Here's the deal: I ordered some guest posts, and the guest poster placed my backlinks. Now I want to quickly check which pages (with my backlinks) have been indexed. I have a lot of guest posts, so I need something that can check whether those pages have been indexed by Google: an online tool or something that will work on my Mac. Help. 🙂
Technical SEO | VinceWicks
-
Industry News Page Best Practices
Hi, We have created an industry news page which automatically curates articles from specific news sources within our sector. Currently, I have the news index page set to be indexed and followed by robots, and the article pages set to noindex, nofollow, since they are not original content. Is this the best practice, or would you recommend another configuration? Thanks!
Technical SEO | JoshGFialkoff
-
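For the article pages, the setup described above comes down to one meta tag in each curated article's head (the index page simply omits it):

```html
<!-- On each curated article page, not on the news index page -->
<meta name="robots" content="noindex, nofollow">
```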
ECommerce: Best Practice for expired product pages
I'm optimizing a pet supplies site (http://www.qualipet.ch/) and have a question about the best practice for expired product pages. We have thousands of products, and hundreds of our offers exist for only a few months. Currently, when a product is no longer available, the site just returns a 404. Now I'm wondering what a better solution could be:
1. When a product disappears, a 301 redirect is established to the category page it belongs in (i.e. a leash would redirect to dog accessories).
2. After a product disappears, a customized 404 page appears, listing similar products (but the server still returns a 404).
I prefer solution 1, but am afraid that having hundreds of new redirects each month might look strange. Then again, returning lots of 404s to search engines is not the best option either. Do you know the best practice for large ecommerce sites that have hundreds or even thousands of products appearing and disappearing on a frequent basis? What should be done with those obsolete URLs?
Technical SEO | zeepartner
-
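For option 1, the redirects need not look strange; on an Apache server each retired product is one line (the paths here are hypothetical):

```apache
# 301 retired product URLs to their category pages (example paths)
Redirect 301 /products/dog-leash-blue /category/dog-accessories
Redirect 301 /products/cat-tree-small /category/cat-furniture
```

At a larger scale, Apache's RewriteMap (or an nginx map) keeps thousands of old-URL/new-URL pairs in a single lookup file instead of hundreds of individual rules.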
How do I organize an XML sitemap for Google Webmaster Tools?
OK, so I used an XML sitemap generator tool, xml-sitemaps.com, for Google Webmaster Tools submission. The problem is that the priorities are all out of whack. How on earth do I organize it with thousands of pages? Should I be spending hours organizing it?
Technical SEO | schmeetz
-
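Rather than hand-tuning thousands of priority values, one shortcut is to derive priority from URL depth. A minimal Python sketch (the formula and URLs are illustrative assumptions, not anything xml-sitemaps.com does):

```python
from urllib.parse import urlparse

def priority_for(url):
    """Deeper pages get lower priority: 1.0 at the root, minus 0.2 per path level, floor 0.1."""
    depth = len([p for p in urlparse(url).path.split("/") if p])
    return round(max(0.1, 1.0 - 0.2 * depth), 1)

print(priority_for("https://example.com/"))       # root page
print(priority_for("https://example.com/a/b/c"))  # three levels deep
```

Google has indicated that it largely ignores the priority field anyway, so hours of manual organizing are unlikely to pay off.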
Double-byte characters in the URL: best avoided?
We are doing some optimisation on sites in the APAC region, namely China, Hong Kong, Taiwan and Japan. We have set the URL generator to automatically use the heading of the page in the URL, which works fine for countries using Latin characters but is causing problems, particularly in IE, for the double-byte countries. For some reason, IE struggles with double-byte characters and displays the URLs in their rather ugly, encoded form. Does anybody have suggestions on whether we should persist with the keyword URLs or revert to non-descriptive URLs for the double-byte countries? The reason I ask is that it's a balance of SEO benefit versus not scaring IE users off with ugly URLs that look dreadful and spammy.
Technical SEO | Red_Mud_Rookie
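What IE shows is standard percent-encoding: each double-byte character becomes three %XX escapes of its UTF-8 bytes, which is why the URLs look so long. A quick Python illustration (the Japanese slug is a made-up example):

```python
from urllib.parse import quote, unquote

slug = "犬の首輪"  # "dog collar" in Japanese, a hypothetical page heading
encoded = quote(slug)
print(encoded)           # each character expands to three %XX escaped bytes
print(len(slug), len(encoded))
print(unquote(encoded))  # browsers that decode the escapes show the original
```

The encoded form is what any browser that does not render decoded IRIs will display, so the trade-off is real: keyword URLs in the local script, or short non-descriptive slugs that look the same everywhere.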