Best XML Sitemap generator
-
Do you guys have any suggestions for a good XML sitemap generator? Hopefully free, but if it's good I'd consider paying.
I am using a Mac, so I would prefer an online or Mac version.
-
Hi James - I saw your reply on this thread and have a quick question. I was running GSiteCrawler and, after selecting all the suitable options, it opened a "Crawl watch" page. I assume it is crawling the site, but the online instructions say to select the "Generate" tab in the main application window (I did not opt for auto FTP).
When should I select the Generate option - immediately, or after the crawl completes?
suparno
-
The only way to find out is to shoot them an e-mail. Either way, you will discover the answer.
-
I am wondering if they are talking about the paid version, because I run it on my site, www.psbspeakers.com, and it comes up with all kinds of duplicate content:
<loc>http://www.psbspeakers.com/products/image/Image-B6-Bookshelf</loc>
<loc>http://www.psbspeakers.com/products/bookshelf-speakers/Image-B6-Bookshelf</loc>
with this code sitting on both pages:
<link rel="canonical" href="http://www.psbspeakers.com/products/image/Image-B6-Bookshelf"/> -
I e-mailed their support and they confirmed it does support canonical tags. Below is the response I received:
Hi,
The script will detect canonical tags. If you can provide a live example, we can look into it for you.
Regards,
Philip
XML-Sitemaps.com
-----------------------------
I would suggest ensuring your tags are valid. If they are, contact the site's support and they can provide specific feedback.
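As a sanity check, the dedupe logic the support answer describes can be sketched in a few lines of Python (stdlib only, and purely illustrative - it assumes you already have each crawled page's HTML in hand rather than fetching it live):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Remember the href of the first <link rel="canonical"> seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            a = dict(attrs)
            if a.get("rel") == "canonical" and a.get("href"):
                self.canonical = a["href"]

def dedupe_by_canonical(pages):
    """pages: iterable of (url, html). Returns unique canonical URLs, in crawl order."""
    seen, result = set(), []
    for url, html in pages:
        finder = CanonicalFinder()
        finder.feed(html)
        target = finder.canonical or url  # no canonical tag: keep the crawled URL
        if target not in seen:
            seen.add(target)
            result.append(target)
    return result
```

With the two PSB Speakers URLs above, both pages declare the same canonical, so only one <loc> entry survives.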
-
Thanks Ryan.
That's the one I already use, but it does not take canonicals into account, so I end up with 2-3 links for the same page.
-
A popular sitemap generator: http://www.xml-sitemaps.com/
I cannot say it is the best, but it works fine. The free online version will scan up to 500 pages. For $20 you can have an unlimited number of pages.
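For context, the sitemap XML format itself is trivial, which is why so many generators exist; the hard part is the crawl. A minimal serializer (a sketch, assuming you already have the URL list from a crawl) looks like:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Serialize a list of page URLs into a minimal XML sitemap string."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        # Ampersands and angle brackets must be entity-escaped inside <loc>.
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)
```

Keep in mind the 500-page cap is the tool's, not the protocol's: the sitemap spec allows up to 50,000 URLs (and 50MB uncompressed) per file, after which you split into multiple files tied together by a sitemap index.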
-
Sorry, I should have said... I am on a Mac ;(
Are there any online ones around that don't have a cap of 500 pages? -
GSiteCrawler every time. It's free and it's an awesome, awesome tool: http://gsitecrawler.com/
Related Questions
-
What is the best format for animated content?
We want to use some movement in our designs, charts, etc. What format is the most SEO-friendly?
Technical SEO | remkoallertz -
Sitemap links
Hi, I'm running a sitemap using pro-sitemaps and I find several pages that shouldn't be listed. How do I find out how these pages are being generated? I can't find the links the robot is following to get to those pages.
Technical SEO | ceci2710 -
Will a sitemap generated by Yoast for a combined WordPress/Magento site map the entire site?
Hi, For an ecommerce site that's been developed via a combination of WordPress and Magento and has Yoast installed, will the sitemap (and other Yoast features) map, and apply to, the entire site, or just the WordPress aspects? In other words, does one need to do anything else to have a full sitemap for a combined Magento/WordPress site, or will Yoast cover it all? This link seems to suggest it should be fine, but I'm asking in case anyone else has encountered this and had problems, or can confirm it's straightforward: http://fishpig.co.uk/wordpress-integration/docs/plugins.html Cheers, Dan
Technical SEO | Dan-Lawrence -
Is there a suggested limit to the number of links on a sitemap?
Currently, I have an error on my Moz dashboard indicating there are too many links on one of my pages. That page is the sitemap. It was my understanding that all internal pages should be linked from the sitemap. Can any mozzers help clarify the best practice here? Thanks, Clayton
Technical SEO | JorgeUmana -
Omitting URLs from XML Sitemap - Bad??
Hi all, We are working on an extremely large retail site with some major duplicate content issues that we are in the process of remedying. The site also does not currently have an XML sitemap. Would it be advisable to create a small XML sitemap with only the main category pages for the time being, and then upload the complete sitemap after our duplicate content issues are resolved? Or should we wait to upload anything until all work is complete down to the product-page level and canonicals are in place? Will uploading an incomplete sitemap be seen as fraudulent or misleading by the search engines and prompt a penalty, or would having at least the main pages mapped while we continue work be okay? Please let me know if more info is needed to answer! Thanks in advance!
Technical SEO | seo32 -
Duplicate Video Onsite - How do you treat this in a Sitemap?
How would you handle multiple pages using the same video content? Sometimes it does not make sense to create new videos for every product, so you repurpose them. Will you still get the effects in search results if the thumbnails and video location are duplicated for some product URLs?
Technical SEO | andrewv -
URL restructure and phasing out HTML sitemap
Hi SEOMozzies, I love the Q&A resource and have already found lots of useful stuff! I just started as an in-house SEO at a retailer, and my first main challenge is to tidy up the complex URL structures and remove the ugly sub-sitemap approach currently in use. I have already found a number of suggestions, but it looks like I am dealing with several challenges that I need to resolve in a single release.

Here is the current setup: The website is an ecommerce site (department store) with around 30k products. We are using multi-select navigation (non-Ajax). The main website uses a third-party search engine to power the multi-select navigation, and that search engine has a very ugly URL structure. For example, www.domain.tld/browse?location=1001/brand=100/color=575&size=1 plus various other URL params, or for multi-select URLs, www.domain.tld/browse?location=1001/brand=100,104,506/color=575&size=1 plus various other unused URL params. URLs easily run up to 200 characters long and are not descriptive at all to our users. Many of these URLs are indexed by search engines (we currently have 1.2 million of them indexed, including session IDs and all the other nasty URL params).

Next to this, the site uses a "sub site" that is sort of optimized for SEO; I am not 100% sure this is cloaking, but it smells like it. It has a simplified navigation structure and a better URL structure for products. The layout is similar to our main site, but all complex HTML elements like multi-select, large top navigation menus, etc. are removed. Many of these links are indexed by search engines and rank higher than links from our main website. The URL structure is www.domain.tld/1/optimized-url. Currently 64,000 of these URLs are indexed. We have links to this sub site in the footer of every page, but a normal customer would never reach it unless they come from organic search. Once a user lands on one of these pages, we try to push them back to the main site as quickly as possible.

My planned approach to improve this:
1.) Tidy up the URL structure in the main website (e.g. www.domain.tld/women/dresses and www.domain.tld/diesel-red-skirt-4563749). I plan to use Solution 2 as described in http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck to block multi-select URLs from being indexed, and would like to use the URL param "location" as an indicator for search engines to ignore the link. A risk here is that all my currently indexed URLs (1.2 million) will be blocked immediately after I put this live. I cannot redirect those URLs to the optimized URLs, as the old URLs should still be accessible.
2.) Remove the links to the sub site (www.domain.tld/1/optimized-url) from the footer and redirect (301) all those URLs to the newly created SEO-friendly product URLs. URLs that cannot be matched, because there is no similar catalog location in the main website, will be redirected (301) to our homepage.

I wonder if this is a correct approach, and whether it would be better to do this in a phased way rather than the currently planned big bang? Any feedback would be highly appreciated; also let me know if anything is unclear. Thanks! Chris
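The "Solution 2" idea in step 1 boils down to a predicate over the query string: single-select URLs stay indexable, and anything carrying multi-select (comma-separated) facet values gets flagged for noindex. A hypothetical sketch, assuming standard &-separated params rather than the slash-delimited ones in the example URLs:

```python
from urllib.parse import urlparse, parse_qs

def is_multi_select(url):
    """True when any query param carries multiple comma-separated facet values."""
    params = parse_qs(urlparse(url).query)
    return any("," in value for values in params.values() for value in values)
```

A template would call this once per request and emit <meta name="robots" content="noindex, follow"> when it returns True.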
Technical SEO | eCommerceSEO -
What are the best techniques for a sub-menu?
Which techniques are "SEO-friendly" for creating a sub-menu when you have to hover over a menu to discover it? Best regards, Jonathan
Technical SEO | JonathanLeplang