XML Sitemap Generators
-
I am looking to use a different sitemap generator that can handle 5,000 or more pages at once.
Any recommendations?
Thanks guys.
-
The following post might be helpful - http://moz.com/community/q/online-sitemap-generator
You can also use Screaming Frog to create sitemaps.
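For a sense of what such a tool does under the hood, here is a minimal do-it-yourself sketch: a breadth-first crawler that collects same-host links and writes them out as an XML sitemap. The start URL and the 5,000-page cap are placeholders; a dedicated tool like Screaming Frog does the same job with far more polish (redirect handling, canonicals, rate limiting).

```python
"""Sketch: a tiny breadth-first crawler that builds an XML sitemap.
All URLs here are hypothetical placeholders."""
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen
from xml.sax.saxutils import escape


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=5000):
    """Breadth-first crawl of one host, discovering up to max_pages URLs."""
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip unreachable pages
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == host and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return sorted(seen)


def to_sitemap(urls):
    """Render a URL list as a sitemaps.org <urlset> document."""
    body = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{body}\n</urlset>\n")
```

Note the sitemap protocol caps a single file at 50,000 URLs, so anything larger needs to be split across multiple files under a sitemap index.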
Related Questions
-
Please help, (going bananaz) trying to troubleshoot sitemap submitted to Bing
We need help figuring out what seems to be an error in our sitemap. We have submitted the sitemap to Bing, and it includes 1.2 million pages that should be crawled. After the initial submission, Bing's dashboard says 1.2 million pages have been submitted. Then, always after 2-4 days, the number drops to either 500,000 pages or, like now, 250,000 pages. Why is that? Is there an error in our sitemap that makes Bing exclude pages, lowering the submitted number after going through them and discovering the error? We need to figure this out and fix it so that Bing can crawl and index all 1.2 million of our pages. See the screenshot showing the Bing dashboard.
We are also having issues with Google, but we can't figure out what is going on. Here are the sitemaps: https://imsvintagephotos.com/google_sitemap/sitemap.xml and https://imsvintagephotos.com/sitemap.xml. Our website is www.imsvintagephotos.com
Technical SEO | | IMSvintagephotos -
Will an XML sitemap override a robots.txt
I have a client whose robots.txt file is blocking an entire subdomain, entirely by accident. Their original solution, not realizing the robots.txt error, was to submit an XML sitemap to get their pages indexed. I did not think this tactic would work, as the robots.txt would take precedence over the XML sitemap. But it worked... I have no explanation as to how or why. Does anyone have an answer to this, or any experience with a website that has had a clear Disallow: / for months, yet somehow has pages in the index?
Technical SEO | | KCBackofen0 -
Sitemap all of a sudden only indexing 2 out of 5000+ pages
Any ideas why this happened? Our sitemap looks the same as before. Also, our total number of pages indexed has not decreased; only the count reported for the sitemap has. Could this eventually affect my pages' presence in the index?
Technical SEO | | rock220 -
XML Sitemap Issue or not?
Hi Everyone, I submitted a sitemap in Google Webmaster Tools and got a warning message listing 38 issues. Issue: URL blocked by robots.txt. Description: Sitemap contains URLs which are blocked by robots.txt. Example: the ones given were URLs that we don't want indexed: Sitemap: www.example.org/author.xml Value: http://www.example.org/author/admin/ My concern is that the number of URLs indexed is pretty low, and I know for a fact that robots.txt rules are harmful if they block URLs that need to be indexed. Apparently the blocked URLs are ones we don't want indexed anyway, but the report doesn't display all the URLs that are blocked. Do you think I have a major problem, or is everything fine? What should I do? How can I fix it? FYI: WordPress is what we use for our website. Thanks
Technical SEO | | Tay19860 -
How would you create and then segment a large sitemap?
I have a site with around 17,000 pages and would like to create a sitemap and then segment it into product categories. Is it best to create a single map and then edit it in something like XMLSpy, or is there a way to silo sitemap creation by category from the outset?
Technical SEO | | SystemIDBarcodes0 -
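One way to silo sitemap creation from the outset is to generate one sitemap file per product category and tie them together with a sitemap index, which is exactly what the sitemaps.org protocol's `<sitemapindex>` element is for. A minimal sketch, assuming hypothetical category names, URLs, and output paths:

```python
"""Sketch: segment a large URL list into per-category sitemaps plus a
sitemap index. Categories, URLs, and paths are hypothetical; in practice
the URL lists would come from your CMS or product database."""
import os
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def write_sitemap(path, urls):
    """Write one <urlset> sitemap (the protocol caps it at 50,000 URLs)."""
    assert len(urls) <= 50000, "split further: 50k URL limit per sitemap"
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write(f'<urlset xmlns="{SITEMAP_NS}">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")


def write_index(path, sitemap_urls):
    """Write a <sitemapindex> pointing at each category sitemap."""
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write(f'<sitemapindex xmlns="{SITEMAP_NS}">\n')
        for url in sitemap_urls:
            f.write(f"  <sitemap><loc>{escape(url)}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")


# Hypothetical category -> URL mapping.
categories = {
    "barcodes": ["https://example.com/barcodes/page1"],
    "labels": ["https://example.com/labels/page1"],
}
os.makedirs("sitemaps", exist_ok=True)
for name, urls in categories.items():
    write_sitemap(f"sitemaps/sitemap-{name}.xml", urls)
write_index(
    "sitemaps/sitemap-index.xml",
    [f"https://example.com/sitemap-{n}.xml" for n in categories],
)
```

You then submit only the index file to the search engines; they discover each category sitemap from it, and a 17,000-page site splits comfortably into one file per category.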
Benefits to having an HTML sitemap?
We are currently migrating our site to a new CMS, and as part of this migration I'm getting push-back from my development team regarding the HTML sitemap. We have a very large news site with tens of thousands of pages. We currently have an HTML sitemap that greatly helps with distributing PageRank to article pages, but it is not geared towards the user. The dev team doesn't see the benefit of recreating the HTML sitemap, despite my assurance that we don't want to lose all these internal links, since removing thousands of links could have a negative impact on our Domain Authority. Should I give in and concede the HTML sitemap since we have an XML one? Or am I right that we don't want to get rid of it?
Technical SEO | | BostonWright0 -
Duplicate Video Onsite - How do you treat this in Sitemap?
How would you handle multiple pages using the same video content? Sometimes it does not make sense to produce a new video for every product, so you repurpose one. Will you still get the effects in search results if the thumbnail and video location are duplicated across some product URLs?
Technical SEO | | andrewv0 -
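Structurally, a video sitemap has no problem with this: each page gets its own `<url>` entry, and nothing stops several entries from carrying the same `<video:content_loc>` and thumbnail. A minimal sketch, with hypothetical product and media URLs:

```python
"""Sketch: a video sitemap where several product pages reuse one video.
All URLs and metadata are hypothetical placeholders."""

PAGES = [
    "https://example.com/products/widget-red",
    "https://example.com/products/widget-blue",
]
SHARED_VIDEO = {
    "thumb": "https://example.com/media/widget-thumb.jpg",
    "content": "https://example.com/media/widget-demo.mp4",
    "title": "Widget demo",
    "description": "How the widget works.",
}


def video_sitemap(pages, video):
    """One <url> entry per page, each repeating the shared video block."""
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"',
        '        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">',
    ]
    for page in pages:
        lines += [
            "  <url>",
            f"    <loc>{page}</loc>",
            "    <video:video>",
            f"      <video:thumbnail_loc>{video['thumb']}</video:thumbnail_loc>",
            f"      <video:title>{video['title']}</video:title>",
            f"      <video:description>{video['description']}</video:description>",
            f"      <video:content_loc>{video['content']}</video:content_loc>",
            "    </video:video>",
            "  </url>",
        ]
    lines.append("</urlset>")
    return "\n".join(lines)


xml = video_sitemap(PAGES, SHARED_VIDEO)
```

Whether the search engines then show a video result for every such page is their call, not the sitemap's; the sketch only shows that the markup itself permits the duplication.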
How to Submit XML Site Map with more than 300 Subdomains?
Hi,
I am creating sitemaps for a site which has more than 500 subdomains. The page count varies from 20 to 500 across the subdomains, and it will keep growing in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain, which they reference in that subdomain's own robots.txt file, e.g. http://windows7.iyogi.com/robots.txt with the XML sitemap for that subdomain at http://windows7.iyogi.com/sitemap.xml.gz. Currently our website has only one robots.txt file covering the main domain and all subdomains. Please tell me: should I create a separate robots.txt and XML sitemap file for each subdomain, or keep one file? Creating a separate XML sitemap for each subdomain seems infeasible, as we would have to verify each one in GWT separately. Is there any automatic way to do this, and do I have to ping separately if I add new pages in a subdomain? Please advise me.
Technical SEO | | vaibhav45
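The per-subdomain part of this can be automated: robots.txt supports a `Sitemap:` directive with a full URL, so a small script can emit one robots.txt per subdomain, each pointing at that subdomain's own sitemap. A minimal sketch, with hypothetical subdomain names and output paths (in practice the list would come from your DNS or CMS records):

```python
"""Sketch: generate a robots.txt per subdomain, each pointing at that
subdomain's own compressed sitemap. Names and paths are hypothetical."""
import os

SUBDOMAINS = ["windows7", "windows8", "printers"]  # hypothetical list
BASE_DOMAIN = "example.com"

for sub in SUBDOMAINS:
    host = f"{sub}.{BASE_DOMAIN}"
    os.makedirs(f"out/{host}", exist_ok=True)
    with open(f"out/{host}/robots.txt", "w", encoding="utf-8") as f:
        f.write("User-agent: *\n")
        f.write("Allow: /\n")
        # Sitemap takes an absolute URL, so each subdomain's robots.txt
        # can reference its own sitemap file.
        f.write(f"Sitemap: http://{host}/sitemap.xml.gz\n")
```

Each generated file then gets deployed to the root of its subdomain; this does not remove the need to verify subdomains separately in GWT, but it keeps the sitemap references maintainable as subdomains are added.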