What is the best tool for generating a sitemap for a website with over 4k pages?
-
I have just migrated my website from Hugo to WordPress and I want to submit the sitemap to Google Search Console (because I haven't done so in a couple of years). It looks like there are many tools for building a sitemap file, but I suspect they vary in quality, especially given the size of my site.
-
Screaming Frog is good for crawling an existing sitemap.xml file and can indeed produce sitemap.xml files, but if your site is medium-sized (thousands of URLs) then really you'd want a dynamic one. The Yoast SEO plugin for WordPress has this built in, with some tweak options and variables; I'd probably start there.
With Screaming Frog you'd have to keep manually re-building your sitemap XML / sitemap index file, which sounds pointless and tedious when relatively stable dynamic options exist.
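For anyone curious what a sitemap generator actually emits, here is a minimal sketch in Python. The URLs are made-up examples; a dynamic plugin like Yoast produces this same format on the fly for every request, which is why it stays current without manual re-builds.

```python
# Minimal sketch of building a sitemap.xml from a list of page URLs.
# The example URLs below are hypothetical.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap.xml document (as a string) for the given URLs."""
    ET.register_namespace("", SITEMAP_NS)  # serialize without a prefix
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for page in urls:
        url_el = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url_el, "{%s}loc" % SITEMAP_NS).text = page
    body = ET.tostring(urlset, encoding="unicode")
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + body

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

Worth noting: the sitemap protocol caps each file at 50,000 URLs (and 50 MB uncompressed), so larger sites publish a sitemap index that points at several files like this one.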
-
As Keen already stated, Screaming Frog is the best option
-
We use Screaming Frog, and we're at about 80k pages.
Related Questions
-
My SEO friend says my website is not being indexed by Google despite the keywords he has placed in the page and URL. What does that mean?
We have added some text to the pages with keywords related to each page.
Technical SEO | AlexisWithers
-
Website SEO Product Pages - Condense Product Pages
We are managing a website that has seen consistently dropping rankings over the last 2 years (http://www.independence-bunting.com/). Our long-term strategy has been purely content-based and of high quality, but it isn't seeing the desired results. It is an ecommerce site with a lot of pages, most of which are category or product pages. Many of the product pages have duplicate or thin content, which we currently see as one of the primary reasons for the ranking drops.
The website has many individual products which share the same fabric and size options but have different designs, so it is difficult to write valuable content that differs between several products with similar designs. Right now each design has its own product page. We have a dilemma, because our options are:
A. Combine similar designs into one product page where the customer must choose a design, a fabric, and a size before checking out. This way we can have valuable content and don't have to duplicate it across pages or try to find more to say about something there really isn't anything else to say about. However, this would remove between 50% and 70% of the pages on the website. We know the number of indexed pages matters to search engines, and if they suddenly see that half of our pages are gone, we may cause more negative effects, despite the fact that we are aiming to provide more value to the user, not less.
B. Leave the product pages alone and try to write more valuable content for each one, which will be difficult because there really isn't that much more to say, or more valuable ways to say it. This is the "safe" option: the potential negative impact is reduced, but we won't necessarily see much positive trending either.
C. Test solution A on a small percentage of the product categories, watch the impact over the next several months, and then make sitewide updates if it is positive or revert if it is negative.
Any sound advice would be of incredible value at this point, as the work we are doing isn't having the desired effect and rankings keep dropping. Any information would be greatly appreciated. Thank you.
Technical SEO | Ed-iOVA
-
How do I get my pages to go from "Submitted" to "Indexed" in Google Webmaster Tools?
Background: I recently launched a new site and it's performing much better than the old site in terms of bounce rate, page views, pages per session, session duration, and conversions. As expected, sessions, users, and % new sessions are all down, which I'm okay with because the old site had a lot of low-quality traffic going to it. The traffic we have now is much more engaged and targeted. Lastly, the site was built on Squarespace and launched in the middle of August.
Question: When reviewing Google Webmaster Tools' Sitemaps section, I noticed it says 57 web pages submitted, but only 5 indexed! The sitemap that's submitted seems to be all there. I'm not sure if this is a Squarespace thing or what. Anyone have any ideas? Thanks!
Technical SEO | Nate_D
-
If I want clean up my URLs and take the "www.site.com/page.html" and make it "www.site.com/page" do I need a redirect?
If this scenario requires a 301 redirect no matter what, I might as well update the URL to be a little more keyword-rich for the page while I'm at it. However, since these pages are ranking well, I'd rather not lose any authority in the process and just keep the URL stripped of the ".html" (if that's possible). Thanks for your help! [edited for formatting]
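For reference, the usual way to do this on an Apache server is a pair of rewrite rules. This is only a sketch, assuming .htaccess with mod_rewrite enabled (nginx and other servers differ); test on staging before relying on it:

```apache
RewriteEngine On

# 301 the old .html URL to the clean one, matching against THE_REQUEST
# (the visitor's original request line) so the internal rewrite below
# can't trigger a redirect loop
RewriteCond %{THE_REQUEST} \s/[^?\s]+\.html[\s?]
RewriteRule ^(.+)\.html$ /$1 [R=301,L]

# Quietly serve the underlying .html file for the clean URL
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.+)$ $1.html [L]
```

The THE_REQUEST condition is the important detail: without it, the internal rewrite back to the .html file would re-trigger the 301 and loop.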
Technical SEO | Booj
-
Removing Redirected URLs from XML Sitemap
If I'm updating a URL and 301 redirecting the old URL to the new URL, Google recommends I remove the old URL from our XML sitemap and add the new URL. That makes sense. However, can anyone speak to how Google transfers the ranking value (link value) from the old URL to the new URL? My suspicion is this happens outside the sitemap. If Google already has the old URL indexed, the next time it crawls that URL, Googlebot discovers the 301 redirect and that starts the process of URL value transfer. I guess my question revolves around whether removing the old URL (or the timing of the removal) from the sitemap can impact Googlebot's transfer of the old URL value to the new URL.
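That suspicion matches the generally understood behavior: the value transfer happens when Googlebot crawls the old URL and sees the 301, not through the sitemap, which is mainly a discovery aid. The sitemap bookkeeping itself is easy to script; here is a sketch in Python, where the redirect map is a made-up example (in practice you'd export it from your redirect config or a crawl):

```python
# Sketch: swap 301'd URLs in a sitemap for their redirect targets.
# The redirect map passed in below is hypothetical example data.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def prune_redirects(sitemap_xml, redirects):
    """Replace any <loc> found in `redirects` with its 301 target."""
    ET.register_namespace("", SITEMAP_NS)
    root = ET.fromstring(sitemap_xml)
    for loc in root.iter("{%s}loc" % SITEMAP_NS):
        if loc.text in redirects:
            loc.text = redirects[loc.text]
    return ET.tostring(root, encoding="unicode")

old_map = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/old-page</loc></url>
</urlset>"""

print(prune_redirects(old_map,
                      {"https://example.com/old-page": "https://example.com/new-page"}))
```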
Technical SEO | RyanOD
-
Redirecting website page to another
Hi there, one of the old pages on my site is currently ranking for a phrase that I want to rank for on a new page I created. My old page from a year ago is ranking for 'Property Management Training' (it's a blog post dating from 2011). I have created a new main page on my site and would like it to rank for 'Property Management', as that's more relevant. What is the best way to keep my ranking but have people go to my new page? A 301 redirect from the old page to the new one? Thanks,
Technical SEO | daracreative
-
H-tags and Page Name best practice
For the past few months I've been working on a new site launch, but have been left with a couple of annoyances from my predecessor. I've got a couple of questions about best practice (and whether it's worth changing now). For reference, a good example page is http://polestars.net/hen-party/life-drawing-hen-party/
H-tags: The (external) web designer has insisted that wrapping the logo in an H1 tag (with the same branded H1 text on every page) and using an H2 for the actual title of the page is fine. I really don't believe him, but at the same time feel like maybe Google is smart enough to discern the theme of a page with this structure. Is it worth having this changed so that the actual title is the first H1?
Page naming convention: Another annoyance I've been left with is that every product page is named the same way: "burlesque hen party", "life drawing hen party", "whatever hen party"... It looks a little weird, but my real concern is that, as we now have 60 "hen party" links in the navigation menu of a bunch of our pages, this may be seen as keyword stuffing. Is this a real concern, or am I overthinking it?
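For what it's worth, the conventional markup the H-tags question is circling looks like this (the link text and headings are hypothetical, loosely based on the example page above):

```html
<!-- Logo as a plain link, not a heading -->
<a href="/">Polestars</a>

<!-- One H1 per page, unique to that page -->
<h1>Life Drawing Hen Party</h1>

<!-- Subsections step down from there -->
<h2>What's included</h2>
```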
Technical SEO | AlecPR
-
What is the best way to deal with pages whose content changes?
My site features businesses that offer activities for kids. Each business has its own page on my site. Business pages contain a listing of the different activities that organization is putting on (such as events, summer camps, and drop-in activities). Some businesses only offer seasonal activities (for example, during Christmas break and summer camps); the rest of the year the business has no activities and the page is empty. This creates two problems. It's poor user experience (which I can fix, no problem), but it is also thin content and sometimes treated as duplicate content. What's the best way to deal with pages whose content can be quite extensive at certain points of the year and shallow or empty at others? Should I include a meta robots tag to not index the page when there is no content, and change the tag to index when there is content? Should I just ignore this problem? Should I remove the page completely and do a redirect? Would love to know people's thoughts.
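If the meta robots route is chosen, the tag itself is the easy part; the template just has to emit it only while the listing is empty. A sketch of the output (the conditional logic lives in whatever templating system the site uses):

```html
<!-- Rendered in <head> only while the business has no current activities -->
<meta name="robots" content="noindex, follow">
```

Using "noindex, follow" drops the page from the index while still letting crawlers follow its links; removing the tag when content returns is valid, though how quickly Google recrawls and re-indexes is up to Google.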
Technical SEO | ChatterBlock