How Do I Generate a Sitemap for a Large WordPress Site?
-
Hello Everyone!
I am working with a WordPress site that is in Google News (i.e. every day we have about 30 new URLs to add to our sitemap). The site has years of articles, resulting in about 200,000 pages. Our strategy so far has been to use a sitemap plugin that only generates the last few months of posts, but we want to improve our SEO and submit all of the site's URLs to search engines.
The issue is that the plugins we've looked at generate the sitemap on the fly: when you request the sitemap, the plugin builds it dynamically. Our site is so large that even a single request for sitemap.xml ties up significant server resources and takes an extremely long time, if the request doesn't time out entirely.
Does anyone have a solution?
Thanks,
Aaron
-
In my case, xml-sitemaps works extremely well. I fully understand that a DB solution would avoid the need to crawl, but the features I get from xml-sitemaps are worth it.
I am running my website on a powerful dedicated server with SSDs, so perhaps that's why I'm not having any problems. I also set limits on the generator's memory consumption and activated the feature that saves temp files in case the generation fails.
-
My concern with recommending xml-sitemaps is that I've always had problems getting good, complete maps of extremely large sites. An internal CMS-based tool grabs pages straight from the database instead of having to crawl for them.
You've found that it gets you a pretty complete crawl of your 5K-page site, Federico?
-
I would go with the paid solution of xml-sitemaps.
You can set all the resources you want it to have available, and it will store temp files to avoid excessive memory consumption.
It also offers settings to create large sitemaps using a sitemap index, and there are plugins that create the news sitemap automatically by looking for changes since the last sitemap generation.
I have it running on my site with 5K pages (excluding tag pages), and it takes 10 minutes to crawl.
There are also plugins that create sitemaps dynamically, like SEO by Yoast, Google XML Sitemaps, etc.
-
I think the solution to your server resource issue is to create multiple sitemaps, Aaron. Given that the sitemap protocol allows a maximum of 50,000 URLs per sitemap and Google News sitemaps can't exceed 1,000 URLs, this was going to be a necessity anyway, so you may as well use these limitations to your advantage.
The sitemap protocol includes a feature called a sitemap index. It simply lists all the sitemap .xml files you've created so that search engines can find and index them. You put it at the root of the site and link to it in robots.txt just like a regular sitemap (you can also submit it in GWT). In fact, Yoast's SEO plugin and others already use exactly this functionality for their News add-on.
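To make the structure concrete, here's a minimal sketch of generating a sitemap index with Python's standard library. The example.com URLs and file names are placeholders, not anything Aaron's site actually uses:

```python
from xml.etree import ElementTree as ET

def build_sitemap_index(sitemap_urls):
    """Build a sitemap index document listing each child sitemap."""
    root = ET.Element("sitemapindex",
                      xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in sitemap_urls:
        entry = ET.SubElement(root, "sitemap")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(root, encoding="unicode")

# One entry per child sitemap: the dynamic news map plus static archives
index_xml = build_sitemap_index([
    "https://example.com/sitemap-news.xml",
    "https://example.com/sitemap-archive-1.xml",
    "https://example.com/sitemap-archive-2.xml",
])
print(index_xml)
```

You'd save the output as something like sitemap_index.xml at the site root and reference it with a `Sitemap:` line in robots.txt, the same way you would a single sitemap.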
In your case, you could build the News sitemap dynamically to meet its special requirements (a maximum of 1,000 URLs, covering only the last two days of posts) and to ensure it's up-to-the-minute accurate, as is critical for news sites.
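The News sitemap's freshness rules boil down to a simple filter. This is just an illustrative sketch with hypothetical helper names, assuming you can pull (URL, publish time) pairs from the CMS, newest first:

```python
from datetime import datetime, timedelta, timezone

NEWS_MAX_URLS = 1000             # Google News sitemap URL cap
NEWS_WINDOW = timedelta(days=2)  # only articles from the last two days

def select_news_posts(posts, now=None):
    """posts: list of (url, published_at) tuples, newest first."""
    now = now or datetime.now(timezone.utc)
    recent = [(url, ts) for url, ts in posts if now - ts <= NEWS_WINDOW]
    return recent[:NEWS_MAX_URLS]

# e.g. keep a post from yesterday, drop one from last week
now = datetime(2024, 1, 10, tzinfo=timezone.utc)
posts = [
    ("https://example.com/fresh-story", datetime(2024, 1, 9, tzinfo=timezone.utc)),
    ("https://example.com/old-story", datetime(2024, 1, 1, tzinfo=timezone.utc)),
]
selected = select_news_posts(posts, now=now)
```

Regenerating this small list on each request is cheap, which is why the dynamic approach is fine for the News map even though it's a problem for the full 200K-page map.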
Then, separately, you would build additional, segmented sitemaps for the existing 200,000 pages. Since these are historical pages, you could easily serve them from static files; they wouldn't need to change once created. With static files, there'd be no server load to serve them each time, only the load of generating the current news sitemap. (I'd actually recommend keeping each static sitemap to around 25,000 URLs to ensure search engines can crawl them easily.)
This approach would involve a bit of fiddling to set up initially, as you'd need to generate the "archive" sitemaps and then convert them to static versions. But once it's in place, the News sitemap would take care of itself, and once a month (or on whatever schedule you decide) you'd add the pages "expiring" from the News sitemap to the most recent "archive" segment. A smart programmer might even be able to automate that process.
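The archive segmentation described above is easy to automate. A rough sketch, assuming the 200,000 URLs can be pulled from the database in one list (the example.com URLs are placeholders):

```python
from xml.etree import ElementTree as ET

CHUNK_SIZE = 25_000  # suggested per-file cap; the protocol max is 50,000

def chunk_urls(urls, size=CHUNK_SIZE):
    """Split the full URL list into fixed-size segments."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def write_archive_sitemap(urls, path):
    """Write one static sitemap file for a segment of archive URLs."""
    root = ET.Element("urlset",
                      xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(root, "url")
        ET.SubElement(entry, "loc").text = url
    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

# 200,000 archive URLs split into 8 static files of 25,000 each
all_urls = [f"https://example.com/post-{i}" for i in range(200_000)]
segments = chunk_urls(all_urls)
print(len(segments))  # 8
```

Run on a schedule (a monthly cron job, say), a script like this would regenerate the static segments and fold in the URLs rotating out of the News sitemap, with zero cost at request time since the files are served as plain static XML.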
Does this approach sound like it might solve your problem?
Paul
P.S. Since you'd already have the sitemap index capability, you could also add video and image sitemaps to your site if appropriate.
-
Have you ever tried a web-based sitemap generator? I'm not sure how it would handle a site your size, but at least the crawl would be running on someone else's server, right?
Not sure what else to suggest, honestly.