Exclude Child URLs from XML Sitemap Generator (WordPress)
-
Hi all,
I was recommended the XML Sitemap Generator for WordPress by the very helpful Keith Bloemendaal and John Pring - however, I can't seem to exclude child URLs.
There is a section 'Exclude items' with a subsection 'Exclude posts'. I tried entering the URLs of the pages I don't want in the sitemap, but that didn't work. Then I read that you have to enter a list of "IDs" - I'm not sure where on earth to find that info; I tried the page name and the post= number from the URL, but neither worked.
I hope somebody can point me in the right direction - and apologies, I am a WordPress novice. I got no answers from the WordPress forums, so I turned right back to SEOmoz!
Cheers.
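For what it's worth, the exclusion fields in sitemap plugins of this kind typically expect a plain comma-separated list of numeric post/page IDs rather than URLs or page names, so the 'Exclude posts' box would hold something like this (the IDs here are made up for illustration):

615,616,720

The real IDs come from the wp-admin edit URL, as explained in the replies below.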
-
AH! You did it, Keith - I thought clicking 'Update' at the bottom would do it, but there's a little link hidden in some text at the top saying "rebuild the sitemap manually".
Finally it's done, thanks so much for your help!
Mark
-
Did you try generating a new sitemap after clicking update options, and then submitting it to Webmaster Tools?
Generally it will only update on its own when you add or delete pages.
-
I'm just trying to exclude these child URLs from the sitemap - in future I may block them entirely, but I certainly don't want to submit a sitemap with these URLs and then contradict that in robots.txt.
I have used the post ID numbers to exclude the pages from the sitemap, but they remain in place.
Thanks once again for your assistance and quick responses!
-
It may take some time for it to propagate to Google if that is what you are asking. Are you trying to block the pages/posts completely from search engines?
-
Hi Keith,
Thanks once again for the quick response. I have actually tried that method; however, when I check the live sitemap I can still see the pages in it. Very frustrating! Is it that the sitemap doesn't update live straight away? And just to confirm, I am clicking "Update Options" at the bottom - quite often it'll be something stupid like that!
Thanks,
Mark
-
Great question, and WP really should make this easier!
http://businessaccent.com/2009/06/08/what-is-my-wordpress-post-id-number-and-how-can-i-find-it/ - this article explains one way to find it. Alternatively, if you open the post/page for editing in the admin panel, the URL in your browser's address bar contains the post ID, e.g. www.yoursite.com/wp-admin/post.php?post=615&action=edit (615 is the post ID).
Hope that helped
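If there are many child pages to track down, a scripted listing can be quicker than opening each one in the editor. Below is a minimal sketch, assuming a reasonably modern WordPress install with the REST API enabled (which postdates this thread) and a hypothetical site URL; pages with a non-zero parent are the child pages whose IDs go into the 'Exclude posts' field.

import requests

SITE = "https://www.example.com"  # hypothetical site URL - replace with your own

def list_pages():
    """Yield every published page from the WordPress REST API."""
    page = 1
    while True:
        resp = requests.get(
            f"{SITE}/wp-json/wp/v2/pages",
            params={"per_page": 100, "page": page},
            timeout=30,
        )
        resp.raise_for_status()
        for item in resp.json():
            yield item
        if page >= int(resp.headers.get("X-WP-TotalPages", "1")):
            break
        page += 1

for p in list_pages():
    # parent == 0 means a top-level page; anything else is a child page
    if p["parent"]:
        print(p["id"], p["title"]["rendered"])

On older installs without the REST API, reading the post ID from the wp-admin edit URL as described above remains the simplest route.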
Related Questions
-
301 Redirects, Sitemaps and Indexing - How to hide redirected URLs from search engines?
We have several pages on our site, like http://www.spectralink.com/solutions, which redirect to a deeper page - in this case http://www.spectralink.com/solutions/work-smarter-not-harder. Both URLs are listed in the sitemap and both pages are being indexed. Should we remove the redirecting pages from the sitemap? Should we prevent the redirecting URLs from being indexed? If so, what's the best way to do that?
Technical SEO | HeroDesignStudio
-
301 Redirects Relating to Your XML Sitemap
Let's say you've got a website and it had quite a few pages that, for lack of a better term, were like an infomercial: 6-8 pages on slightly different topics all essentially saying the same thing. You could all but call it spam.
www.site.com/page-1
www.site.com/page-2
www.site.com/page-3
www.site.com/page-4
www.site.com/page-5
www.site.com/page-6
Now you've decided to consolidate all of that information into one well-written page, and while the previous pages may have been a bit spammy, they did indeed have SOME juice to pass through. Your new page is: www.site.com/not-spammy-page
You then 301 redirect the previous 'spammy' pages to the new page. Now the question: do I immediately re-submit an updated XML sitemap to Google, which would NOT contain the old URLs, thus making me assume Google would miss the 301 redirects and the SEO juice? Or do I wait a week or two, allow Google to re-crawl the site and see the existing 301s, and only once they've taken notice of the changes submit an updated sitemap? Probably a stupid question, I understand, but I want to ensure I'm following best practices given the situation. Thanks guys and girls!
Technical SEO | Emory_Peterson
-
Is there a way for me to automatically download a website's sitemap.xml every month?
From now on we want to store all of our sitemap.xml files over the coming years. It's a nice archive to have that lets us analyse how many pages we have on the website and which ones were removed or redirected. Any suggestions? (A minimal scripted sketch follows below.) Thanks
Technical SEO | DeptAgency
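For the sitemap-archiving question above, one simple approach is a small script run once a month by cron or any other scheduler. A minimal sketch, with a hypothetical sitemap URL and archive folder as placeholders:

import datetime
import pathlib
import urllib.request

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical URL - replace with your own
ARCHIVE_DIR = pathlib.Path("sitemap-archive")        # hypothetical local folder

ARCHIVE_DIR.mkdir(exist_ok=True)
stamp = datetime.date.today().strftime("%Y-%m")
target = ARCHIVE_DIR / f"sitemap-{stamp}.xml"

with urllib.request.urlopen(SITEMAP_URL, timeout=30) as resp:
    target.write_bytes(resp.read())

print(f"Saved {SITEMAP_URL} to {target}")

A crontab entry such as 0 3 1 * * python3 /path/to/archive_sitemap.py (03:00 on the 1st of every month, with a placeholder path) would then produce one dated copy per month without any manual work.
-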
Removing Media from WordPress
I've run the SEOmoz on-page report and found an interesting issue. I'm using WordPress, and it seems that every picture I add to my articles gets added as a separate page on the site. I'm having to go to each and every picture and create a meta tag and description for it, and I still get duplicate content issues all the same. On my Disqus system, the same pictures get added just as a page or article would. What can I do to avoid this?
Technical SEO | emasaa
-
Host sitemaps on S3?
Hey guys, I run a dynamic web service and I will start building static sitemaps for it pretty soon. The fact that my app lives on a multitude of servers doesn't make it easy to distribute frequently updated static files across them. My idea was to host the files in AWS S3 and point my robots.txt sitemap directive there. I'll use a sitemap index, so every other sitemap will be hosted on S3 as well. I could dynamically mirror the content of the S3 files through my app, but that would be a little more resource-intensive than just serving the static files from a common place. Any ideas? (A sketch of the robots.txt approach is below.) Thanks!
Technical SEO | tanlup
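For the S3 question above, the relevant detail is that the Sitemap directive in robots.txt takes an absolute URL, and the sitemaps.org cross-submission rules allow a sitemap hosted on another host (such as an S3 bucket) as long as it is referenced from the robots.txt of the site whose URLs it lists. A sketch with a made-up bucket name:

User-agent: *
Disallow:

Sitemap: https://example-bucket.s3.amazonaws.com/sitemaps/sitemap-index.xml

The sitemap index and the individual sitemaps it references can then all live in the bucket, updated by the app servers as needed.
-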
Best Practices for adding Dynamic URLs to an XML Sitemap
Hi guys, I'm working on an ecommerce website with all the product pages using dynamic URLs (we also have a few static pages, but there is no issue with them). The products on the site are updated every couple of hours (because we sell out or the special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools and am trying to avoid this (if possible). I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure what the best approach is. The URL structure for the products is as follows:
http://www.xyz.com/products/product1-is-really-cool
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest
Here are the 2 approaches I was considering:
1. Just include the dynamic product URLs within the same sitemap as the static URLs, using only http://www.xyz.com/products/ - this is so spiders have access to the folder the products are in and I don't have to create an automated sitemap for every product. OR
2. Create a separate automated sitemap that updates whenever a product is updated, and set the change frequency to hourly - this is so spiders always have as close to an up-to-date sitemap as possible when they crawl.
I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this (one possible sketch of approach 2 is below). Thanks heaps, LW
Technical SEO | seekjobs
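For the dynamic-URL question above, approach 2 (a separate, automatically regenerated product sitemap) is the one that keeps expired products - and therefore 404s - out of the submitted sitemap, because only live products are listed each time it is rebuilt. A minimal sketch with made-up product data; the URLs, filenames, and the live_product_urls() helper are placeholders for however the site actually looks up its current products.

import datetime
from xml.sax.saxutils import escape

def live_product_urls():
    # Hypothetical stand-in for a database or API call returning the
    # products that are currently live on the site.
    return [
        "http://www.xyz.com/products/product1-is-really-cool",
        "http://www.xyz.com/products/product2-is-even-cooler",
    ]

def build_product_sitemap():
    today = datetime.date.today().isoformat()
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in live_product_urls():
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(url)}</loc>")
        lines.append(f"    <lastmod>{today}</lastmod>")
        lines.append("    <changefreq>hourly</changefreq>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

with open("sitemap-products.xml", "w", encoding="utf-8") as f:
    f.write(build_product_sitemap())

Rebuilding this file whenever products change, and listing it in a sitemap index alongside the static sitemap, keeps the submitted URLs in step with what actually resolves.
-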
HTML Sitemap Pagination?
I'm creating an A-to-Z-style directory of internal pages within a site of mine; however, there are cases where there are over 500 links within the pages. I intend to use pagination (rel=next/prev) to avoid having too many links on a page, but am worried about indexation issues. Should I be worried? (A markup sketch is below.)
Technical SEO | DMGoo
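For the pagination question above, the rel=next/prev annotations mentioned are plain link elements in the head of each paginated page. A sketch with made-up URLs (worth noting that search engines have treated these links as a hint at most, so they are not a guarantee against indexation quirks):

<!-- on page 2 of the hypothetical A-to-Z directory -->
<link rel="prev" href="http://www.example.com/directory/page-1/">
<link rel="next" href="http://www.example.com/directory/page-3/">
-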
.%E2%80%9d breaking the URL in WordPress
My WordPress URLs are breaking, and there are 5,000 not-found URLs in Webmaster Tools due to some code being appended: %E2%80%9D. This code stands for a double quotation mark - ". Now the question is, where has my site gone wrong? These are the changes I have made:
1. Deleted a vBulletin forum - half of the errors are due to the forum being deleted directly.
2. Upgraded to WordPress 3.3 (the crawl errors did not show on the same day, but much later).
3. Upgraded to Bluehost Pro (the crawl errors did not show on the same day, but much later).
These are some of my speculations, but nonetheless I have no idea why this is happening. To give a further hint, the home page URL is being appended to the original URL:
http://www.marketing91.com/article/http://www.marketing91.com
http://www.marketing91.com/article/http://www.wrodpress.org
So these are the problems I am facing with my URLs. I can account for the deletion of the vBulletin forum, but that accounts for only half of the crawl errors. So, any replies or answers? (A quick way to verify what %E2%80%9D decodes to is sketched below.)
Technical SEO | hith2340
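For the question above, it is easy to confirm what that sequence is: %E2%80%9D is the percent-encoding of the curly right double quotation mark (U+201D), which usually points to smart quotes having been pasted into links, theme files, or redirect rules somewhere. A quick check:

from urllib.parse import unquote

print(unquote("%E2%80%9D"))              # prints the curly closing quote
print(unquote("%E2%80%9D") == "\u201d")  # True - U+201D RIGHT DOUBLE QUOTATION MARK

If that character matches the stray quotes in the broken URLs, the next step is to find where a pasted smart quote ended up in a link, template, or redirect.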