Sitemaps for landing pages
-
Good morning MOZ Community,
We've been doing some revamping recently on our primary sitemap, and it's currently being reindexed by the search engines.
We have also been developing landing pages for both SEO and SEM. For SEO specifically, the pages target specific, long-tail search terms in a number of our niche areas of focus. Should I, or do I need to, consider a separate sitemap for these? Everything I have read about sitemaps simply indicates that if a site has over 50,000 pages or so, then you need to split the sitemap.
Do I need to worry about a sitemap for landing pages? Or simply add them to our primary sitemap? Thanks in advance for your insights and advice.
-
Yes, any sitemap that contains over 50,000 URLs should be split, with a sitemap index file listing the resulting category-specific XML sitemaps. These are best organized to mirror the hierarchy of the website, which reinforces your semantic URL structure.
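For reference, a sitemap index is just a small XML file that points at each child sitemap. A minimal sketch (the file names here are hypothetical; the namespace is the standard one from the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Hypothetical child sitemaps, one per section of the site -->
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-landing-pages.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit the index file to Search Console and it discovers the child sitemaps from it.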
-
John,
Good to know – at this point I only have our primary sitemap submitted to Search Console, but I will create and add a secondary sitemap. I don't see us adding a ton of additional sitemaps; do you still suggest making a sitemap index of sorts?
-
Absolutely no harm at all. Do you have an index sitemap that lists all the sub-sitemaps? If not, you should create one as well, just for the sake of sane sitemap management.
-
John,
Thanks so much for the reply – So there's no harm in submitting a secondary sitemap, specifically for landing pages? Great to hear and yes, many of the landing pages overlap for both SEO and PPC.
Thanks!
Brendan -
Hi there! Good question.
First, each individual XML sitemap can contain a maximum of 50,000 URLs (and must be no larger than 50 MB uncompressed). At the scale of millions of pages, I always recommend splitting your sitemaps by type so that you can monitor indexation by section of the site.
If I were you I'd create a separate sitemap for landing pages and exclude the PPC landing pages unless those are the same pages you've created for SEO.
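The advice above (chunks of at most 50,000 URLs, plus an index pointing at each chunk) can be sketched in a few lines. This is a minimal illustration, not a production tool; the `example.com` URLs and file-name scheme are hypothetical:

```python
# Sketch: split a flat list of URLs into sitemap files of at most
# 50,000 entries each, then build a sitemap index referencing them.
MAX_URLS_PER_SITEMAP = 50_000

def build_sitemap(urls):
    """Return one <urlset> sitemap as an XML string."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>")

def build_index(sitemap_urls):
    """Return a <sitemapindex> listing each child sitemap URL."""
    entries = "\n".join(f"  <sitemap><loc>{u}</loc></sitemap>"
                        for u in sitemap_urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</sitemapindex>")

def split_sitemaps(urls, base="https://www.example.com/sitemap"):
    """Chunk urls into <= 50k-entry sitemaps; return (index_xml, sitemap_xmls)."""
    chunks = [urls[i:i + MAX_URLS_PER_SITEMAP]
              for i in range(0, len(urls), MAX_URLS_PER_SITEMAP)]
    sitemaps = [build_sitemap(c) for c in chunks]
    index = build_index(f"{base}-{n}.xml" for n in range(1, len(chunks) + 1))
    return index, sitemaps
```

In practice you would split by page type (products, landing pages, blog posts) rather than by raw count, so that Search Console's per-sitemap indexation stats map onto sections of the site.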
Cheers!
Related Questions
-
Have you ever seen a page get indexed from a website that is blocked by robots.txt?
Hi all, We use the robots.txt file and meta robots tags to block bots from crawling a website or individual pages. Usually robots.txt is applied site-wide, with the expectation that none of the pages will get indexed. But any page from the site can still be indexed by Google even when the site is blocked by robots.txt, because the crawler may find a link to the page somewhere else on the internet, as stated in the last paragraph here. I wonder if this is really how some web pages have ended up indexed. And if we use meta tags at the page level, do we still need to block in robots.txt? Can we use both techniques at the same time? Thanks
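To illustrate the distinction the question is circling (the `/private/` path is hypothetical): robots.txt blocks crawling, not indexing, so a disallowed URL can still appear in the index (URL only, no snippet) if other sites link to it:

```
# robots.txt at the site root: blocks crawling only. A disallowed URL
# can still be indexed without a snippet if external links point to it.
User-agent: *
Disallow: /private/
```

To keep a page out of the index, the page itself needs `<meta name="robots" content="noindex">`, and crucially it must NOT be disallowed in robots.txt, because Googlebot can only see the meta tag if it is allowed to crawl the page. Combining both techniques on the same URL is therefore counterproductive.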
-
How do we hide duplicate pages from the SERPs? What's the best practice for increasing visibility of new pages?
Hi all, We have a total of 4 pages on the same topic with similar keywords. These pages live on our main domain and on subdomains. Because the subdomain pages are years old and have been receiving visits from the SERPs, they stick to the 1st position. But we have recently created new pages on our main domain which we expect to rank in the 1st position instead. I am planning to hide the subdomain pages from the SERPs using "Remove URLs" for some days to increase visibility for the new pages on the main domain. Is this the right and best practice? Thanks
-
The evolution of Google's 'Quality' filters: do thin product pages still need noindex?
I'm hoping that Mozzers can weigh in with any recent experiences with eCommerce SEO. I like to assume (perhaps incorrectly) that Google's 'Quality' filters (formerly known as Panda) have evolved with some intelligence since Panda first launched and started penalising eCommerce sites for having thin product pages. On that basis I'd expect that the filters are now less heavy-handed and recognise that product pages with little or no description can still be a quality user experience for people who want to buy that product. Therefore my question is this: do thin product pages still need noindex, given that more often than not they are a quality search result for a product-specific query? Has anyone experienced a penalty recently (in the last 12 months) on an eCommerce site because of a high number of thin product pages?
-
Dealing with an omitted page
For my most competitive term, the wrong page ranks (and not well, either). The landing page I built for it has never shown up for that term except after I include the omitted results. The page that does rank is the category page above it. All that's fine, because neither page was all that great... BUT I have completely re-written the content for the landing page and added local area pictures, local testimonials, and a video. So here's my question: should I put all that content on the landing page that's been omitted, or tweak the page that ranks and put it there? To me it makes the most sense to put the content on the page that has been omitted, but I don't know how Google treats pages that have been omitted in the past. Will it have some sort of bias against the page because it was omitted so many times for that keyword? Or will it be treated just like any other page, so that if the content is good enough it will rank just fine? If anyone's dealt with this, I'd love to hear all about it! Thanks, Ruben
-
Recent on-page optimisation changes have had a negative impact!
Hi, I recently updated some page titles, H1 tags, and on-page content, and overall our search rankings have slipped, following the first site crawl by Google, I assume. My question is: should I try to win back the rankings now by testing and changing one thing at a time to see the impact, or should I wait for things to settle down once Google has crawled the site a few more times? Or will the subsequent crawls have no impact? Thanks, Ash
-
Help with page load time
I'm trying to track the page load time of visits on my site, but GA reports it as zero, and the page load sample is always zero too. From my research, I found that GA is supposed to track page load time automatically - isn't it?
-
Stop Google indexing CDN pages
Just when I thought I'd seen it all, Google hits me with another nasty surprise! I have a CDN to deliver images, JS, and CSS to visitors around the world. I have no links to static HTML pages on the site, as far as I can tell, but someone else may have - perhaps a scraper site? Google has decided the static pages it was able to access through the CDN have more value than my real pages, and it seems to be slowly replacing my pages in the index with the static pages. Anyone got an idea of how to stop that? Obviously, I have no access to the static area, because it is in the CDN, so there is no way I know of to have a robots file there. It may be that I have to trash the CDN and change it to only allow the image directory, and maybe set up a separate CDN subdomain for content that only contains the JS and CSS? Have you seen this problem and beaten it? (Of course, the next thing is Roger might look at Google results and start crawling them too, LOL) P.S. The reason I am not asking this question in the Google forums is that others have asked it many times and nobody at Google has bothered to answer over the past 5 months, and nobody who did try gave an answer that was remotely useful. So I'm not really hopeful of anyone here having a solution either, but I expect this is my best bet because you guys are always willing to try.
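Since robots.txt is evaluated per hostname, a CDN that serves from its own subdomain needs its own robots.txt at that subdomain's root; many CDNs allow you to override this file even when you can't touch the origin's static area. One possible sketch (the subdomain and directory names are hypothetical):

```
# robots.txt served at https://cdn.example.com/robots.txt
# (hostname and paths are hypothetical). Blocks everything on the
# CDN host except the asset directories; the more specific Allow
# rules take precedence over Disallow for Googlebot.
User-agent: *
Disallow: /
Allow: /images/
Allow: /css/
Allow: /js/
```

This keeps the duplicate HTML on the CDN host out of the crawl while leaving the assets fetchable. Adding a canonical link header pointing at the origin pages is another commonly suggested option.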
-
Why would my product pages no longer be indexed in Google?
Our UK site has 72 pages in its sitemap. 30 of them are product pages which take a productid parameter. Prior to 1st Feb 2011, all pages were indexed in Google, but since then all of our product pages seem to have dropped from the index. If I check in Webmaster Tools, I can see that we have submitted 72 pages and 42 are indexed. I realise we should have better URL structuring and I'm working on that, but do you have any ideas on how we can get our product pages back into Google's index? http://www.ebacdirect.com