Should XML sitemaps include *all* pages or just the deeper ones?
-
Hi guys,
OK, this is a bit of a sitemap 101 question, but I can't find a definitive answer:
When we're putting out XML sitemaps for Google to chew on (we're talking ecommerce and directory sites with many pages inside sub-categories here), is there any point in listing the homepage or even the second-level pages? We know Google is crawling and indexing those, so we're thinking we should trim the fat and just submit a map of the bottom-level pages.
What do you think?
-
It is correct that DA, PA, depth of pages, etc. are all factors in determining which pages get indexed. If your site offers good navigation, reasonable backlinks, anchor text, etc., then you can get close to all pages indexed, even on a very large site.
Your sitemap should naturally include a date on every link (the <lastmod> element) indicating when that content was added or changed. Even if you submit a list of 10,000 links, Google can evaluate the date on each one and determine which content has been added or modified since your site was last crawled.
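To make that concrete, here's a minimal sketch of emitting a sitemap where every `<url>` entry carries a `<lastmod>` date, so a crawler can tell which pages have changed since its last visit. The URLs and dates are made-up examples, not from any real site.

```python
# Sketch: build a sitemaps.org-format XML sitemap with a <lastmod>
# date on every entry. Page URLs and dates below are hypothetical.
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    """pages: iterable of (url, iso_date) tuples -> sitemap XML string."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap([
    ("https://example.com/", "2011-01-10"),
    ("https://example.com/widgets/blue-widget/", "2011-01-14"),
]))
```

Whether you list every page or only the deep ones, keeping `<lastmod>` accurate is what lets Google use the file as a change feed.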
-
Well yes, that's kinda my point. We do have sensible, crawlable navigation, so there will be no problems there; the sitemap then really becomes an indicator of what needs to be crawled (new and updated pages). But then the same question stands...
With other sites we've managed that had thousands of pages, we've found it detrimental to give Google hundreds of pages to crawl in a sitemap that we don't feel are important. We're pretty sure (and SEOmoz staff have supported this) that domain authority and the number of pages you can get into the index are closely related.
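The "trim the fat" idea above can be sketched as a simple filter: keep only URLs whose path is at least a chosen number of segments deep, so the sitemap lists just the bottom-level pages Google is least likely to reach on its own. This is a hypothetical illustration; the `min_depth` cutoff and the example URLs are assumptions, not anything from the thread.

```python
# Sketch: filter a URL list down to "deep" pages only, by counting
# path segments. min_depth=3 is an arbitrary example threshold.
from urllib.parse import urlparse

def deep_pages_only(urls, min_depth=3):
    """Keep URLs whose path has at least min_depth segments."""
    keep = []
    for url in urls:
        segments = [s for s in urlparse(url).path.split("/") if s]
        if len(segments) >= min_depth:
            keep.append(url)
    return keep

urls = [
    "https://example.com/",                          # homepage: dropped
    "https://example.com/widgets/",                  # category: dropped
    "https://example.com/widgets/blue/large-blue/",  # deep product: kept
]
print(deep_pages_only(urls))
```

In practice you'd pick the depth cutoff per site, since "bottom level" means different things for a three-level directory versus a six-level ecommerce catalogue.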
-
Tim,
We always include ALL pages...Google's XML sitemap help also suggests including all pages of your site in the XML sitemap.
-
Your sitemap should include every page of your site that you wish to be indexed.
The idea is that if your site does not provide crawlable navigation, Google can use your sitemap to crawl your site. Some sites are built in Flash, and when a crawler lands on a page there is absolutely nowhere for it to go.
If your site navigation is solid, then a sitemap doesn't offer Google any value other than an indication of when content is updated or added.