Multiple Sitemaps Vs One Sitemap and Why 500 URLs?
-
I have a large website with rental listings in 14 markets; listings are added and taken off weekly, if not daily. There are hundreds of listings in each market, and each has its own landing page with a few associated pages. What is the best process here? I could run one sitemap and give each market's landing page a priority of 0.8, or make 14 sitemaps, one per market, plus one sitemap for the general and static pages.
From there, what would be the better way to structure? Should I keep all the big main landing pages in the general static sitemap or have them be at the top of the market segmented sitemaps?
Also, I have over 5,000 URLs. What is the best way to generate a sitemap with more than 500 URLs, and is it even necessary?
-
Hi Dom, maybe you'll find something here: http://moz.com/community/q/does-anyone-know-of-any-tools-that-can-help-split-up-xml-sitemap-to-make-it-more-efficient-and-better-for-seo
Splitting a sitemap can be useful for keeping track of which URLs are indexed, but it's quite normal to run a single sitemap with 5,000 URLs. The maximum is 50,000 URLs per sitemap file, so splitting isn't necessary.
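If you do later outgrow one file, or just want per-chunk files for tracking, the mechanics are straightforward: split the URL list into files of at most 50,000 entries each and reference every file from a sitemap index. A minimal sketch in Python; the domain and file names are placeholders, not anything from the question:

```python
from xml.sax.saxutils import escape

MAX_URLS = 50000  # sitemaps.org limit per sitemap file

def build_sitemaps(urls, base="https://example.com/sitemaps"):
    """Split a URL list into <=50k chunks; return (index_xml, [sitemap_xml, ...])."""
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    sitemaps = []
    for chunk in chunks:
        entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in chunk)
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>"
        )
    index_entries = "\n".join(
        f"  <sitemap><loc>{base}/sitemap-{i + 1}.xml</loc></sitemap>"
        for i in range(len(chunks))
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index_entries}\n</sitemapindex>"
    )
    return index, sitemaps
```

At 5,000 URLs this all fits in one file; the split only starts mattering past the 50,000-URL (or 50 MB uncompressed) per-file limit.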
Greetings, Leonie
Related Questions
-
We 410'ed URLs to decrease URLs submitted and increase crawl rate, but dynamically generated sub URLs from pagination are showing as 404s. Should we 410 these sub URLs?
Hi everyone! We recently 410'ed some URLs to decrease the number of URLs submitted and hopefully increase our crawl rate. We had some dynamically generated sub-URLs for pagination that are showing as 404s in Google. These sub-URLs were canonicalized to the main URLs and not included in our sitemap. Ex: we assumed that if we 410'ed example.com/url, then the dynamically generated example.com/url/page1 would also 410, but instead it 404'ed. Does it make sense to go through and 410 these dynamically generated sub-URLs, or is it not worth it? Thanks in advance for your help! Jeff
Intermediate & Advanced SEO | jeffchen0 -
CcTLDs vs folders
My company is looking at expanding internationally; we currently have subdomains in the UK and Canada. I'm making recommendations on improving SEO, and one of the parts I'm struggling with is the benefit of ccTLDs versus using folders. I know the basic argument that Google recognizes ccTLDs as being geo-specific, so they get priority. But I'd like to know HOW much priority they get. We have unique keywords and a pretty strong domain; is having a ccTLD so much better that it'd be worth going that route rather than creating folders within our current domain? Thanks, Jacob
Intermediate & Advanced SEO | jacob.young.cricut0 -
Sitemap generator which only includes canonical urls
Does anyone know of a 3rd-party sitemap generator that will only include the canonical URLs? Creating a sitemap with geo- and sorting-based parameters isn't the most ideal way to generate sitemaps. Please let me know if anyone has any ideas. Mind you, we have hundreds of thousands of indexed URLs, and this can't be done with a simple text editor.
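I can't point to a specific tool, but the logic such a generator needs is small: fetch each page, read its rel=canonical tag, and include the URL only if the page points to itself (or has no tag at all). A rough sketch of that filter, with the actual fetching/crawling left out and all names hypothetical; at hundreds of thousands of URLs you would stream pages from your crawler rather than hold them in memory:

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Pull the rel=canonical href out of a page's markup, if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

def is_self_canonical(url, html):
    """A URL belongs in the sitemap only if it canonicalizes to itself (or has no tag)."""
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical is None or parser.canonical == url

def filter_for_sitemap(pages):
    """pages: iterable of (url, html) pairs. Returns only self-canonical URLs."""
    return [url for url, html in pages if is_self_canonical(url, html)]
```

Parameterized variants (geo, sorting) then drop out automatically, because their canonical tag points at the clean URL rather than at themselves.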
Intermediate & Advanced SEO | recbrands0 -
XML Sitemap index within a XML sitemaps index
We have a similar problem to http://www.seomoz.org/q/can-a-xml-sitemap-index-point-to-other-sitemaps-indexes (can an XML sitemap index point to other sitemap indexes?). According to the "Unique Doll Clothing" example at this link, it seems possible: http://www.seomoz.org/blog/multiple-xml-sitemaps-increased-indexation-and-traffic Can someone share an example of an XML sitemap index within an XML sitemap index? We are looking for the format so we can implement the same on our website.
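For reference, a sitemap index is flat: it lists sitemap files, and as far as the sitemaps.org protocol and Google's documentation go, an index file can't point at another index file, so nesting indexes generally isn't supported. The flat format looks like this (URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
    <lastmod>2013-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```

Each `<loc>` must be a plain sitemap (`urlset`) file, and one index can reference up to 50,000 of them, which is usually enough to avoid wanting a second level.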
Intermediate & Advanced SEO | Lakshdeep0 -
Should I change wordpress urls?
Should I change my WordPress permalinks to include the keyword? For example, at the minute my URL is http://www.musicliveuk.com/home/wedding-singer. Is it better to be http://www.musicliveuk.com/live-bands/wedding-singer? 'home' is not relevant, so surely 'live-bands' would be better? If I change the URLs, won't I lose 'link juice', as external links will all point to a URL that no longer exists? Or will WordPress automatically redirect the old URL to the new one? Finally, if I should change the URL as described, how do I do it in WordPress? I can only see how to edit the last bit of the URL, not the middle bit.
Intermediate & Advanced SEO | SamCUK0 -
Renaming a URL
Hi, if we rename a URL, e.g. http://www.opentext.com/2/global/company/company-ecm-positioning.htm to http://www.opentext.com/2/global/products/enterprise-content-management.htm (or something similar), would search engines recognize that as a new page altogether? I know they would need to reindex it accordingly, so in theory it is kind of a "new" page. But the reason for doing this is to maintain the page's metrics (inbound links, authority, social activity, etc.) instead of creating a new page from scratch. The page has been indexed highly in the past, so we want to keep it active but optimize it better and redirect other internal content (that's being phased out) to it to juice it up even more. Thanks in advance!
Greg
Intermediate & Advanced SEO | pstables0 -
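For what it's worth, the usual way to keep those metrics through a rename is a permanent (301) redirect from the old URL to the new one, so search engines treat it as a move rather than a new page. A minimal sketch of the idea; the paths are the ones from the question, but the function is illustrative, not a real server handler:

```python
# Old path -> new path; served as an HTTP 301 so link equity follows the move.
REDIRECTS = {
    "/2/global/company/company-ecm-positioning.htm":
        "/2/global/products/enterprise-content-management.htm",
}

def respond(path):
    """Return (status, location) for a request path: 301 for renamed URLs, 200 otherwise."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path
```

In practice the same mapping would live in your web server's redirect rules rather than application code; the point is only that every old URL answers with a 301 to exactly one new URL.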
How important is it to clarify URL parameters?
We have a long list of URL parameters in our Google Webmaster Tools account. Currently, the majority are set to 'Let Googlebot decide.' How important is it to specify exactly what Googlebot should do? Would you leave these at 'Let Googlebot decide', or would you specify how Googlebot should treat each parameter?
Intermediate & Advanced SEO | nicole.healthline0 -
Rewriting dynamic urls to static
We're currently working on an SEO project for http://www.gear-zone.co.uk/. After a crawl of their site, tons of duplicate content issues came up. We think this is largely down to their brand filtering system, which works like this: by clicking on a brand, the site generates a URL with the brand keywords in it. For example, http://www.gear-zone.co.uk/3-season-synthetic-cid77.html filtered by the brand Mammut becomes http://www.gear-zone.co.uk/3-season-synthetic-Mammut-cid77.html?filter_brand=48 This was done by a previous SEO agency in order to prevent duplicate content. We suspect that this has made the issue worse, though, because removing the dynamic string from the end of the URL displays the same content as the unfiltered page. For example, http://www.gear-zone.co.uk/3-season-synthetic-Mammut-cid77.html shows the same content as http://www.gear-zone.co.uk/3-season-synthetic-cid77.html Now, if we're right in thinking that Google is unlikely to crawl the dynamic filter, this would seem to be the root of the duplicate issue. If so, would rewriting the dynamic URLs to static on the server side be the best fix? It's a Windows Server/ASP site. I hope that's clear! It's a pretty tricky issue and it would be good to know your thoughts. Thanks!
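Whatever the server-side rewrite ends up looking like, a common complement is to emit a rel=canonical tag (or build the sitemap) from a normalized URL with the filter parameters stripped, so every filtered variant maps back to one clean URL. A sketch of that normalization in Python; `filter_brand` is taken from the question, and treating it as the only filter parameter is an assumption for illustration:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that only filter/sort the same content set (assumed list).
FILTER_PARAMS = {"filter_brand"}

def canonical_url(url):
    """Strip filter parameters so every filtered variant maps to one canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FILTER_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))
```

Note this only handles the query-string half of the problem; the brand keywords baked into the path (`-Mammut-`) would still need the server-side rewrite, or a canonical tag pointing at whichever path you pick as the real one.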
Intermediate & Advanced SEO | neooptic0