Using Site Maps Correctly
-
Hello
I'm looking to submit a sitemap for a post-driven site with over 5,000 pages.
The site doesn't have a sitemap yet, but it is already indexed by Google. Will submitting a sitemap make a difference at this stage?
Also, most free sitemap tools only cover up to 5,000 pages, and I'd like to try the free version of a tool before buying one. If my site has 5,500 pages but I only submit a sitemap covering 5,000 of them (I have no control over which pages get included), would that have a negative effect on the pages that were left out?
Thanks
-
Submitting a sitemap in Google Search Console is a good idea at any stage. If your URLs are already crawled and indexed, there will be no negative impact, and in the longer run, as you add more pages, a sitemap will definitely help.
If you are using a CMS such as WordPress, Joomla, or Zen Cart, they all have extensions and plugins in their directories that will generate a sitemap of your current site and add new links as soon as you publish more pages.
Peter covers almost everything else in detail in his answer, including what to do if you have URL issues or problems with crawling and indexing.
If you have a custom CMS, I think you should seriously consider Peter's suggestion, as a sitemap generator is something you'll need on a regular basis anyway.
Hope this helps!
-
It's hard to tell without seeing your URL architecture.
First, there are two terms you should never, ever forget: crawling and indexing. Once you prepare a sitemap and submit it (or reference it in robots.txt), bots get a map of your site and start crawling pages based on the crawl budget assigned to your site. During crawling they MAY also find pages that aren't included in that map and crawl them too; again, this happens within your crawl budget.
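As a concrete illustration, referencing the sitemap from robots.txt takes a single directive, so any bot that reads robots.txt can discover it (the domain below is just a placeholder):

User-agent: *
Disallow:
Sitemap: https://www.example.com/sitemap.xml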
So when you submit a sitemap, the bot gets the list of your 5,000 "non-crawled" pages within seconds and starts crawling them. While crawling, it can find the missing 500 pages and crawl them too. The tricky part is that when you update the sitemap, the bot can quickly detect the changes there and start recrawling those URLs, while for the missing 500 pages it has to come back on its own schedule to check them for changes; that also counts against your crawl budget. If those pages don't change often, it isn't a big deal.
So you shouldn't hesitate out of fear of a negative impact. The only negative impact can occur if you have serious URL architecture issues and messy URLs; in that case a partial sitemap can obscure those issues and leave some of your URLs uncrawled.
Technically, in Search Console you can see sitemap statistics such as submitted and indexed URLs. In a perfect world the two numbers should be almost equal, with only a small difference. If you see a huge gap between them, you're in trouble. For example, on one site I have a sitemap with 44,950 pages submitted and only 29,643 of them indexed (roughly two thirds). That is a clear sign of crawling or sitemap trouble, because about a third of all pages aren't indexed at all.
PS: I forgot one thing. You should use your CMS's own plugin to generate the sitemap internally. Even if your CMS is custom-made, you should write (or hire someone to write) such a plugin. It's roughly 20-30 lines of your favorite language (PHP/Python/Perl/Ruby) and isn't a big deal. This plugin avoids the crawling time a third-party sitemap generator needs, because the CMS already has all the information inside and only needs to export it to XML; a sketch of what that might look like is below.
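For what it's worth, here is a minimal sketch of such an export in Python, assuming purely for illustration that the CMS stores posts in a SQLite database with a "posts" table holding "slug" and "updated_at" columns; the table, column, file, and domain names are placeholders to adapt to your own schema:

# Minimal sitemap export sketch. Assumptions (adjust to your CMS): posts live
# in a SQLite table "posts" with "slug" and "updated_at" columns, and URLs are
# built as BASE_URL/slug. Domain and file names are placeholders.
import sqlite3
from xml.sax.saxutils import escape

BASE_URL = "https://www.example.com"   # placeholder domain
DB_PATH = "cms.db"                     # placeholder CMS database file

def generate_sitemap(out_path="sitemap.xml"):
    # Pull every published URL and its last-modified date from the CMS storage.
    conn = sqlite3.connect(DB_PATH)
    rows = conn.execute("SELECT slug, updated_at FROM posts").fetchall()
    conn.close()

    # Build the XML by hand; the sitemap protocol only needs <loc> per URL,
    # with <lastmod> as an optional hint for recrawling.
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for slug, updated_at in rows:
        lines.append("  <url>")
        lines.append("    <loc>{}/{}</loc>".format(BASE_URL, escape(slug)))
        lines.append("    <lastmod>{}</lastmod>".format(str(updated_at)[:10]))  # YYYY-MM-DD
        lines.append("  </url>")
    lines.append("</urlset>")

    with open(out_path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")

if __name__ == "__main__":
    generate_sitemap()

The sitemap protocol allows up to 50,000 URLs per file, so 5,500 pages fit comfortably in a single sitemap.xml; beyond that limit you would split the output into multiple files and list them in a sitemap index.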
-
It would definitely be better to submit a complete sitemap. If your site is built on WordPress, Joomla, Magento, or many other standard CMSs, it should be able to generate a full sitemap; plugins like Yoast or Google Sitemaps help. It just depends on the site.
Otherwise, you can probably get a pro SEO or an agency to create a full 5,500+ page sitemap for you for $100 or so. PM me if you need more help.