Using Site Maps Correctly
-
Hello
I'm looking to submit a sitemap for a post-driven site with over 5,000 pages.
The site doesn't have a sitemap, but it is already indexed by Google. Will submitting a sitemap make a difference at this stage?
Also, most free sitemap tools only go up to 5,000 pages, and I'd like to try a sitemap using the free version of a tool before I buy one. If my site is 5,500 pages but I only submit a sitemap for 5,000 (I have no control over which pages get included), would this have a negative effect on the pages that didn't get included?
Thanks
-
Submitting a sitemap in Search Console is always a good idea at any stage. If your website's URLs are already crawled and indexed in search engines, there will be no negative impact, and in the longer run, as you add more pages, a sitemap will definitely help.
If you are using a CMS like WordPress, Joomla, Zen Cart, or any other, they all have extensions and plugins in their directories that will help you generate a sitemap of your current site and will add links automatically as soon as you add more pages.
Peter explains almost everything else in detail, such as what to do if you have URL issues or problems with crawling and indexing.
If you have a custom CMS, I think you should seriously consider Peter's idea, as this is something you'll need on a regular basis anyway!
Hope this helps!
-
It's hard to tell without seeing your URL architecture.
First, there are two specific terms you should never, ever forget: crawling and indexing. Once you prepare a sitemap and submit it (or reference it in robots.txt), bots get a map of your site and start crawling pages based on their crawl budget for your site. During the crawling process they MAY find new pages that aren't included in this map and will crawl those too. Again, this depends on your crawl budget.
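For reference, pointing bots at the sitemap from robots.txt is a one-line directive; a minimal sketch (example.com and the file name are placeholders for your own site):

```
# https://example.com/robots.txt
User-agent: *
Allow: /

# Must be an absolute URL; multiple Sitemap lines are allowed
Sitemap: https://example.com/sitemap.xml
```

This is useful because every crawler that honors robots.txt discovers the sitemap automatically, even ones you never submit to directly.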
So when you submit a sitemap, the bot gets a list of 5,000 "uncrawled" pages within seconds and starts crawling them. It can then discover the missing 500 pages and crawl them too. The tricky part is that when you update the sitemap, the bot can detect the changes quickly and start recrawling those pages; but for the missing 500 pages, it has to revisit your site to check them for changes, and this also counts against your crawl budget. If those pages don't change often, it isn't a big deal.
So you shouldn't hesitate over negative impact. The only negative impact can happen if you have serious URL architecture issues and messy URLs; then submitting a partial sitemap can obscure those issues and leave some of your URLs uncrawled.
Technically, in Search Console you can see sitemap statistics such as submitted and indexed counts. In a perfect world the numbers should be almost equal, with only a small difference. But if you see a huge gap between them, then you're in trouble. For example, on one site I have a sitemap with 44,950 pages submitted, and only 29,643 of them were indexed. That is a clear example of site crawling troubles or sitemap troubles, because a third of all pages isn't indexed at all.
PS: I forgot one thing. You should use your CMS's own plugin to generate the sitemap internally. Even if your CMS is custom-made, you should write (or hire someone to write) a plugin for it. It's roughly 20-30 lines of write-here-your-favorite-language (PHP/Python/Perl/Ruby) and isn't a big deal. This plugin will eliminate the crawling time a 3rd-party sitemap generator needs, because the CMS already has all the information inside and it just needs to be exported to XML.
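As a rough illustration of how small such a plugin can be, here is a minimal sketch in Python. The page list and the lastmod date are placeholders; in a real plugin you would pull the URLs and modification dates from your CMS's database:

```python
from datetime import date
from xml.sax.saxutils import escape


def generate_sitemap(urls, lastmod=None):
    """Build a sitemap.xml string from a list of absolute page URLs.

    The sitemaps protocol caps a single file at 50,000 URLs, so a
    5,500-page site fits comfortably in one file.
    """
    if len(urls) > 50000:
        raise ValueError("Split into multiple files via a sitemap index")
    lastmod = lastmod or date.today().isoformat()
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url in urls:
        # escape() handles &, <, > so query-string URLs stay valid XML
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(url)}</loc>")
        lines.append(f"    <lastmod>{lastmod}</lastmod>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)


# Hypothetical data source -- a real plugin would query the CMS here.
pages = [f"https://example.com/post/{i}" for i in range(1, 4)]
print(generate_sitemap(pages, lastmod="2017-05-01"))
```

The same shape translates directly to PHP or Perl; the whole job is just "select all published URLs, wrap each in a `<url>` element, write the file".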
-
It would definitely be better to submit a complete sitemap. If your site is built in WordPress, Joomla, Magento, or many other standard CMSs, it should have the ability to generate a full sitemap. Plugins like Yoast or Google XML Sitemaps help. It just depends on the site.
Otherwise, you can probably get any pro SEO or agency to create a full 5,500+ page sitemap for you for $100 or so. PM me if you need more help.