Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Resubmit sitemaps on every change?
-
Hello Mozers,
Our sitemaps were submitted to Google and Bing, and are successfully indexed. Every time pages are added to our store (ecommerce), we re-generate the XML sitemap.
My question is: should we be resubmitting the sitemaps every time their content changes, or, since they were submitted once, can we assume that the crawlers will re-download the sitemaps by themselves? (I don't like to assume.)
What are best practices here?
Thanks!
-
Great follow-up! Thanks for that. :^)
-
For anybody who is interested, after 10 days of monitoring in the webmaster tools, here are our conclusions:
We have 2 sitemaps (2 languages). It took a few days for Google to re-download the main sitemap, and a few more days to download the secondary sitemap; however, the new sitemaps were both picked up by Google with no intervention on our part.
Bing, however, has still not downloaded either of the updated sitemaps.
Because it's so easy and isn't seen as bad practice, we will be manually re-submitting sitemap updates to both search engines.
Thanks!
-
Ryan gave an excellent answer. Google uses other clues to pick up on new pages. I'm not saying don't submit your sitemap; I'm just saying add structured data/schema to your store as well. Check your crawl budget and see if your site is eating up too much of it and not being indexed properly by Google.
A simple test to see if something is being blocked is to run your site through https://varvy.com/
If you are not sure, I would suggest using DeepCrawl or Screaming Frog SEO Spider.
Navigation is often one of many things that can cause problems where your site will not be crawled correctly.
To determine whether or not you have a crawl budget issue, you can also make independent image sitemaps and sitemap index files, just to be sure that Google is getting what you want it to.
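If you do go the sitemap index route, here is a minimal Python sketch of generating one; the child sitemap URLs are placeholders, not anything specific to your store:

from datetime import date

# Hypothetical child sitemaps -- swap in your real URLs.
child_sitemaps = [
    "http://www.example.com/sitemap-pages.xml",
    "http://www.example.com/sitemap-images.xml",
]

today = date.today().isoformat()

# Build one <sitemap> entry per child file, with a lastmod date
# so crawlers can see when each one changed.
entries = "\n".join(
    "  <sitemap>\n"
    f"    <loc>{url}</loc>\n"
    f"    <lastmod>{today}</lastmod>\n"
    "  </sitemap>"
    for url in child_sitemaps
)

index = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</sitemapindex>\n"
)

with open("sitemap_index.xml", "w") as f:
    f.write(index)

You would then submit only sitemap_index.xml, and the engines discover the child sitemaps (pages, images, per-language files) from it.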
Like Ryan said, check out:
https://support.google.com/webmasters/answer/183669
Here is a Magento dynamic sitemap: http://i.imgur.com/QKS0bgU.png
Validate your sitemap and check it for problems:
http://tools.seochat.com/tools/site-validator/
https://moz.com/learn/seo/schema-structured-data
http://www.searchmetrics.com/news-and-events/schema-org-in-google-search-results/
https://blog.kissmetrics.com/seo-for-marketplaces-ecommerce/
JSON-LD / Microdata:
https://builtvisible.com/micro-data-schema-org-guide-generating-rich-snippets/#json
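To make the JSON-LD idea concrete, here is a minimal Python sketch that emits a schema.org Product snippet for a store page; the product fields are hypothetical and would come from your catalog:

import json

# Hypothetical product data -- pull these fields from your store's catalog.
product = {
    "@context": "http://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "sku": "WIDGET-001",
    "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD",
        "availability": "http://schema.org/InStock",
    },
}

# Embed the resulting tag in the product page's HTML.
print('<script type="application/ld+json">')
print(json.dumps(product, indent=2))
print("</script>")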
I hope this helps,
Thomas
-
Hello. You can check the Submitted vs. Indexed counts within Search Console to see whether or not your regenerated sitemap is already being picked up, but resubmitting a sitemap isn't an issue and is fairly easy to do, per Google:
Resubmit your sitemap
- Open the Sitemaps report
- Select the sitemap(s) you want to resubmit from the table
- Click the Resubmit sitemap button.
You can also resubmit a sitemap by sending an HTTP GET request to the following URL, specifying your own sitemap URL: http://google.com/ping?sitemap=http://www.example.com/my_sitemap.xml
Via: https://support.google.com/webmasters/answer/183669
Also, a FAQ on the Webmasters blog states that "Google does not penalize you for submitting a Sitemap."
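For anyone who wants to automate that ping after each sitemap regeneration, here is a minimal Python sketch of the GET request. The sitemap URL is a placeholder, and the Bing endpoint is an assumption based on Bing's equivalent ping service:

from urllib.parse import quote
from urllib.request import urlopen

# Placeholder -- use your own sitemap URL here.
SITEMAP_URL = "http://www.example.com/my_sitemap.xml"

# Both engines documented ping endpoints like these at the time of this thread.
PING_ENDPOINTS = [
    "http://google.com/ping?sitemap=",
    "http://www.bing.com/ping?sitemap=",
]

for endpoint in PING_ENDPOINTS:
    # URL-encode the sitemap address before passing it as a query parameter.
    with urlopen(endpoint + quote(SITEMAP_URL, safe="")) as response:
        print(endpoint, response.getcode())

Hooking a script like this into your sitemap regeneration job means neither engine has to wait for its next scheduled re-download.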
Related Questions
-
Upgrade old sitemap to a new sitemap index. How to do it without danger?
Hi MOZ users and friends. I have a website with a PHP template developed by ourselves, and a WordPress blog in the /blog/ subdirectory. Currently we have a sitemap.xml file in the root domain that contains all the subsections and the blog's posts. We update the sitemap manually, once a month, adding the new posts created on the blog. I want to automate this process, so I created a sitemap index with two sitemaps inside it. One is the old sitemap without the blog's posts, and the other is a new one created with the "Google XML Sitemap" WordPress plugin, inside the /blog/ subdirectory. That is, in the sitemap_index.xml file I have: Domain.com/sitemap.xml (old sitemap after removing the blog post URLs) and Domain.com/blog/sitemap.xml (auto-updating sitemap created with the Google XML plugin). Now I have to submit this sitemap index to Google Search Console, but I want to be completely sure about how to do this. I think all I have to do is delete the old sitemap in Search Console and upload the new sitemap index; is that OK?
Technical SEO | ClaudioHeilborn
-
Automate XML Sitemaps
Quick question: what is the best method people have for automating sitemaps? We publish around 200 times a day, and I would like to make sure that as soon as we publish, the sitemap gets updated. What is the best method of updating a sitemap so it reflects a post immediately after it is published?
Technical SEO | mattdinbrooklyn
-
Adding multi-language sitemaps to robots.txt
I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core coding, so there are no sub-directories per language. The developer has created sitemaps, which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as: /sitemap/uk/sitemap.xml /sitemap/de/sitemap.xml I want to add the sitemaps to the robots.txt but can't figure out how to do it. Also, should they have placed the sitemaps in a single location, with the file name identifying each language: /sitemap/uk-sitemap.xml /sitemap/de-sitemap.xml What is the cleanest way of handling these sitemaps, and can/should I list them in robots.txt?
Technical SEO | MickEdwards
-
Should I include tags in sitemap?
Hello All, I was wondering if you should include tags and categories in your sitemap. In the past, on previous blogs, I have always left tags and categories out. The reason for this is that a good friend of mine, who has been doing SEO in-house for a long time, always told me that this would result in duplicate content. I thought it would be a great idea to get some input from the SEOmoz community, as this obviously has a big effect on your blog and the number of pages indexed. Any help would be great. Thanks, Luke Hutchinson.
Technical SEO | LukeHutchinson
-
HTML Sitemap Pagination?
I'm creating an A-to-Z type directory of internal pages within a site of mine; however, there are cases where there are over 500 links within the pages. I intend to use pagination (rel=next/prev) to avoid too many links on a page, but I am worried about indexation issues. Should I be worried?
Technical SEO | DMGoo
-
How much to change to avoid duplicate content?
Working on a site for a dentist. They have a long list of services that they want us to flesh out with text. They provided a bullet list of services; we're trying to get 1 to 2 paragraphs of text for each. Obviously, we're not going to write this off the top of our heads. We're pulling text from other sources and trying to rework it. The question is: how much rephrasing do we have to do to avoid a duplicate content penalty? Do we make sure there are changes per paragraph, sentence, or phrase? Thanks! Eric
Technical SEO | ericmccarty
-
Changing CMS, are there SEO effects?
We want to change our CMS from TYPO3 to CMS Made Simple. We have already done this for another site, and it affected the rankings. Do you have experience with this? What factors are important to consider for SEO? Is it normal for rankings to drop when you change CMS?
Technical SEO | PlusPort
-
How to handle sitemap with pages using query strings?
Hi, I'm working to optimize a site that currently has about 5K pages listed in the sitemap. There are not in fact this many pages. Part of the problem is that one of the pages is a tool where each sort and filter button produces a query-string URL. It seems inefficient to me to have so many items listed that are all really the same page, not to mention wanting to avoid any duplicate content or low-quality issues. How have you found it best to handle this? Should I just noindex each of the links? Canonical links? Should I manually remove the pages from the sitemap? Should I continue as is? Thanks a ton for any input you have!
Technical SEO | 5225Marketing