Sitemaps during a migration - which is the best way of dealing with them?
-
Many SEOs I know simply upload the new sitemap once the new site is launched; some keep the old site's URLs in the new sitemap (for a while) to facilitate the migration; others submit sitemaps for both the old and the new URLs together to support the migration. Which is the best way to proceed? Thanks, Luke
-
Very much appreciated CleverPhD!
-
Found this while looking for an answer to another question (I couldn't find it the other day). Straight from the mouth of Google: do not include pages that don't exist in XML sitemaps.
http://googlewebmastercentral.blogspot.com/2014/10/best-practices-for-xml-sitemaps-rssatom.html
URLs
URLs in XML sitemaps and RSS/Atom feeds should adhere to the following guidelines:
- Only include URLs that can be fetched by Googlebot. A common mistake is including URLs disallowed by robots.txt — which cannot be fetched by Googlebot, or including URLs of pages that don't exist.
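Following that guideline, here is a minimal sketch (hypothetical URLs, using Python's standard `urllib.robotparser`) of filtering candidate sitemap entries down to only the URLs a crawler can actually fetch:

```python
from urllib import robotparser

def sitemap_eligible(urls_with_status, robots_txt_lines, user_agent="Googlebot"):
    """Return only URLs a crawler can actually fetch:
    HTTP 200 and not disallowed by robots.txt."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt_lines)
    return [url for url, status in urls_with_status
            if status == 200 and rp.can_fetch(user_agent, url)]

robots = ["User-agent: *", "Disallow: /private/"]
pages = [
    ("https://example.com/", 200),
    ("https://example.com/private/report", 200),  # disallowed by robots.txt
    ("https://example.com/old-page", 404),        # page no longer exists
]
print(sitemap_eligible(pages, robots))
```

In a real audit you would get the status codes from a crawl of the sitemap URLs; the point is that anything blocked or missing never makes it into the file.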
-
Mate nailed it completely!
-
I would say make sure that your new sitemap has all the latest URLs. The reason people say that you should have old URLs in the sitemap is so that Google can quickly crawl the old URLs to find the 301s to the new URLs.
I am not convinced that this helps. Why?
Google already has all your old URLs in its systems. You would be shocked how far back Google has data on your site with old URLs. I have a site that is over 10 years old and I still see URL structures referenced in Google from 7 years ago that have a 301 in place. Why is this?
Google assumes, "Well, I know this URL is a 301 or 404, but I am going to crawl it every once in a while just to make sure the webmaster did not do this by mistake." You can see this in Search Console error and link reports: after you set up 301s or 404s, URLs may stay in those reports for months and even come back after they fall out of the error list. On one site, I had old URLs showing up in the SERPs and in various Search Console reports for two years after proper 301s were in place. Why was this happening?
This was a large site, and some old content was still linking to the old URLs. The solution was to delete those links in the old content and set up a self-referencing canonical on all the pages to give Google a definitive directive. Google then finally replaced the old URLs with the new URLs in the SERPs and in the Search Console reports. The point here is that our site had been sending signals (links) telling Google that some of the old URLs were still valid, and Google was giving us the benefit of the doubt.
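For reference, a self-referencing canonical is just one `<link>` element in each page's `<head>` pointing at that page's own preferred URL (hypothetical URL shown):

```html
<!-- On https://example.com/new-page, the canonical points to itself -->
<link rel="canonical" href="https://example.com/new-page" />
```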
If you want the new URLs seen by Google, show them in your sitemap. Google already has all the old URLs; it will check them, find the 301s, and sort everything out. I would also recommend the self-referencing canonical on the new pages. Don't give Google any signal that your old URLs are still valid by linking to them in any way, especially in your sitemap. I would even go so far as to reach out to any important sites that link to old URLs and ask them to update the link.
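One way to keep an eye on this is to verify that every old URL redirects in a single hop, with a permanent 301 (not a temporary 302), straight to its mapped new URL. A minimal sketch with hypothetical URLs; in practice `fetch` would issue an HTTP HEAD request and read the status code and `Location` header:

```python
def check_redirects(mapping, fetch):
    """Return the old URLs whose redirect is wrong: anything that is
    not a single-hop 301 straight to the mapped new URL."""
    problems = {}
    for old, new in mapping.items():
        status, location = fetch(old)
        if status != 301 or location != new:
            problems[old] = (status, location)
    return problems

# Stand-in for real HTTP responses: (status code, Location header)
responses = {
    "https://example.com/old-a": (301, "https://example.com/new-a"),
    "https://example.com/old-b": (302, "https://example.com/new-b"),  # 302 is temporary
}
mapping = {
    "https://example.com/old-a": "https://example.com/new-a",
    "https://example.com/old-b": "https://example.com/new-b",
}
print(check_redirects(mapping, responses.__getitem__))
```

Anything this flags (302s, chains, redirects to the wrong target) is a mixed signal of exactly the kind described above.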
As I mentioned above, I do not think there is an "advantage" to getting the new URLs indexed quicker by putting old URLs that 301 to the new URLs in the sitemap. Just watch your crawl stats in Google Search Console: after a major overhaul, you will see Google crawl your site like crazy and update things pretty quickly. Putting the old URLs in the sitemap is a conflicting signal in that process and has the potential to slow Google down, IMHO.