Google’s John Mueller was asked if it helps, after doing a big URL migration, to keep the old XML sitemap file live for a bit. John said “the effect would be minimal” with this strategy.
The theory is that if you leave the old XML sitemap live for a bit, Google will use it to crawl those old URLs faster and thus pick up on the new URLs (the ones the old ones redirect to) faster. But John said it doesn’t really work that way: “if you change all the dates on existing pages, that doesn’t give us much insight into where to start crawling, so I suspect overall it would crawl as usual.” So those old URLs will be crawled at the normal pace and Google will pick up on the redirects in its normal crawl – the old XML sitemap might not have an impact on that, at least not a noticeable one.
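As a sketch of the setup being discussed, the old sitemap would keep listing the pre-migration URLs, each of which 301-redirects to its new location (the example.com URLs below are hypothetical, not from the discussion):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Old sitemap left live temporarily after the migration.
     Each listed URL 301-redirects to its new location. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/old-path/page-1</loc>
    <!-- server returns 301 -> https://www.example.com/new-path/page-1 -->
  </url>
  <url>
    <loc>https://www.example.com/old-path/page-2</loc>
    <!-- server returns 301 -> https://www.example.com/new-path/page-2 -->
  </url>
</urlset>
```

The hope behind this approach is that listing the old URLs prompts recrawls that hit the redirects sooner; per John’s response, that nudge is minimal in practice.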
Here are those tweets:
@JohnMu Do you recommend keeping the old xml sitemap live after a large URL structure migration, so that bots still have access to them and find the new URLs quicker via redirects?
— Daniel Lira (@DanielTexasLira) November 16, 2021
Temporarily is fine, but I suspect the effect would be minimal (if you change all the dates on existing pages, that doesn’t give us much insight into where to start crawling, so I suspect overall it would crawl as usual)
— 🧀 John 🧀 (@JohnMu) November 17, 2021
It is a good question because Google, including John Mueller, has said in the past that you can use sitemaps to speed up page removals on some level, and Gary Illyes from Google has suggested this might also work with URL migrations. So the question makes logical sense, but so does the response.
Forum discussion at Twitter.