Automating Cache Warmup after Deployments with a Bash Script

- 1 min read

Deploying changes to a production environment can sometimes lead to cold caches, resulting in slower response times for your users and, even worse, for the Google crawler. To mitigate this issue, automating the cache warmup process can be immensely beneficial.

Let me show you the Bash script I use to warm up the cache of the sites I deploy.

The Script

#!/usr/bin/env bash

function display_help {
  echo "Usage: ${0} <domain>"
  echo "  Example: ${0} example.com"
  echo ""
  echo "Options:"
  echo "  --help       Display this help message"
  exit 1
}

# Check for --help option or missing parameter
if [ "${1}" == "--help" ] || [ -z "${1}" ]; then
  display_help
fi

domain="${1}"


if curl --fail --silent --output crawling_sitemap.xml "https://${domain}/sitemap.xml"; then
  echo "Sitemap downloaded successfully"
else
  echo "Sitemap download failed"
  exit 1
fi

cat "crawling_sitemap.xml" | perl -ne 'while (/>(http.+?)</g) { print "$1\n"; }' | while read -r line; do
  curl -so /dev/null -w "connect:%{time_connect} | starttransfer:%{time_starttransfer} | total:%{time_total} | %{url}\n" "${line}"
done

echo "Crawling complete for ${domain}"

rm crawling_sitemap.xml

How It Works

The script is straightforward. It downloads the sitemap.xml from the root of the given domain, then loops over every URL found in it, sending a plain GET request to each one and printing the connect, time-to-first-byte, and total timings for the request.


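The -w flag on the second curl call is what produces the per-URL timing report. To see the shape of that output without touching the network, the same format string can be pointed at a file:// URL (this sketch assumes curl 7.75 or newer, where the %{url} write-out variable is available, and a build with the file protocol enabled):

```shell
# Same -w format string as in the script, against a local file:// URL,
# so no network access is needed; the timings will be near zero.
curl -so /dev/null \
  -w "connect:%{time_connect} | starttransfer:%{time_starttransfer} | total:%{time_total} | %{url}\n" \
  "file:///dev/null"
# prints something like:
# connect:0.000000 | starttransfer:0.000045 | total:0.000061 | file:///dev/null
```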
To use the script, save it somewhere (for example, as cache-warmup.sh), make it executable, and run it with your domain as an argument:

chmod +x cache-warmup.sh
./cache-warmup.sh example.com

Automating the cache warmup process is important for maintaining optimal performance after deploying changes to your production environment. This Bash script ensures that your cache is preloaded with the latest content.

Happy caching!


