Backup Recipes

Real-world backup recipes you can use today

These are practical, copy-paste-ready backup workflows for files, directories, databases, and more. Each recipe is designed to help you create a backup, verify it, download it later, and decrypt it locally.

What every recipe includes

Each recipe page is more than a command snippet. It walks through the full lifecycle: why the recipe works, how to run it, what could go wrong, how to download the backup later, and how to decrypt it locally.

Backup steps

Copy-paste-ready commands and config examples.

Verification

Clear guidance on what success looks like and why verification matters.

Download

How to retrieve the encrypted backup from your portal later.

Decrypt & restore

Local decrypt steps so only you can recover the original data.

One rule: write to STDOUT

Your command must write backup data to STDOUT. The Backup Verified agent reads from STDOUT, encrypts locally, and uploads the encrypted backup to Managed Storage.

  • tar should use -cf - or -czf -. The final - means “write to STDOUT.”
  • mysqldump and pg_dump write to STDOUT by default.
  • Avoid redirecting backup output to a local file unless you intentionally want a separate local artifact.
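Before wiring a command into a config, you can confirm it actually streams to STDOUT by piping it locally. A quick sanity check, sketched here with tar and an illustrative scratch directory (the paths are examples only, not part of any real recipe):

```shell
# Create a small scratch directory (illustrative names only)
mkdir -p /tmp/bv-demo && echo "hello" > /tmp/bv-demo/file.txt

# Pipe the backup stream straight into a byte counter.
# A non-zero count confirms the data went to STDOUT, not to a file.
tar -czf - -C /tmp bv-demo | wc -c
```

If the count is zero, the command is writing somewhere else (often a local file) and the agent will have nothing to read.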

Common YAML gotcha

In some config examples, you may see backup_command: >. That > is YAML's folded block scalar indicator: it joins the indented lines that follow into a single string. It is not shell output redirection.

In most cases, the best pattern is simple: keep the backup data on STDOUT so the agent can read it, encrypt it locally, and upload the encrypted result.
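As an illustration, a folded block scalar lets a long command span several lines while still being read as one string. The field names below mirror the bv-agent.yml example later on this page; the command itself is just a sketch:

```yaml
source:
  type: "tar"
  # ">" folds the indented lines below into a single one-line string;
  # it is YAML syntax, not shell redirection.
  backup_command: >
    tar -czf -
    --exclude='tmp'
    /var/www
```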

Recipe library

Start with the workflow that matches your environment. Each recipe page includes the backup command, what could go wrong, verification guidance, and download/decrypt steps.

Common command patterns

These examples are useful reference points, but the full recipe pages above provide the complete workflow.

MySQL (logical dump)

A safe default for many MySQL workloads.

mysqldump \
  --single-transaction \
  --quick \
  --skip-lock-tables \
  --no-tablespaces \
  --routines --triggers --events \
  --set-gtid-purged=OFF \
  -u USER \
  DB_NAME

PostgreSQL (logical dump)

A solid starting point using custom format.

pg_dump \
  -Fc \
  -Z 6 \
  -U USER \
  -d DB_NAME

Files & directories (tar)

Useful for files, uploads, and config snapshots.

tar -cf - \
  --exclude='tmp' \
  --exclude='cache' \
  /var/www /etc

The trailing - after -cf is required. It tells tar to write the archive to STDOUT.
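To check an archive stream without writing it anywhere, you can pipe one tar into another: the second tar reads the stream from STDIN and lists its contents (-t lists, -f - reads STDIN). The directory below is an illustrative stand-in for your real data:

```shell
# Build a tiny scratch directory, stream it through tar, and list the
# archive contents directly from the pipe.
mkdir -p /tmp/bv-tar-demo && touch /tmp/bv-tar-demo/a.txt
tar -cf - -C /tmp bv-tar-demo | tar -tf -
```

Seeing the expected file paths in the listing confirms the stream is a valid archive before you hand the command to the agent.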

Docker volume snapshot

Snapshot a named volume as a tar stream.

docker run --rm \
  -v my_volume:/data:ro \
  alpine \
  tar -cf - -C /data .

Putting a recipe into a BV config

Once you choose a recipe, place the command into source.backup_command. The agent runs the command, encrypts locally, and uploads the encrypted result to Managed Storage.

# bv-agent.yml (example)
bv:
  api_base: "https://backupverified.com"
  timeout_seconds: 30
  work_timeout_seconds: 0
  upload_timeout_seconds: 0

agent_key: "YOUR_AGENT_KEY"
client_encryption_key_b64: "YOUR_CLIENT_ENCRYPTION_KEY_B64"

backup:
  source_key: "dir_backup"
  name: "Directory Backup"
  description: "Archive current directory via tar"
  delete_after_days: 0

source:
  type: "tar"
  backup_command: "tar -cf - ."