OpenClaw Backup and Restore: Protect Your Agent Data [2026]
OpenClaw stores conversations, memory, configuration, and secrets across several directories and files, and losing any of them means rebuilding your agent's accumulated context by hand. This guide covers what to back up and why, then walks through four backup methods (manual archives, Docker volume backups, git-based backup, and cloud sync), plus automated cron backups, restore procedures, and migrating to a new server.
What to Back Up
OpenClaw stores its data across several directories and files. Understanding what each contains helps you decide what to back up and how often.
| Location | Contains | Backup Priority |
|---|---|---|
| data/ | Conversations, memory, vector embeddings, agent state | Critical |
| config/ | Agent configuration, skill settings, integration configs | Critical |
| .env | API keys, environment variables, secrets | Critical |
| skills/ | Custom skill files (SKILL.md format) | High |
| docker-compose.yml | Docker configuration (if applicable) | High |
| logs/ | Application logs | Low (recreated automatically) |
The data directory is the most important. It contains your agent's conversation history, long-term memory, and vector embeddings that power semantic search. If you lose this, your agent loses all its learned context and conversation history. Rebuilding from scratch is possible but time-consuming.
The config directory contains your agent's personality, integration settings, cron job definitions, and skill configurations. Without this, your agent runs but with default settings — you would need to reconfigure everything manually.
The .env file contains your API keys and secrets. Without it, your agent cannot connect to any AI model provider. Always handle this file with extra care during backups — encrypt it or store it separately from other backup data.
Custom skills are SKILL.md files you have created or modified. Skills downloaded from ClawHub can be reinstalled, but custom skills exist only on your system.
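Before choosing a backup method, it helps to see how much data each of these locations actually holds, since that drives archive size and backup frequency. A minimal sketch, assuming the default ~/openclaw layout described above (the report_backup_sizes helper name is ours, not part of OpenClaw):

```shell
# Report the size of each critical OpenClaw path so you can anticipate
# how large your backup archives will be. Missing paths are flagged
# rather than silently skipped.
report_backup_sizes() {
  dir="$1"
  for path in data config skills .env docker-compose.yml; do
    if [ -e "$dir/$path" ]; then
      du -sh "$dir/$path"
    else
      echo "missing: $dir/$path"
    fi
  done
}

# Uses OPENCLAW_DIR if set, otherwise the default install location
report_backup_sizes "${OPENCLAW_DIR:-$HOME/openclaw}"
```

A "missing" line for docker-compose.yml is normal on npm installs; a missing data/ or config/ means you are pointed at the wrong directory.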
Method 1: Manual Backup
The simplest backup method is creating a compressed archive of your OpenClaw directory:
# Create a timestamped backup
TIMESTAMP=$(date +%Y%m%d-%H%M%S)
tar -czf ~/openclaw-backup-$TIMESTAMP.tar.gz \
-C ~/openclaw \
data/ config/ .env skills/ docker-compose.yml 2>/dev/null
echo "Backup saved to ~/openclaw-backup-$TIMESTAMP.tar.gz"
ls -lh ~/openclaw-backup-$TIMESTAMP.tar.gz
This creates a single compressed file containing all essential data. For a typical personal agent running for several months, the backup file is usually 50-200MB.
Copy the backup off-machine. A backup that lives on the same disk as your agent is not really a backup — if the disk fails, you lose both. Copy it to another location:
# Copy to another machine via SCP
scp ~/openclaw-backup-$TIMESTAMP.tar.gz user@backup-server:/backups/
# Or copy to a USB drive
cp ~/openclaw-backup-$TIMESTAMP.tar.gz /Volumes/USB-Drive/backups/
Method 2: Docker Volume Backup
If you use Docker, your data lives in Docker volumes (or bind mounts). Backing up volumes requires a slightly different approach.
For bind mounts (volumes mapped to host directories in docker-compose.yml, like ./data:/app/data): Use the manual backup method above. The data is directly accessible on your host filesystem.
For named Docker volumes:
# List volumes
docker volume ls | grep openclaw
# Back up a named volume to a tar file
docker run --rm \
-v openclaw_data:/source:ro \
-v ~/backups:/backup \
alpine tar -czf /backup/openclaw-data-$(date +%Y%m%d).tar.gz -C /source .
# Back up another volume
docker run --rm \
-v openclaw_config:/source:ro \
-v ~/backups:/backup \
alpine tar -czf /backup/openclaw-config-$(date +%Y%m%d).tar.gz -C /source .
This uses a temporary Alpine container to access the Docker volume contents and create a compressed archive on your host filesystem.
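Before trusting a freshly written volume archive, it is worth listing its contents: an empty or truncated tarball looks identical to a good one at a glance. A small sketch (the verify_archive helper is illustrative, not an OpenClaw tool):

```shell
# Sanity-check a backup archive: count its entries and fail loudly
# if the tarball is empty or unreadable.
verify_archive() {
  archive="$1"
  count=$(tar -tzf "$archive" 2>/dev/null | wc -l)
  if [ "$count" -gt 0 ]; then
    echo "ok: $archive contains $count entries"
  else
    echo "FAIL: $archive is empty or unreadable" >&2
    return 1
  fi
}

# Usage:
# verify_archive ~/backups/openclaw-data-20260324.tar.gz
```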
Method 3: Git-Based Backup
Git provides version-controlled backups, letting you track changes over time and restore to any previous state. This is excellent for configuration and skill files, and workable for data files; for consistent snapshots of data, commit while the agent is stopped or idle, since files such as SQLite databases can change mid-commit.
Step 1: Initialize a git repository.
cd ~/openclaw
# Initialize git
git init
# Create .gitignore to exclude sensitive files and large binaries
cat > .gitignore << 'EOF'
# Never commit API keys
.env
# Exclude logs
logs/
# Exclude large binary files in data (optional)
data/*.db-wal
data/*.db-shm
# Exclude Docker files
docker-compose.yml
EOF
# Initial commit
git add -A
git commit -m "Initial OpenClaw backup"
Step 2: Set up a private remote repository.
# Create a private repo on GitHub, GitLab, or Gitea
# Then add as remote
git remote add origin git@github.com:your-username/openclaw-backup.git
git push -u origin main
Step 3: Commit changes regularly. You can do this manually or automate it with a cron job:
# Add to crontab: commit and push daily at 2 AM
0 2 * * * cd ~/openclaw && git add -A && git commit -m "Daily backup $(date +\%Y\%m\%d)" 2>/dev/null && git push origin main 2>/dev/null
Important: Never commit your .env file to git — it contains API keys. Back up .env separately using manual copy or an encrypted backup method.
Backing up .env securely:
# Encrypt .env before backing up
gpg --symmetric --cipher-algo AES256 -o ~/openclaw/.env.gpg ~/openclaw/.env
# Add .env.gpg to git (encrypted version is safe to commit)
git add .env.gpg
git commit -m "Update encrypted .env backup"
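An encrypted backup you cannot decrypt is worthless, so it pays to prove the round trip works. A hedged sketch assuming GnuPG 2.1+ (the roundtrip_env helper and its loopback passphrase handling are our illustration; interactive gpg will prompt for the passphrase instead):

```shell
# Encrypt a copy of a file, decrypt it, and diff against the original.
# No plaintext copy is left on disk; the decrypted stream is piped
# straight into diff.
roundtrip_env() {
  plain="$1"
  pass="$2"
  tmp=$(mktemp -d)
  gpg --batch --yes --pinentry-mode loopback --passphrase "$pass" \
      --symmetric --cipher-algo AES256 -o "$tmp/env.gpg" "$plain"
  if gpg --batch --quiet --pinentry-mode loopback --passphrase "$pass" \
        --decrypt "$tmp/env.gpg" 2>/dev/null | diff -q - "$plain" >/dev/null; then
    echo "ok: encrypted backup round-trips"
    rm -rf "$tmp"
  else
    echo "FAIL: decrypted output does not match $plain" >&2
    rm -rf "$tmp"
    return 1
  fi
}

# Usage:
# roundtrip_env ~/openclaw/.env 'your-passphrase'
```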
Method 4: Cloud Sync
Cloud sync provides off-site backup with minimal effort. Use rclone to sync your backups to S3, Google Drive, Dropbox, or dozens of other cloud storage providers.
Install rclone:
# Linux/macOS
curl https://rclone.org/install.sh | sudo bash
# Or via Homebrew on macOS
brew install rclone
Configure a remote (example with Google Drive):
rclone config
# Follow the interactive setup to add a Google Drive remote
# Name it "gdrive" or similar
Sync your backups:
# Sync backup directory to cloud
rclone sync ~/backups/openclaw gdrive:openclaw-backups/
# Or sync directly from the OpenClaw directory (excluding .env)
rclone sync ~/openclaw gdrive:openclaw-backups/ \
--exclude ".env" \
--exclude "logs/**" \
--exclude "node_modules/**"
For AWS S3:
rclone sync ~/backups/openclaw s3:my-bucket/openclaw-backups/
Automated Backups with Cron
The best backup is one that runs automatically. Create a backup script and schedule it with cron.
Create the backup script:
cat > ~/openclaw-backup.sh << 'SCRIPT'
#!/bin/bash
set -e
BACKUP_DIR="$HOME/backups/openclaw"
TIMESTAMP=$(date +%Y%m%d-%H%M%S)
ARCHIVE="$BACKUP_DIR/openclaw-$TIMESTAMP.tar.gz"
# Create backup directory
mkdir -p "$BACKUP_DIR"
# Archive only the targets that exist (docker-compose.yml may be absent
# on npm installs); a missing file would otherwise abort the script
TARGETS=""
for t in data config .env skills docker-compose.yml; do
  [ -e "$HOME/openclaw/$t" ] && TARGETS="$TARGETS $t"
done
tar -czf "$ARCHIVE" -C "$HOME/openclaw" $TARGETS
# Remove backups older than 30 days
find "$BACKUP_DIR" -name "openclaw-*.tar.gz" -mtime +30 -delete
# Optional: sync to cloud
# rclone copy "$ARCHIVE" gdrive:openclaw-backups/
echo "$(date): Backup created at $ARCHIVE" >> "$HOME/backups/openclaw-backup.log"
SCRIPT
chmod +x ~/openclaw-backup.sh
chmod +x ~/openclaw-backup.sh
Schedule with cron:
# Edit crontab
crontab -e
# Add daily backup at 3 AM
0 3 * * * ~/openclaw-backup.sh
This creates a daily backup, keeps the last 30 days of backups, and logs each operation. Adding the rclone line provides off-site protection.
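Cron jobs fail silently, so pair the backup job with a freshness check. A sketch (check_backup_fresh is a hypothetical helper; wire its non-zero exit to whatever alerting you already use):

```shell
# Alert if no backup archive has been created within the given window.
# Run this from cron alongside the backup job to catch silent failures.
check_backup_fresh() {
  backup_dir="$1"
  max_age_days="${2:-2}"
  # find -mtime -N matches files modified less than N days ago
  recent=$(find "$backup_dir" -name "openclaw-*.tar.gz" -mtime -"$max_age_days" | head -n 1)
  if [ -n "$recent" ]; then
    echo "ok: recent backup found: $recent"
  else
    echo "ALERT: no backup newer than $max_age_days days in $backup_dir" >&2
    return 1
  fi
}

# Usage:
# check_backup_fresh ~/backups/openclaw 2
```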
How to Restore from Backup
Restoring from a tar archive:
# Stop OpenClaw
docker compose down # or: sudo systemctl stop openclaw
# Remove current data (or move it aside)
mv ~/openclaw/data ~/openclaw/data.old
mv ~/openclaw/config ~/openclaw/config.old
# Extract backup
tar -xzf ~/backups/openclaw/openclaw-20260324-030000.tar.gz -C ~/openclaw
# Restore .env if needed
# cp ~/backups/openclaw/.env.backup ~/openclaw/.env
# Start OpenClaw
docker compose up -d # or: sudo systemctl start openclaw
# Verify
docker compose logs -f --tail=20 openclaw
Restoring from git:
# Stop OpenClaw
docker compose down
# Restore to a specific commit
cd ~/openclaw
git log --oneline # find the commit you want
git checkout abc1234 -- data/ config/ skills/
# Decrypt .env if using gpg backup
gpg --decrypt .env.gpg > .env
# Start OpenClaw
docker compose up -d
Restoring Docker volumes:
# Stop containers
docker compose down
# Restore volume from backup
docker run --rm \
-v openclaw_data:/target \
-v ~/backups:/backup \
alpine sh -c "find /target -mindepth 1 -delete && tar -xzf /backup/openclaw-data-20260324.tar.gz -C /target"
# Start containers
docker compose up -d
Migrating OpenClaw to a New Server
Moving OpenClaw from one server to another is essentially a backup-and-restore operation across machines:
1. Back up on the old server: Create a full backup archive including data, config, .env, skills, and docker-compose.yml.
2. Transfer to the new server: Use scp, rsync, or a cloud storage intermediary to move the backup file.
3. Install OpenClaw on the new server: Follow the installation guide for your preferred method (Docker or npm).
4. Restore the backup: Extract the backup archive into the new OpenClaw directory.
5. Update configurations: Change any server-specific settings (IP addresses, domain names, ports) in the .env file and config.
6. Start and verify: Start OpenClaw and verify all integrations, cron jobs, and skills are working.
# On old server: create backup and transfer
tar -czf /tmp/openclaw-migration.tar.gz -C ~/openclaw data/ config/ .env skills/ docker-compose.yml
scp /tmp/openclaw-migration.tar.gz user@new-server:/tmp/
# On new server: install and restore
mkdir -p ~/openclaw && cd ~/openclaw
tar -xzf /tmp/openclaw-migration.tar.gz
docker compose up -d
Backup Best Practices
- Follow the 3-2-1 rule: Keep 3 copies of your data, on 2 different storage types, with 1 copy off-site. For example: local disk + external drive + cloud storage.
- Test your restores. A backup you have never tested restoring from is not a reliable backup. Every few months, spin up a test instance and restore from backup to verify the process works.
- Encrypt sensitive backups. Backups containing your .env file have your API keys. Encrypt before storing off-site.
- Automate everything. Manual backups are backups you will forget to make. Set up a cron job and cloud sync so backups happen without your intervention.
- Monitor your backups. Check that backup files are being created (the backup log helps). Set up an alert if backups stop running — a silent failure is the worst kind.
- Keep retention reasonable. 30 days of daily backups is sufficient for most operators. Longer retention for weekly snapshots (keep 12 weeks) provides deeper history without using excessive storage.
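The restore-testing rule above can be partially automated: extract the archive into a scratch directory and confirm the critical paths came back. A minimal sketch (test_restore is illustrative only; it checks directory structure, not application-level integrity, so still do a full test restore periodically):

```shell
# Extract a backup archive into a throwaway directory and verify the
# critical paths survived the round trip.
test_restore() {
  archive="$1"
  scratch=$(mktemp -d)
  tar -xzf "$archive" -C "$scratch" 2>/dev/null
  for path in data config; do
    if [ ! -d "$scratch/$path" ]; then
      echo "FAIL: $path/ missing from $archive" >&2
      rm -rf "$scratch"
      return 1
    fi
  done
  echo "ok: $archive restores data/ and config/"
  rm -rf "$scratch"
}

# Usage:
# test_restore ~/backups/openclaw/openclaw-20260324-030000.tar.gz
```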
