
Remote OpenClaw Blog

How to Use OpenClaw Skills for Database Migrations


Database migrations are one of the most anxiety-inducing parts of software development. A bad migration can corrupt data, bring down production, or create a mess that takes hours to untangle. OpenClaw skills can reduce that risk by helping your agent generate correct migration files, plan rollback strategies, and follow proven patterns for schema changes.

This guide covers how to set up OpenClaw skills for database migration workflows, from initial schema design through production deployment. We will cover PostgreSQL, MySQL, and MongoDB, along with popular migration tools like Knex.js, Prisma, TypeORM, and Django.

Installing Migration Skills

Start by installing the core migration skill and a database-specific skill from the OpenClaw Bazaar skills directory:

# Core migration patterns
openclaw skill install db-migration-patterns

# Pick your database
openclaw skill install postgresql-migrations   # PostgreSQL
openclaw skill install mysql-migrations        # MySQL
openclaw skill install mongodb-migrations      # MongoDB

Then install a skill for your migration tool:

# Pick your migration tool
openclaw skill install prisma-migrations        # Prisma
openclaw skill install knex-migrations          # Knex.js
openclaw skill install typeorm-migrations       # TypeORM
openclaw skill install django-migrations        # Django

These skills work together. The core skill provides general migration best practices. The database skill adds engine-specific knowledge. The tool skill handles the syntax and conventions of your chosen migration framework.

Schema Design With OpenClaw

Good migrations start with good schema design. The db-migration-patterns skill teaches your agent to think about schema changes in terms of safety, reversibility, and performance impact.

When you ask your agent to design a schema, it weighs those factors in the migrations it proposes:

// The agent generates schema designs with migration safety in mind

// Example: Adding a new feature for user notifications
// The agent breaks this into safe, incremental migrations

// Migration 1: Create the notifications table
import { Knex } from "knex";

export async function up(knex: Knex): Promise<void> {
  await knex.schema.createTable("notifications", (table) => {
    table.uuid("id").primary().defaultTo(knex.fn.uuid());
    table.uuid("user_id").notNullable();
    table.string("type", 50).notNullable();
    table.text("message").notNullable();
    table.boolean("read").notNullable().defaultTo(false);
    table.timestamp("created_at").notNullable().defaultTo(knex.fn.now());

    // Foreign key with index for query performance
    table.foreign("user_id").references("users.id").onDelete("CASCADE");
    table.index(["user_id", "read", "created_at"]);
  });
}

export async function down(knex: Knex): Promise<void> {
  await knex.schema.dropTableIfExists("notifications");
}

Notice what the agent includes automatically: a UUID primary key instead of auto-increment for distributed safety, a composite index that matches the most common query pattern, a foreign key with CASCADE delete, and timestamps with server-side defaults. The down function provides a clean rollback. These are patterns the migration skills encode.

Generating Migration Files

The most common workflow is asking your agent to generate a migration for a specific change. The migration skills teach the agent to produce files that match your tool's conventions:

# Ask your agent:
# "Generate a migration to add an email_verified column to the users table"

With Prisma, the agent updates your schema file and explains the migration:

// schema.prisma — the agent adds the new field
model User {
  id             String   @id @default(uuid())
  email          String   @unique
  name           String
  emailVerified  Boolean  @default(false)  // New field
  createdAt      DateTime @default(now())
  updatedAt      DateTime @updatedAt
}
Then it tells you to run:
npx prisma migrate dev --name add-email-verified-to-users

With Knex.js, the agent generates a complete migration file:

// migrations/20260329120000_add_email_verified_to_users.ts
import { Knex } from "knex";

export async function up(knex: Knex): Promise<void> {
  await knex.schema.alterTable("users", (table) => {
    table.boolean("email_verified").notNullable().defaultTo(false);
  });
}

export async function down(knex: Knex): Promise<void> {
  await knex.schema.alterTable("users", (table) => {
    table.dropColumn("email_verified");
  });
}

The key pattern here is the defaultTo(false) on the new column. Without the migration skill, agents often generate NOT NULL columns without defaults, which fails on tables with existing data. The skill ensures the agent always considers existing rows when adding columns.

Handling Dangerous Migrations

Some migrations are inherently risky: renaming columns, changing column types, dropping tables, or modifying constraints on large tables. The migration skills teach your agent to flag these and suggest safer alternatives.

When you ask your agent to rename a column, it generates a multi-step migration instead of a single ALTER TABLE ... RENAME COLUMN:

// Step 1: Add the new column
import { Knex } from "knex";

export async function up(knex: Knex): Promise<void> {
  // Add new column alongside the old one
  await knex.schema.alterTable("users", (table) => {
    table.string("display_name", 255);
  });

  // Copy data from old column to new column
  await knex.raw("UPDATE users SET display_name = full_name");

  // Make new column NOT NULL after data is copied
  await knex.schema.alterTable("users", (table) => {
    table.string("display_name", 255).notNullable().alter();
  });
}

// Step 2 (separate migration, deployed after code changes):
// Drop the old column once all application code uses the new name
export async function up(knex: Knex): Promise<void> {
  await knex.schema.alterTable("users", (table) => {
    table.dropColumn("full_name");
  });
}

The agent explains why this two-step approach is safer: it allows you to deploy the new column first, update application code to use the new name, verify everything works, and then drop the old column. A direct rename would require deploying the migration and the code change simultaneously, which is riskier.


Rollback Strategies

Every migration should be reversible. The migration skills enforce this by ensuring every up function has a corresponding down function. But rollback strategy goes beyond individual migrations.

Configure your rollback preferences:

# .openclaw/migration-config.yaml
migrations:
  rollback:
    require_down: true
    test_rollback: true
    max_batch_size: 5
    pre_deploy_snapshot: true
    dangerous_operations:
      - drop_table
      - drop_column
      - change_type
      - remove_index
    dangerous_operation_handling: require_confirmation

With test_rollback: true, the agent generates migration tests that verify both the up and down paths:

// migrations/__tests__/20260329120000_add_notifications.test.ts
import { Knex } from "knex";
import { createTestDatabase, runMigrations, rollback } from "../test-helpers";

describe("add notifications migration", () => {
  let db: Knex;

  beforeEach(async () => {
    db = await createTestDatabase();
  });

  afterEach(async () => {
    await db.destroy();
  });

  it("creates the notifications table on up", async () => {
    await runMigrations(db, "20260329120000");

    const hasTable = await db.schema.hasTable("notifications");
    expect(hasTable).toBe(true);

    const columns = await db("notifications").columnInfo();
    expect(columns).toHaveProperty("id");
    expect(columns).toHaveProperty("user_id");
    expect(columns).toHaveProperty("type");
    expect(columns).toHaveProperty("read");
  });

  it("drops the notifications table on down", async () => {
    await runMigrations(db, "20260329120000");
    await rollback(db, "20260329120000");

    const hasTable = await db.schema.hasTable("notifications");
    expect(hasTable).toBe(false);
  });

  it("preserves existing data in other tables", async () => {
    // Seed test data
    await db("users").insert({ id: "test-user", email: "test@example.com" });

    await runMigrations(db, "20260329120000");
    await rollback(db, "20260329120000");

    const users = await db("users").select("*");
    expect(users).toHaveLength(1);
  });
});

These tests catch issues like migrations that silently drop data during rollback, or down functions that fail because they do not account for foreign key constraints.

Multi-Database Support

If your project uses multiple databases — say PostgreSQL for transactional data and MongoDB for analytics — you need skills that handle both:

openclaw skill install postgresql-migrations
openclaw skill install mongodb-migrations
openclaw skill install multi-db-coordinator

The multi-db-coordinator skill teaches your agent to manage migrations across databases. It handles cross-database migration ordering, ensuring that the PostgreSQL migration creating a users table runs before the MongoDB migration that references user IDs.
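A configuration for this could extend the same migration-config.yaml shown earlier. The keys below are illustrative, not the skill's actual schema — consult the multi-db-coordinator skill's documentation for the real format.

```yaml
# .openclaw/migration-config.yaml — hypothetical coordinator section
migrations:
  coordinator:
    databases:
      - name: postgres-main
        tool: knex
      - name: mongo-analytics
        tool: mongodb
    # Cross-database ordering: entries run top to bottom, so the
    # PostgreSQL users table exists before MongoDB references user IDs.
    ordering:
      - postgres-main/20260329115000_create_users
      - mongo-analytics/20260329120000_add_analytics_indexes
```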

For MongoDB, migrations look different since there is no rigid schema:

// mongodb-migrations/20260329120000_add_analytics_indexes.ts
import { Db } from "mongodb";

export async function up(db: Db): Promise<void> {
  // Create collection with schema validation
  await db.createCollection("page_views", {
    validator: {
      $jsonSchema: {
        bsonType: "object",
        required: ["userId", "path", "timestamp"],
        properties: {
          userId: { bsonType: "string" },
          path: { bsonType: "string" },
          timestamp: { bsonType: "date" },
          duration: { bsonType: "int" },
          metadata: { bsonType: "object" },
        },
      },
    },
  });

  // Create indexes for common query patterns
  await db.collection("page_views").createIndex(
    { userId: 1, timestamp: -1 },
    { name: "idx_user_timeline" }
  );

  await db.collection("page_views").createIndex(
    { timestamp: 1 },
    { expireAfterSeconds: 7776000, name: "idx_ttl_90_days" }
  );
}

export async function down(db: Db): Promise<void> {
  await db.dropCollection("page_views");
}

The agent adds schema validation on the MongoDB collection — a best practice that many teams skip — and includes a TTL index for automatic data expiration. These are patterns the MongoDB migration skill encodes.

Production Deployment Checklist

The migration skills also help with deployment. When you tell your agent you are preparing a migration for production, it generates a deployment checklist:

## Migration Deployment Checklist

- [ ] All migrations have tested `up` and `down` functions
- [ ] Migrations run successfully on a copy of production data
- [ ] No migrations lock tables for extended periods
- [ ] New columns have defaults (won't break existing queries)
- [ ] Indexes are created concurrently where supported
- [ ] Rollback plan is documented and tested
- [ ] Database backup completed before deployment
- [ ] Application code is backward-compatible with pre-migration schema

This checklist comes from the db-migration-patterns skill and adapts based on the specific migrations being deployed. If your migration includes a table lock, the checklist adds a note about expected downtime. If you are creating an index on a large table, it reminds you to use CREATE INDEX CONCURRENTLY on PostgreSQL.

Database migrations do not have to be stressful. With the right OpenClaw skills, your agent generates safe, tested, reversible migrations that follow production-proven patterns. Browse the full set of database skills in the OpenClaw Bazaar skills directory to find the right combination for your stack.
