
Database Schema Migration Skill for AI Agents

Agent Skills are an open standard for packaging domain expertise for AI agents. Introduced by Anthropic and adopted by OpenAI Codex CLI, Cursor, and other coding agents, skills enable AI to perform specialized tasks using deterministic tooling.

AI agents can edit schema files the same way they edit code, but generating migrations is a different problem. Migrations require understanding object dependencies and safe change ordering across the entire schema: tables, views, functions, triggers, and constraints all depend on one another. Schema changes must be validated before they are applied, and they must follow organizational policies such as naming conventions, compliance requirements, and team standards. Most importantly, migrations must be deterministic: the same schema change should always produce the same migration, regardless of which model or prompt generated it.

This skill teaches AI agents to manage the entire migration lifecycle: from diff generation and migration creation, through linting and testing, to safe production deployment. It covers multiple dialects (MySQL, PostgreSQL, SQLite, SQL Server, ClickHouse, and more) and integrates with ORMs such as GORM, Drizzle, SQLAlchemy, and Django.

Installation

OpenAI Codex CLI supports skills natively. Create the skill directory:

mkdir -p ~/.codex/skills/atlas

Then create ~/.codex/skills/atlas/SKILL.md with the following content:
---
name: atlas-schema-migrations
description: "Database schema migrations and schema-as-code operations. Use when: (1) Generating migration files from schema changes, (2) Diffing or comparing database schemas, (3) Linting migrations for safety and policy compliance, (4) Testing migrations before committing, (5) Applying migrations to databases, (6) Inspecting or validating schema definitions."
version: 1.0.0
---

# Atlas Schema Migrations

## Security

Never hardcode credentials. Use environment variables:

```hcl
env "prod" {
  url = getenv("DATABASE_URL")
}
```

## Quick Reference

Use `--help` on any command for comprehensive docs and examples:
```bash
atlas migrate diff --help
```

Always use `--env` to reference configurations from `atlas.hcl` - this avoids passing
database credentials to the LLM context.

```bash
# Common
atlas schema inspect --env <name> # Inspect schema
atlas schema validate --env <name> # Validate schema syntax/semantics
atlas schema diff --env <name> # Compare schemas
atlas schema lint --env <name> # Check schema policies
atlas schema test --env <name> # Test schema

# Declarative workflow
atlas schema plan --env <name> # Plan schema changes
atlas schema apply --env <name> --dry-run # Preview changes
atlas schema apply --env <name> # Apply schema changes

# Versioned workflow
atlas migrate diff --env <name> "migration_name" # Generate migration
atlas migrate lint --env <name> --latest 1 # Validate migration
atlas migrate test --env <name> # Test migration
atlas migrate apply --env <name> --dry-run # Preview changes
atlas migrate apply --env <name> # Apply migration
atlas migrate status --env <name> # Check status
```

## Core Concepts

### Configuration File (atlas.hcl)

Always read the project's `atlas.hcl` first - it contains environment configurations:

```hcl
env "<name>" {
  url = getenv("DATABASE_URL")
  dev = "docker://postgres/15/dev?search_path=public"

  migration {
    dir = "file://migrations"
  }

  schema {
    src = "file://schema.hcl"
  }
}
```

### Dev Database

Atlas uses a temporary "dev-database" to process and validate schemas. Common configurations:

```bash
--dev-url "docker://mysql/8/dev"
--dev-url "docker://postgres/15/db_name?search_path=public"
--dev-url "sqlite://dev?mode=memory"
```

## Workflows

### 1. Schema Inspection

Start with a high-level overview before diving into details:

```bash
# Login (required for Pro features like views, triggers, functions)
atlas login

# List tables (overview first)
atlas schema inspect --env <name> --format "{{ json . }}" | jq ".schemas[].tables[].name"

# Full SQL schema
atlas schema inspect --env <name> --format "{{ sql . }}"

# Filter with --include/--exclude (useful for large schemas)
atlas schema inspect --env <name> --include "users_*" # Only matching tables
atlas schema inspect --env <name> --exclude "*_backup" # Skip matching tables
atlas schema inspect --env <name> --exclude "*[type=trigger]" # Skip triggers

# Open visual ERD in browser
atlas schema inspect --env <name> -w
```

### 2. Schema Comparison (Diff)

Compare any two schema states:

```bash
# Compare current state to desired schema
atlas schema diff --env <name>

# Compare specific sources
atlas schema diff --env <name> --from file://migrations --to file://schema.hcl
```

### 3. Migration Generation

Generate migrations from schema changes:

```bash
# Generate migration from schema diff
atlas migrate diff --env <name> "add_users_table"

# With explicit parameters
atlas migrate diff \
  --dir file://migrations \
  --dev-url docker://postgres/15/dev \
  --to file://schema.hcl \
  "add_users_table"
```
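The generated file is plain SQL. For the `add_users_table` change above, the output might look like the following sketch (illustrative only; the exact DDL depends on your schema definition and dialect):

```sql
-- migrations/20250101120000_add_users_table.sql
-- Illustrative PostgreSQL output; Atlas derives the actual DDL from the schema diff.
CREATE TABLE "users" (
  "id" bigserial PRIMARY KEY,
  "name" character varying(255) NOT NULL
);
```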

### 4. Schema Validation

**Validate schema definitions before generating migrations:**

```bash
# Validate schema syntax and semantics
atlas schema validate --env <name>

# Validate against dev database
atlas schema validate --dev-url docker://postgres/15/dev --url file://schema.hcl
```

If the schema is valid, the command exits successfully. If invalid, it prints a detailed error (unresolved references, syntax issues, unsupported attributes).

### 5. Migration Linting

```bash
atlas migrate lint --env <name> --latest 1 # Lint latest migration
atlas migrate lint --env ci # Lint changes since the git base branch (CI env)
atlas schema lint --env <name> # Check schema policies
```

Fixing lint issues:
- Unapplied migrations: Edit file, then `atlas migrate hash --env <name>`
- Applied migrations: Create corrective migration (never edit directly)
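Lint policies can be tightened in `atlas.hcl`. A minimal sketch, assuming the `destructive` and `data_depend` analyzer names from Atlas's lint configuration (verify rule names against your Atlas version):

```hcl
lint {
  destructive {
    error = true # fail the lint run on DROP TABLE / DROP COLUMN
  }
  data_depend {
    error = true # fail on changes whose safety depends on existing data
  }
}
```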

### 6. Migration Testing

```bash
atlas migrate test --env <name> # Requires Atlas Pro
atlas whoami # Check login status first
```
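`migrate test` runs test files written in Atlas's HCL-based test format. A hedged sketch of what such a test might look like (block and attribute names follow Atlas's testing docs as I understand them; verify against your version before relying on them):

```hcl
test "migrate" "users_seed" {
  # migrate up to the version just before the change under test
  migrate {
    to = "20250101120000"
  }
  # seed data the next migration must handle
  exec {
    sql = "INSERT INTO users (name) VALUES ('ada')"
  }
  # apply the remaining migrations
  migrate {
    to = "latest"
  }
  # assert the seeded row survived
  exec {
    sql    = "SELECT count(*) FROM users"
    output = "1"
  }
}
```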

### 7. Applying Migrations

```bash
atlas migrate apply --env <name> --dry-run # Always preview first
atlas migrate apply --env <name> # Apply
atlas migrate status --env <name> # Verify
```

## Standard Workflow

1. `atlas schema inspect --env <name>` - Understand current state
2. Edit schema files
3. `atlas schema validate --env <name>` - Check syntax
4. `atlas migrate diff --env <name> "change_name"` - Generate migration
5. `atlas migrate lint --env <name> --latest 1` - Validate
6. `atlas migrate test --env <name>` - Test (Pro)
7. If issues: edit migration, then `atlas migrate hash`
8. `atlas migrate apply --env <name> --dry-run` then apply

## Schema Sources

### HCL Schema
```hcl
data "hcl_schema" "<name>" {
  path = "schema.hcl"
}
```
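A minimal schema.hcl that such a data source could point at (illustrative; column types vary by dialect):

```hcl
schema "public" {}

table "users" {
  schema = schema.public
  column "id" {
    type = bigint
  }
  column "email" {
    type = varchar(255)
    null = true
  }
  primary_key {
    columns = [column.id]
  }
}
```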

### External Schema (ORM Integration)
```hcl
data "external_schema" "drizzle" {
  program = ["npx", "drizzle-kit", "export"]
}

data "external_schema" "django" {
  program = ["python", "manage.py", "atlas-provider-django", "--dialect", "postgresql"]
}
```
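GORM, mentioned in the intro, follows the same pattern via its provider. A sketch based on the atlas-provider-gorm usage (the `--path` value is an assumption; point it at your models package):

```hcl
data "external_schema" "gorm" {
  program = [
    "go", "run", "-mod=mod",
    "ariga.io/atlas-provider-gorm",
    "load",
    "--path", "./models",
    "--dialect", "postgres",
  ]
}
```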

### Composite Schema (Pro)
```hcl
data "composite_schema" "app" {
  schema "users" {
    url = data.external_schema.auth_service.url
  }
  schema "graph" {
    url = "ent://ent/schema"
  }
}
```

## Troubleshooting

```bash
# Check installation and login
atlas version
atlas whoami

# Repair migration integrity after manual edits
atlas migrate hash --env <name>
```

**Missing driver error**: Ensure `--url` or `--dev-url` is correctly specified.

## Key Rules

1. Read `atlas.hcl` first - use environment names from config
2. Never hardcode credentials - use `getenv()`
3. Run `atlas schema validate` after schema edits
4. Always lint before applying migrations
5. Always dry-run before applying
6. Run `atlas migrate hash` after editing migration files
7. Write migration tests for data migrations
8. Write schema tests for complex logic (functions, triggers, procedures)
9. Check `atlas whoami` before Pro features
10. Never ignore lint errors

## Documentation

- https://atlasgo.io/docs
- https://atlasgo.io/cli-reference
- https://atlasgo.io/concepts/dev-database

Codex will automatically load the skill when database operations are requested.

What the Skill Covers

AI agents are great at writing code, but generating database migrations is a different challenge. As schemas grow larger and more complex, ensuring migrations are deterministic, predictable, and aligned with company policies becomes critical.

Atlas lets AI tools focus on editing the schema while providing the infrastructure for:

  • Schema Validation - verifying schema definitions are syntactically and semantically correct before generating migrations
  • Migration Generation - producing correct, safe, and deterministic migrations automatically
  • Migration Validation - ensuring migrations are semantically correct and follow best practices
  • Policy Enforcement - applying organizational rules for what changes are allowed and how they should be applied
  • Unit Testing - AI tools can write logic (functions, views, queries) along with tests, while Atlas executes them, reports failures, and lets the AI fix issues automatically
  • Data Migration Testing - AI tools generate data migrations and seed data; Atlas runs the tests, detects errors, and guides the AI in resolving them

Skills package this knowledge into a portable SKILL.md file that works across Claude, Codex, Cursor, and other agents. The agent loads the skill when database operations are requested, then follows the workflows and commands defined in the skill.

Example Usage

When a user asks the agent to add a column to a table, the agent follows the skill's workflow:

User: Add an email column to the users table

Agent: I'll use Atlas to add this column safely.

1. Inspecting current schema...
$ atlas schema inspect --env dev

2. Adding column to schema file...
[edits schema.hcl]

3. Validating schema...
$ atlas schema validate --env dev
✓ Schema is valid

4. Generating migration...
$ atlas migrate diff --env dev "add_users_email"
Created: migrations/20250101120000_add_users_email.sql

5. Linting migration...
$ atlas migrate lint --env dev --latest 1
✓ No issues found

6. Ready to apply (dry-run first)...
$ atlas migrate apply --env dev --dry-run

Full Skill Content

The full SKILL.md is reproduced in the Installation section above.