OpenAI Codex with Atlas

OpenAI Codex is OpenAI's agentic coding tool. It runs in the terminal (CLI), VS Code, and as a standalone app. Codex supports agent skills natively via SKILL.md files.

Atlas offers two levels of integration with Codex, from a simple instructions file to a portable agent skill.

Option 1: Instructions File (Basic)

This is the simplest way to integrate Atlas with Codex: create an `AGENTS.md` file in your project root that teaches Codex about Atlas concepts, commands, and workflows.

Full `AGENTS.md` content:
AGENTS.md
# Atlas Database Schema Management

Atlas is a language-independent tool for managing and migrating database schemas using modern DevOps principles.

## Quick Reference

```bash
atlas schema inspect --env <name>
atlas schema validate --env <name>
atlas migrate status --env <name>
atlas migrate diff --env <name>
atlas migrate lint --env <name> --latest 1
atlas migrate apply --env <name>
atlas whoami
```

## Core Concepts and Configurations

### Configuration File Structure
Atlas uses `atlas.hcl` configuration files with the following structure:

```hcl
env "<name>" {
url = getenv("DATABASE_URL")
dev = "docker://postgres/15/dev?search_path=public"

migration {
dir = "file://migrations"
}

schema {
src = "file://schema.hcl"
}
}
```

### Dev Database
Atlas uses a temporary "dev-database" to process and validate schemas. The URL format depends on whether the project uses a single schema or multiple schemas:

```
# Schema-scoped (single schema — most common)
--dev-url "docker://mysql/8/dev"
--dev-url "docker://postgres/15/dev?search_path=public"
--dev-url "sqlite://dev?mode=memory"
--dev-url "docker://sqlserver/2022-latest/dev?mode=schema"

# Database-scoped (multiple schemas, extensions, or event triggers)
--dev-url "docker://mysql/8"
--dev-url "docker://postgres/15/dev"
--dev-url "docker://sqlserver/2022-latest/dev?mode=database"

# PostGIS / pgvector
--dev-url "docker://postgis/latest/dev?search_path=public"
--dev-url "docker://pgvector/pg17/dev?search_path=public"
```

**Important:** Using the wrong scope causes errors (`ModifySchema is not allowed`) or silently drops database-level objects from migrations. Match the dev URL scope to the project's target database URL.
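For example, matched pairs for a hypothetical Postgres project look like this (URLs are illustrative only):

```bash
# Single-schema project: both URLs are schema-scoped (search_path set)
atlas migrate diff \
  --dir "file://migrations" \
  --to "postgres://localhost:5432/app?search_path=public" \
  --dev-url "docker://postgres/15/dev?search_path=public" \
  "example"

# Multi-schema project: both URLs are database-scoped (no search_path)
atlas migrate diff \
  --dir "file://migrations" \
  --to "postgres://localhost:5432/app" \
  --dev-url "docker://postgres/15/dev" \
  "example"
```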

See https://atlasgo.io/concepts/dev-database for additional drivers and options.

### Environment Variables and Security

**DO**: Use secure configuration patterns
```hcl
// Using environment variables (recommended)
env "<name>" {
  url = getenv("DATABASE_URL")
}

// Using external data sources
data "external" "envfile" {
  program = ["npm", "run", "envfile.js"]
}

locals {
  envfile = jsondecode(data.external.envfile)
}

env "<name>" {
  url = local.envfile.DATABASE_URL
}
```

**DON'T**: Hardcode sensitive values
```hcl
// Never do this
env "prod" {
url = "postgres://user:password123@prod-host:5432/database"
}
```

### Schema Sources

#### HCL Schema
```hcl
data "hcl_schema" "<name>" {
path = "schema.hcl"
}

env "<name>" {
schema {
src = data.hcl_schema.<name>.url
}
}
```

#### External Schema (ORM Integration)
The `external_schema` data source imports SQL schema from an external program.

```hcl
data "external_schema" "drizzle" {
program = ["npx", "drizzle-kit", "export"]
}

data "external_schema" "django" {
program = ["python", "manage.py", "atlas-provider-django", "--dialect", "postgresql"]
}

env "<name>" {
schema {
src = data.external_schema.django.url
}
}
```

**Important:** The output must be a complete SQL schema (not a diff). If errors occur, run the program directly to isolate the issue.
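For example, to isolate a provider failure with the Drizzle data source shown above, run the same command by itself (a sketch; assumes the provider is installed):

```bash
# The provider should print full CREATE statements, not a diff.
# Any error shown here is the provider's, not Atlas's.
npx drizzle-kit export
```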

#### Composite Schema (Pro)
Combine multiple schemas into one. Requires `atlas login`.

```hcl
data "composite_schema" "app" {
schema "users" {
url = data.external_schema.auth_service.url
}
schema "graph" {
url = "ent://ent/schema"
}
}

env "<name>" {
schema {
src = data.composite_schema.app.url
}
}
```

## Common Workflows

### 1. Schema Inspection / Visualization

1. Start by listing tables — don't inspect the entire schema at once for large databases.
2. Default output is HCL. Use `--format "{{ json . }}"` for JSON or `--format "{{ sql . }}"` for SQL.
3. Use `--include`/`--exclude` to filter specific tables or objects.

**Inspect the environment's schema source (`env://src`):**
```bash
atlas schema inspect --env <name> --url "env://src" --format "{{ sql . }}"
atlas schema inspect --env <name> --url "env://src" --format "{{ json . }}" | jq ".schemas[].tables[].name"
```

**Inspect the environment's target database:**
```bash
atlas schema inspect --env <name> --format "{{ sql . }}"
atlas schema inspect --env <name> --include "users" --format "{{ sql . }}"
```

**Inspect migration directory:**
```bash
atlas schema inspect --env <name> --url file://migrations --format "{{ sql . }}"
```

Add `-w` to open a web-based ERD visualization (requires `atlas login`).

### 2. Migration Status

Compare applied migrations against the migrations directory. Use this only when you know which database is the target.

```bash
atlas migrate status --env <name>
atlas migrate status --dir file://migrations --url <url>
```

### 3. Migration Generation / Diffing

```bash
atlas migrate diff --env <name> "add_user_table"

atlas migrate diff \
  --dir file://migrations \
  --dev-url docker://postgres/15/dev \
  --to file://schema.hcl \
  "add_user_table"
```

**Configuration for migration generation:**
```hcl
env "<name>" {
dev = "docker://postgres/15/dev?search_path=public"

migration {
dir = "file://migrations"
}

schema {
src = "file://schema.hcl"
# Or: src = data.external_schema.<name>.url
# Or: src = getenv("DATABASE_URL")
}
}
```

### 4. Migration Linting

```bash
atlas migrate lint --env <name> --latest 1
atlas migrate lint --env <name> --latest 3
atlas migrate lint --env ci
```

**Linting configuration:**
```hcl
lint {
  destructive {
    error = false // Allow destructive changes with warnings
  }
}

env "ci" {
  lint {
    git {
      base = "main"
    }
  }
}
```

To suppress a specific lint error, add `-- atlas:nolint` before the SQL statement.
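For example, a sketch that adds an exempted statement to a hypothetical migration file (the `destructive` check name matches the lint configuration above):

```bash
# File name and statement are examples only; rehash after editing
cat >> migrations/20240101120000_cleanup.sql <<'SQL'
-- atlas:nolint destructive
DROP TABLE legacy_audit_log;
SQL
atlas migrate hash --env dev
```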

> **Important:** When fixing migration issues:
> - **Unapplied migrations:** Edit the file, then run `atlas migrate hash --env "<name>"`
> - **Applied migrations:** Never edit directly. Create a new corrective migration instead.
> - **Never use `-- atlas:nolint` without properly fixing the issue or getting user approval.**
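
A sketch of the two paths (file name and env name are hypothetical):

```bash
# Unapplied migration: edit it, then repair the checksum file
"$EDITOR" migrations/20240101120000_add_email.sql
atlas migrate hash --env dev

# Applied migration: leave it untouched and generate a corrective one
atlas migrate diff --env dev "fix_add_email"
```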

### 5. Applying Changes

**Versioned (migration files):**
```bash
atlas migrate apply --env <name> --dry-run   # Always preview first
atlas migrate apply --env <name>
```

**Declarative (direct apply — fast local iteration):**
```bash
atlas schema apply --env <name> --dry-run    # Preview changes
atlas schema apply --env <name>              # Apply directly to database
```

Use `schema apply` for fast edit-apply cycles on a local database without generating migration files. Add `--auto-approve` to skip the confirmation prompt during development.

## Troubleshooting

```bash
atlas version
atlas whoami
atlas migrate hash --env <name>
```

**Missing driver error**: Either `--url` or `--dev-url` is missing or incorrect.

## Key Reminders

1. **Always read `atlas.hcl` first** — use environment names from config
2. **Never hardcode database URLs** — use `getenv()` or secure data sources
3. **Run `atlas schema validate`** after editing schema files
4. **Run `atlas migrate hash`** after manually editing migration files
5. **Use `atlas migrate lint`** to validate migrations before applying
6. **Always use `--dry-run`** before applying migrations
7. **Use `--include`/`--exclude`** to filter tables in schema inspection
8. **Never ask for sensitive information** such as passwords or database URLs
9. **Never ignore linting errors** — fix them or get user approval
10. **Inspect schemas at high level first** — schemas might be very large
11. **Only use atlas commands listed here** — other commands may not be supported
12. **Prefer `atlas schema inspect`** over reading migration files directly

Option 2: Agent Skill (Portable)

Agent Skills are a modern standard for packaging domain expertise for AI agents. Unlike an instructions file that loads on every conversation, skills activate only when relevant — keeping your context window clean.

Create a skill directory and add the SKILL.md file:

```bash
mkdir -p ~/.codex/skills/atlas/references
```
~/.codex/skills/atlas/SKILL.md
---
name: atlas
description: "Database schema management and migrations with Atlas CLI. Use when: generating migrations, diffing schemas, linting or testing migrations, applying schema changes, inspecting databases, working with atlas.hcl, schema.hcl, or ORM schemas (GORM, Drizzle, SQLAlchemy, Django, Ent, Sequelize, TypeORM), or validating schema definitions."
---

# Atlas Schema Migrations

## Security

Never hardcode credentials. Use environment variables:

```hcl
env "prod" {
url = getenv("DATABASE_URL")
}
```

## Quick Reference

Use `--help` on any command for comprehensive docs and examples:
```bash
atlas migrate diff --help
```

Always use `--env` to reference configurations from `atlas.hcl` — this avoids passing
database credentials to the LLM context.
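
For example (URL is hypothetical):

```bash
# Avoid: credentials end up in the command line and the agent's context
atlas schema inspect --url "postgres://user:secret@prod-host:5432/app"

# Prefer: the env block resolves the URL via getenv("DATABASE_URL")
atlas schema inspect --env dev
```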

```bash
# Common
atlas schema inspect --env <name>                  # Inspect schema
atlas schema validate --env <name>                 # Validate schema syntax/semantics
atlas schema diff --env <name>                     # Compare schemas
atlas schema lint --env <name>                     # Check schema policies
atlas schema test --env <name>                     # Test schema

# Declarative workflow
atlas schema plan --env <name>                     # Plan schema changes
atlas schema apply --env <name> --dry-run          # Preview changes
atlas schema apply --env <name>                    # Apply schema changes

# Versioned workflow
atlas migrate diff --env <name> "migration_name"   # Generate migration
atlas migrate lint --env <name> --latest 1         # Validate migration
atlas migrate test --env <name>                    # Test migration
atlas migrate apply --env <name> --dry-run         # Preview changes
atlas migrate apply --env <name>                   # Apply migration
atlas migrate status --env <name>                  # Check status
```

## Choosing a Workflow

```
Schema change needed
├─ Project has migrations/ dir or migration config in atlas.hcl?
│   ├─ Yes → Versioned: migrate diff → lint → test → apply
│   └─ No  → Declarative: schema apply --dry-run → apply
├─ Iterating on local database?
│   └─ Use schema apply --auto-approve for fast edit-apply cycles
└─ Not sure → Read atlas.hcl first
```

**Tip:** `atlas schema apply` applies schema changes directly to a local database without generating migration files. This is useful for fast iteration during development — edit the schema, run `schema apply`, and see the result immediately.
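
A minimal sketch of that loop, assuming an env named `local` that points at a disposable database:

```bash
"$EDITOR" schema.hcl                            # edit the desired state
atlas schema apply --env local --auto-approve   # apply immediately
atlas schema inspect --env local                # confirm the result
```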

## Example

```
User: Add an email column to the users table

Agent steps:
1. atlas schema inspect --env dev             # understand current state
2. Edit schema source file                    # add email column
3. atlas schema validate --env dev            # verify syntax
4. atlas migrate diff --env dev "add_email"   # generate migration
5. atlas migrate lint --env dev --latest 1    # check for issues
6. atlas migrate apply --env dev --dry-run    # preview before applying
```

## Core Concepts

### Configuration File (atlas.hcl)

Always read the project's `atlas.hcl` first — it contains environment configurations:

```hcl
env "<name>" {
url = getenv("DATABASE_URL")
dev = "docker://postgres/15/dev?search_path=public"

migration {
dir = "file://migrations"
}

schema {
src = "file://schema.hcl"
}
}
```

### Dev Database

Atlas uses a temporary "dev-database" to process and validate schemas. The URL format depends on whether you work with a **single schema** or **multiple schemas**:

```bash
# Schema-scoped (single schema — most common)
--dev-url "docker://mysql/8/dev"
--dev-url "docker://postgres/15/dev?search_path=public"
--dev-url "sqlite://dev?mode=memory"
--dev-url "docker://sqlserver/2022-latest/dev?mode=schema"

# Database-scoped (multiple schemas, extensions, or event triggers)
--dev-url "docker://mysql/8"
--dev-url "docker://postgres/15/dev"
--dev-url "docker://sqlserver/2022-latest/dev?mode=database"
```

**Important:** Using the wrong scope causes errors (`ModifySchema is not allowed`) or silently drops database-level objects (extensions, event triggers) from migrations. Match the dev URL scope to the project's target database URL. For PostGIS or pgvector schemas, use `docker://postgis/latest/dev` or `docker://pgvector/pg17/dev`.

If the schema depends on extensions or external objects, use a `docker` block with a `baseline`:
```hcl
docker "postgres" "dev" {
image = "postgres:15"
schema = "public"
baseline = <<SQL
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";
SQL
}

env "local" {
src = "file://schema.hcl"
dev = docker.postgres.dev.url
}
```

## Workflows

### 1. Schema Inspection

Start with a high-level overview before diving into details. The default output is HCL.
Use `--format "{{ json . }}"` for JSON or `--format "{{ sql . }}"` for SQL.

```bash
# List tables (overview first, JSON output)
atlas schema inspect --env <name> --format "{{ json . }}" | jq ".schemas[].tables[].name"

# Full SQL schema
atlas schema inspect --env <name> --format "{{ sql . }}"

# Filter with --include/--exclude (useful for large schemas)
atlas schema inspect --env <name> --include "users_*"           # Only matching tables
atlas schema inspect --env <name> --exclude "*_backup"          # Skip matching tables
atlas schema inspect --env <name> --exclude "*[type=trigger]"   # Skip triggers

# Open visual ERD in browser (requires atlas login)
atlas schema inspect --env <name> -w
```

### 2. Schema Comparison (Diff)

Compare any two schema states:

```bash
# Compare current state to desired schema
atlas schema diff --env <name>

# Compare specific sources
atlas schema diff --env <name> --from file://migrations --to file://schema.hcl
```

### 3. Migration Generation

Generate migrations from schema changes:

```bash
# Generate migration from schema diff
atlas migrate diff --env <name> "add_users_table"

# With explicit parameters
atlas migrate diff \
  --dir file://migrations \
  --dev-url docker://postgres/15/dev \
  --to file://schema.hcl \
  "add_users_table"
```

### 4. Schema Validation

Validate schema definitions before generating migrations:

```bash
# Validate schema syntax and semantics
atlas schema validate --env <name>

# Validate against dev database
atlas schema validate --dev-url docker://postgres/15/dev --url file://schema.hcl
```

If the schema is valid, the command exits with a zero status; if invalid, it prints a detailed error (unresolved references, syntax issues, unsupported attributes).
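
This makes validation easy to gate on in scripts (env name assumed):

```bash
# Stop early if the schema does not validate
if ! atlas schema validate --env dev; then
  echo "schema validation failed" >&2
  exit 1
fi
```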

### 5. Migration Linting

```bash
atlas migrate lint --env <name> --latest 1   # Lint latest migration
atlas migrate lint --env ci                  # Lint since git branch
atlas schema lint --env <name>               # Check schema policies
```

Fixing lint issues:
- Unapplied migrations: Edit file, then `atlas migrate hash --env <name>`
- Applied migrations: Create corrective migration (never edit directly)

### 6. Migration Testing

```bash
atlas migrate test --env <name>   # Requires atlas login
atlas whoami                      # Check login status first
```

### 7. Applying Migrations

```bash
atlas migrate apply --env <name> --dry-run   # Always preview first
atlas migrate apply --env <name>             # Apply
atlas migrate status --env <name>            # Verify
```

## Standard Workflow

1. `atlas schema inspect --env <name>` — Understand current state
2. Edit schema files
3. `atlas schema validate --env <name>` — Check syntax
4. `atlas migrate diff --env <name> "change_name"` — Generate migration
5. `atlas migrate lint --env <name> --latest 1` — Validate
6. `atlas migrate test --env <name>` — Test (requires login)
7. If issues: edit migration, then `atlas migrate hash`
8. `atlas migrate apply --env <name> --dry-run` then apply

## Schema Sources

For HCL schemas, ORM integrations (GORM, Drizzle, SQLAlchemy, Django, Ent, Sequelize, TypeORM),
composite schemas, and dev-database dialect URLs, see `references/schema-sources.md`.

## Onboarding an Existing Project

### Baseline an existing database

To start managing an existing database with versioned migrations:

```bash
# 1. Export current schema to code
atlas schema inspect -u '<database-url>' --format '{{ sql . | split | write "src" }}'

# 2. Generate a baseline migration from the exported schema
atlas migrate diff "baseline" --to "file://src" --dev-url '<dev-url>'

# 3. Mark baseline as applied on existing databases (use version from filename)
atlas migrate apply --url '<database-url>' --baseline '<version>'
```

The baseline migration captures the current state without executing it on existing databases.
On new databases, it runs in full to create the initial schema.
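
As a concrete sketch of step 3, with a hypothetical version taken from a generated file named `migrations/20240101120000_baseline.sql`:

```bash
atlas migrate apply \
  --url "postgres://localhost:5432/app?sslmode=disable" \
  --dir "file://migrations" \
  --baseline "20240101120000"
```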

## Troubleshooting

```bash
# Check installation and login
atlas version
atlas whoami

# Repair migration integrity after manual edits
atlas migrate hash --env <name>
```

**Missing driver error**: Ensure `--url` or `--dev-url` is correctly specified.

## Key Rules

1. Read `atlas.hcl` first — use environment names from config
2. Never hardcode credentials — use `getenv()`
3. Run `atlas schema validate` after schema edits
4. Always lint before applying migrations
5. Always dry-run before applying
6. Run `atlas migrate hash` after editing migration files
7. Use `atlas login` to unlock views, triggers, functions, ERD, and migration testing
8. Write migration tests for data migrations
9. Never ignore lint errors — fix them or get user approval

## Documentation

- [CLI Reference](https://atlasgo.io/cli-reference)
- [Versioned Migrations](https://atlasgo.io/versioned/diff)
- [Declarative Workflow](https://atlasgo.io/declarative/apply)
- [Migration Linting](https://atlasgo.io/versioned/lint)
- [Migration Testing](https://atlasgo.io/testing/migrate)
- [Onboard Existing Database](https://atlasgo.io/versioned/import)
- [ORM Integrations](https://atlasgo.io/guides/orms)
- [Dev Database](https://atlasgo.io/concepts/dev-database)
~/.codex/skills/atlas/references/schema-sources.md
# Schema Sources Reference

## HCL Schema

```hcl
data "hcl_schema" "<name>" {
path = "schema.hcl"
}

env "<name>" {
schema {
src = data.hcl_schema.<name>.url
}
}
```

## External Schema (ORM Integration)

The `external_schema` data source imports SQL schema from an ORM or external program.

```hcl
# GORM (Go)
data "external_schema" "gorm" {
program = ["go", "run", "-mod=mod", "ariga.io/atlas-provider-gorm", "load", "--path", "./models", "--dialect", "postgres"]
}

# Drizzle (TypeScript)
data "external_schema" "drizzle" {
program = ["npx", "drizzle-kit", "export"]
}

# SQLAlchemy (Python)
data "external_schema" "sqlalchemy" {
program = ["python", "-m", "atlas_provider_sqlalchemy", "--path", "./models", "--dialect", "postgresql"]
}

# Django (Python)
data "external_schema" "django" {
program = ["python", "manage.py", "atlas-provider-django", "--dialect", "postgresql"]
}

# Ent (Go)
env "<name>" {
schema {
src = "ent://ent/schema"
}
}

# Sequelize (Node.js)
data "external_schema" "sequelize" {
program = ["npx", "@ariga/atlas-provider-sequelize", "load", "--path", "./models", "--dialect", "postgres"]
}

# TypeORM (TypeScript)
data "external_schema" "typeorm" {
program = ["npx", "@ariga/atlas-provider-typeorm", "load", "--path", "./entities", "--dialect", "postgres"]
}
```

Wire into an environment:
```hcl
env "<name>" {
schema {
src = data.external_schema.<orm>.url
}
}
```

## Composite Schema (Pro)

Combine multiple schema sources into one:

```hcl
data "composite_schema" "app" {
schema "users" {
url = data.external_schema.auth_service.url
}
schema "graph" {
url = "ent://ent/schema"
}
schema "shared" {
url = "file://schema/shared.hcl"
}
}
```

## Dev-Database Dialects

The dev URL format depends on whether your project uses **schema-scoped** or **database-scoped** migrations. Getting this wrong causes errors like `ModifySchema is not allowed` or silently drops database-level objects (extensions, event triggers) from migrations.

**Schema-scoped** (single schema — most common): include the database name and schema scope so Atlas creates objects in the correct schema. Use this when all tables live in one schema (e.g., `public`).

| Dialect | Dev URL (schema-scoped) |
|------------|------------------------------------------------------|
| MySQL | `docker://mysql/8/dev` |
| MariaDB | `docker://maria/latest/dev` |
| PostgreSQL | `docker://postgres/17/dev?search_path=public` |
| SQLite | `sqlite://dev?mode=memory` |
| SQL Server | `docker://sqlserver/2022-latest/dev?mode=schema` |
| ClickHouse | `docker://clickhouse/23.11/dev` |

**Database-scoped** (multiple schemas or database-level objects): omit the schema scope so Atlas can manage multiple schemas and detect database-level objects like extensions and event triggers.

| Dialect | Dev URL (database-scoped) |
|------------|------------------------------------------------------|
| MySQL | `docker://mysql/8` |
| MariaDB | `docker://maria/latest` |
| PostgreSQL | `docker://postgres/17/dev` |
| SQL Server | `docker://sqlserver/2022-latest/dev?mode=database` |
| ClickHouse | `docker://clickhouse/23.11` |

**PostgreSQL with extensions** — use PostGIS or pgvector images when the schema uses those extensions:
```
docker://postgis/latest/dev?search_path=public
docker://pgvector/pg17/dev?search_path=public
```

**How to choose:** Check the project's `atlas.hcl` or target database URL. If it includes `search_path=public` (Postgres) or a specific database name (MySQL), use schema-scoped. If the project manages multiple schemas, extensions, or event triggers, use database-scoped.

See https://atlasgo.io/concepts/dev-database for additional drivers and options.

Codex will automatically load the skill when database operations are requested.

The skill includes:

  • Decision tree for choosing Declarative vs Versioned workflows
  • Step-by-step workflows for inspect → diff → lint → validate → test → apply
  • atlas schema validate for verifying schema correctness after AI-generated edits
  • ORM integration references (GORM, Drizzle, SQLAlchemy, Django, Ent, Sequelize, TypeORM)
  • Security best practices (never hardcode credentials)
  • Troubleshooting guides

See the Agent Skills page for more details.