Atlas Python Script Mode for SQLAlchemy
Python Script Mode
In Python Script Mode, the provider runs as a Python script that loads your SQLAlchemy models and prints their DDL for Atlas to read. This mode is useful when your models are spread across different files, or when you want more control over the execution environment.
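For instance, a project whose models live in several modules only needs a small loader script that imports them all and hands them to the provider. The sketch below is illustrative only; the module paths are hypothetical placeholders, and the full setup is covered step by step in the sections that follow:

# Hypothetical module paths -- replace with your own packages.
from myapp.accounts.models import User, Address
from myapp.billing.models import Invoice

from atlas_provider_sqlalchemy.ddl import print_ddl

# Print the DDL of all models to stdout so Atlas can read the desired schema.
print_ddl("mysql", [User, Address, Invoice])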
Installation
- macOS + Linux: To download and install the latest release of the Atlas CLI, run the following in your terminal:

curl -sSf https://atlasgo.sh | sh

- Homebrew: Get the latest release with Homebrew:

brew install ariga/tap/atlas

- Docker: Pull the Atlas image and run it as a Docker container:

docker pull arigaio/atlas
docker run --rm arigaio/atlas --help

If the container needs access to the host network or a local directory, use the --net=host flag and mount the desired directory:

docker run --rm --net=host \
  -v $(pwd)/migrations:/migrations \
  arigaio/atlas migrate apply \
  --url "mysql://root:pass@:3306/test"

- Windows / Manual Installation: Download the latest release and move the atlas binary to a file location on your system PATH.
Install the provider by running:
pip install atlas-provider-sqlalchemy
Setup
- Create a file called load_models.py where you import your SQLAlchemy models and use the Atlas provider to generate the DDL:
from models import User, Address
from atlas_provider_sqlalchemy.ddl import print_ddl
print_ddl("mysql", [User, Address])
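For reference, this guide assumes a models.py along the lines of the SQLAlchemy 2.0 tutorial models. The sketch below is illustrative and matches the user_account and address tables that appear in the generated migration later on; any declarative models will work:

from typing import List, Optional

from sqlalchemy import ForeignKey, String
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, relationship


class Base(DeclarativeBase):
    pass


class User(Base):
    __tablename__ = "user_account"

    id: Mapped[int] = mapped_column(primary_key=True)
    name: Mapped[str] = mapped_column(String(30))
    fullname: Mapped[Optional[str]] = mapped_column(String(30))
    addresses: Mapped[List["Address"]] = relationship(
        back_populates="user", cascade="all, delete-orphan"
    )


class Address(Base):
    __tablename__ = "address"

    id: Mapped[int] = mapped_column(primary_key=True)
    email_address: Mapped[str] = mapped_column(String(30))
    user_id: Mapped[int] = mapped_column(ForeignKey("user_account.id"))
    user: Mapped["User"] = relationship(back_populates="addresses")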
- In your project directory, create a new file named atlas.hcl with the contents below that match your target database. The dialect passed to print_ddl above should likewise match the database you choose.

PostgreSQL:
data "external_schema" "sqlalchemy" {
program = [
"python3",
"load_models.py"
]
}
env "sqlalchemy" {
src = data.external_schema.sqlalchemy.url
dev = "docker://postgres/16/dev?search_path=public"
migration {
dir = "file://migrations"
}
format {
migrate {
diff = "{{ sql . " " }}"
}
}
}
data "external_schema" "sqlalchemy" {
program = [
"python3",
"load_models.py"
]
}
env "sqlalchemy" {
src = data.external_schema.sqlalchemy.url
dev = "docker://mysql/8/dev"
migration {
dir = "file://migrations"
}
format {
migrate {
diff = "{{ sql . " " }}"
}
}
}
data "external_schema" "sqlalchemy" {
program = [
"python3",
"load_models.py"
]
}
env "sqlalchemy" {
src = data.external_schema.sqlalchemy.url
dev = "docker://mariadb/latest/dev"
migration {
dir = "file://migrations"
}
format {
migrate {
diff = "{{ sql . " " }}"
}
}
}
data "external_schema" "sqlalchemy" {
program = [
"python3",
"load_models.py"
]
}
env "sqlalchemy" {
src = data.external_schema.sqlalchemy.url
dev = "sqlite://dev?mode=memory"
migration {
dir = "file://migrations"
}
format {
migrate {
diff = "{{ sql . " " }}"
}
}
}
data "external_schema" "sqlalchemy" {
program = [
"python3",
"load_models.py"
]
}
env "sqlalchemy" {
src = data.external_schema.sqlalchemy.url
dev = "docker://sqlserver/2022-latest"
migration {
dir = "file://migrations"
}
format {
migrate {
diff = "{{ sql . " " }}"
}
}
}
- Run Migration Planning. After creating the atlas.hcl and load_models.py files, you can use Atlas to plan migrations:
atlas migrate diff --env sqlalchemy
Running this command will generate files similar to these in the migrations directory:
migrations
|-- 20230918143104.sql
`-- atlas.sum
0 directories, 2 files
Examining the contents of 20230918143104.sql:
-- Create "user_account" table
CREATE TABLE `user_account` (
`id` int NOT NULL AUTO_INCREMENT,
`name` varchar(30) NOT NULL,
`fullname` varchar(30) NULL,
PRIMARY KEY (`id`)
) CHARSET utf8mb4 COLLATE utf8mb4_0900_ai_ci;
-- Create "address" table
CREATE TABLE `address` (
`id` int NOT NULL AUTO_INCREMENT,
`email_address` varchar(30) NOT NULL,
`user_id` int NOT NULL,
PRIMARY KEY (`id`),
INDEX `user_id` (`user_id`),
CONSTRAINT `address_ibfk_1` FOREIGN KEY (`user_id`) REFERENCES `user_account` (`id`) ON UPDATE NO ACTION ON DELETE NO ACTION
) CHARSET utf8mb4 COLLATE utf8mb4_0900_ai_ci;
Amazing! Atlas automatically generated a migration file that will create the user_account and address tables in our database.
Next, alter the User model to add a new age column:
  fullname: Mapped[Optional[str]] = mapped_column(String(30))
+ age: Mapped[Optional[int]] = mapped_column()
  addresses: Mapped[List["Address"]] = relationship(
      back_populates="user", cascade="all, delete-orphan"
  )
Re-run this command:
atlas migrate diff --env sqlalchemy
Observe that a new migration file is generated in the migrations directory:
-- Modify "user_account" table
ALTER TABLE `user_account` ADD COLUMN `age` int NULL;
Applying migrations
You can now apply the pending migrations to your production database by running the following command:
atlas migrate apply --env sqlalchemy --url "YOUR_PRODUCTION_DB_URL"
Next Steps
Now that your project is set up, choose between the two workflows offered by Atlas for generating and planning migrations. Select the one that works best for you:
- Declarative Migrations: Set up a Terraform-like workflow where each migration is calculated as the diff between your desired state and the current state of the database.
- Versioned Migrations: Set up a migration directory for your project, creating a version-controlled source of truth for your database schema.