From Manual to Automated Database Schema Migrations
Software teams commonly embrace DevOps, building automated CI/CD pipelines for rapid and reliable software delivery. Surprisingly, some of those same teams still manage their database schemas by hand, a striking contrast.
Picture this: a team has spent countless hours ensuring that every change to its application code is:
- Version controlled
- Automatically tested, built, and stored in an Artifact Repository
- Automatically deployed
- Easily rolled back
Yet when it comes to making changes to their database schema, the process looks very different: a developer writes a SQL migration script, connects to the production database with privileged access, runs the script manually, and (if successful) continues with deployment. The entire process is in the hands of the developer.
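For illustration, a typical manual flow might look something like this (the table, script name, and credentials are made up purely to show the workflow):
# hand-written migration script (hypothetical change)
cat > add_email_column.sql <<'SQL'
ALTER TABLE users ADD COLUMN email varchar(255);
SQL
# connect to production with privileged credentials and run it by hand
psql "postgres://admin:secret@prod-db:5432/app" -f add_email_column.sql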
Projects frequently begin with manual database schema management because it's the easiest option, particularly when databases are small, changes are infrequent, and there are no users. However, as applications evolve and schema migrations grow more complex, this practice becomes a looming risk.
Let's explore the pitfalls of manual migrations, the benefits of automated migrations, and how to get started with Atlas to automate your database schema management.
The Pitfalls of Manual Migrations
A number of risks can arise when relying on manual schema migrations that can impact development speed, application stability, and even security.
- Prone to Human Error. Misplacing a comma, choosing the wrong database, or running an old version of a script can lead to inconsistency, data loss, or significant downtime.
- Time-consuming. Writing and fixing database changes manually is inefficient and uses up time that could be better spent in areas such as feature development or addressing critical bugs.
- Schema Drift. Relying on manual execution makes it harder to ensure that the exact same database changes are applied consistently across your development, staging, and production environments. Discrepancies between these environments can lead to unexpected bugs and destructive integration issues.
- Security Concerns. An often overlooked but critical risk, manual migrations often require developers to have direct access to production databases for routine schema changes. This practice is a significant security vulnerability, increasing the potential for damage.
Enter Atlas
In the world of automated schema migrations, Atlas stands out as a powerful and intelligent tool. It's designed to bring modern schema management and automated migrations to your fingertips, effectively addressing the risks of manual migrations.
Atlas tackles the challenges of manual management through several key features:
- Declarative Schema Definition. Define your database schema as code in a declarative way, not as a set of changes thrown together.
- Automatic Migration Generation. Atlas analyzes the differences between your current database schema and your desired state, automatically generating efficient SQL migration scripts to bridge the gap (a sketch of this appears right after this list).
- Safety Checks and Linting. Before executing any migration, Atlas can analyze the generated SQL for potential issues, providing warnings and errors to prevent unintended consequences. This acts as a crucial safety net, catching problems before they reach your database.
- Schema Consistency. Atlas applies migrations directly to your database and can detect when its schema has drifted from the intended state, alerting you to inconsistencies and potential problems.
- And much more.
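As a quick sketch of that migration-generation workflow (this uses Atlas's versioned flow, which is separate from the declarative example later in this post; the migration name and directory here are just placeholders), a desired schema in schema.sql can be diffed against a migration directory to produce a new migration file:
atlas migrate diff add_owners \
--dir "file://migrations" \
--to "file://schema.sql" \
--dev-url "docker://postgres/17/dev?search_path=public"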
Simply put – Atlas can ease the transition from manual to automatic migrations.
Let's go over a quick example of applying a schema migration automatically with Atlas.
Getting Started with Atlas
Installation
Atlas can be installed with the install script (macOS and Linux), Homebrew, Docker, or manually (Windows and other platforms).
On macOS or Linux, download and install the latest release of the Atlas CLI by running the following in your terminal:
curl -sSf https://atlasgo.sh | sh
Get the latest release with Homebrew:
brew install ariga/tap/atlas
To pull the Atlas image and run it as a Docker container:
docker pull arigaio/atlas
docker run --rm arigaio/atlas --help
If the container needs access to the host network or a local directory, use the --net=host flag and mount the desired directory:
docker run --rm --net=host \
-v $(pwd)/migrations:/migrations \
arigaio/atlas migrate apply \
--url "mysql://root:pass@:3306/test"
For a manual installation (including Windows), download the latest release and move the atlas binary to a location on your system PATH.
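Whichever method you choose, you can confirm the CLI is installed and available on your PATH by checking its version:
atlas version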
Spin up a local database
Atlas uses a temporary, local dev database to automatically plan and validate schema changes. I'll be using PostgreSQL in this example. For other databases, please head to our Databases portal.
Spin up a PostgreSQL container using the following command, adjusting the credentials as you see fit:
docker run --rm -d --name atlas-demo -e POSTGRES_PASSWORD=pass -e POSTGRES_DB=demo -p 5432:5432 postgres
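Optionally, confirm the database is up and accepting connections using the psql client bundled in the official postgres image (give the container a moment to finish starting first):
docker exec atlas-demo psql -U postgres -d demo -c "SELECT 1;"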
Code setup
We'll begin with a simple PostgreSQL schema, saved as schema.sql:
CREATE TABLE "owners" (
"id" integer NOT NULL,
"name" character varying NOT NULL
);
Apply this schema (--to) to our empty database (--url) using the dev database (--dev-url) by running:
atlas schema apply \
--url "postgres://postgres:pass@localhost:5432/demo?sslmode=disable&search_path=public" \
--to "file://schema.sql" \
--dev-url "docker://postgres/17/dev?search_path=public"
Select Approve and apply to automatically apply the migration to the database:
Planning migration statements (1 in total):
-- create "owners" table:
-> CREATE TABLE "owners" (
"id" integer NOT NULL,
"name" character varying(255) NOT NULL
);
-------------------------------------------
Analyzing planned statements (1 in total):
-- no diagnostics found
-------------------------
-- 24.449292ms
-- 1 schema change
-------------------------------------------
? Approve or abort the plan:
▸ Approve and apply
Abort
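If you want to confirm what was actually created, you can read the schema back from the live database with atlas schema inspect (by default it prints the schema in Atlas's HCL format):
atlas schema inspect \
--url "postgres://postgres:pass@localhost:5432/demo?sslmode=disable&search_path=public"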
Now let's change schema.sql to make id the primary key of owners and add a second table with a foreign key:
CREATE TABLE "owners" (
"id" integer NOT NULL,
"name" character varying NOT NULL,
PRIMARY KEY ("id")
);
CREATE TABLE "stores" (
"owner_id" integer NOT NULL,
"name" character varying(255) NOT NULL,
CONSTRAINT "owner_id"
FOREIGN KEY ("owner_id")
REFERENCES "owners" ("id")
);
Run the atlas schema apply command again to generate and apply this migration:
atlas schema apply \
--url "postgres://postgres:pass@localhost:5432/demo?sslmode=disable&search_path=public" \
--to "file://schema.sql" \
--dev-url "docker://postgres/17/dev?search_path=public"
The output shows the SQL statements needed to perform this migration, along with a linting analysis that informs us of potential deployment issues that the primary key constraint can cause:
Planning migration statements (2 in total):
-- modify "owners" table:
-> ALTER TABLE "owners" ADD PRIMARY KEY ("id");
-- create "stores" table:
-> CREATE TABLE "stores" (
"owner_id" integer NOT NULL,
"name" character varying(255) NOT NULL,
CONSTRAINT "owner_id" FOREIGN KEY ("owner_id") REFERENCES "owners" ("id") ON UPDATE NO ACTION ON DELETE NO ACTION
);
-------------------------------------------
Analyzing planned statements (2 in total):
-- concurrent index violations detected:
-- L2: Adding a PRIMARY KEY constraint on table "owners" acquires an ACCESS
EXCLUSIVE lock, blocking both reads and writes during the operation
https://atlasgo.io/lint/analyzers#PG104
-- ok (268.666µs)
-------------------------
-- 48.546875ms
-- 2 schema changes
-- 1 diagnostic
-------------------------------------------
? Approve or abort the plan:
▸ Approve and apply
Abort
To stay on topic, I won't fix the detected violation here; the PG104 analyzer page linked in the output explains the solution.
Upon approving the plan, these two SQL statements are applied to the database and the migration has been performed.
It's as simple as that: make the change, plan the migration, and safely apply.
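If you'd rather review the planned SQL without the interactive prompt, the same command accepts a --dry-run flag that prints the plan without applying anything:
atlas schema apply \
--url "postgres://postgres:pass@localhost:5432/demo?sslmode=disable&search_path=public" \
--to "file://schema.sql" \
--dev-url "docker://postgres/17/dev?search_path=public" \
--dry-run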
Why Automation Matters
Automating infrastructure management leads to safer and faster deployments, and your schema migrations can give you the same peace of mind.
No More Manual SQL Headaches. Rather than crafting and debugging individual SQL statements, define the desired state of your database schema and let Atlas figure out the necessary steps to achieve that state for you. This abstraction eliminates the risk of syntax errors, forgotten dependencies, and application downtime.
Predictable, Repeatable, Reviewable. Applying the same declarative configuration across different environments (development, staging, production) yields consistent results. These declarative schema definitions can also be version-controlled and reviewed just like any other code, making it possible to catch potential issues before they impact your database.
Safer Pipelines and Faster Iteration. Automating schema migrations within your CI/CD pipelines becomes significantly safer with a declarative approach. The risk of manual errors during deployment is drastically reduced, leading to more stable releases.
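As a minimal sketch of what this can look like, assuming the production database URL is injected as a pipeline secret (the DATABASE_URL variable below is hypothetical), a deploy step could run the same declarative command non-interactively:
# run from CI rather than a developer's laptop; DATABASE_URL comes from a pipeline secret
atlas schema apply \
--url "$DATABASE_URL" \
--to "file://schema.sql" \
--dev-url "docker://postgres/17/dev?search_path=public" \
--auto-approve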
Shift Schema Ownership Left. By implementing a declarative approach and appropriate tools, developers gain the ability to manage database schema changes as part of their feature development without needing direct database credentials.
Atlas brings automation to your project, addressing the various limitations of manual migrations:
| Manual Limitation | Atlas' Solution |
|---|---|
| Human error | Takes the current database schema and the desired schema as input and automatically plans a safe migration |
| Inefficiency | Greatly reduces the time and effort needed to plan, review, execute, and roll back changes |
| Schema drift | Declarative migrations continuously reconcile the current and desired state of the database, eliminating drift (see the example below) |
| Security | Running migrations from pipelines eliminates the need for developers to routinely access production directly |
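For instance, drift between a running database and the desired state in schema.sql can be surfaced with atlas schema diff, shown here against the demo database from earlier:
atlas schema diff \
--from "postgres://postgres:pass@localhost:5432/demo?sslmode=disable&search_path=public" \
--to "file://schema.sql" \
--dev-url "docker://postgres/17/dev?search_path=public"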
Try Atlas Today
By embracing automated schema migrations and leveraging tools like Atlas, you can significantly reduce errors, boost developer productivity, ensure environment consistency, enhance security, and ultimately build more reliable and stable applications.
To get started – follow our "Quick Start" instructions, dive into our comprehensive documentation, and don't hesitate to join our Discord community for support and guidance.