Bitbucket Pipes
Atlas integrates with Bitbucket Pipelines, allowing you to manage and apply database migrations directly from your Bitbucket repository. By leveraging Bitbucket Pipelines, you can automate the deployment of migration directories to your target databases, ensuring that your database schema always stays up to date with your application code.
This guide walks you through setting up and using atlas-action with Bitbucket Pipelines so you can deploy migration directories straight from your git repository.
migrate/apply
Run migrations on a target database using migrate apply
Usage
Add a `bitbucket-pipelines.yml` file to your repo with the following contents:
Deploy a directory from the filesystem
```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Applies a migration directory to a target database"
          script:
            - name: "Migrate Apply"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/apply" # Required
                ATLAS_INPUT_URL: ${DATABASE_URL}
                ATLAS_INPUT_DIR: "file://migrations"
            - source .atlas-action/outputs.sh
```
Deploy a directory from the schema registry
```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Applies a migration directory to a target database"
          script:
            - name: "Migrate Apply"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/apply" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                ATLAS_INPUT_URL: ${DATABASE_URL}
                ATLAS_INPUT_DIR: "atlas://my-project"
            - source .atlas-action/outputs.sh
```
Inputs
- `ATLAS_ACTION` - (Required) Always `migrate/apply`.
- `ATLAS_TOKEN` - (Optional) Token used to authenticate with Atlas Cloud.
- `BITBUCKET_ACCESS_TOKEN` - (Optional) Bitbucket access token used to post comments on the PR.
- `ATLAS_INPUT_ALLOW_DIRTY` - (Optional) Allow working on a non-clean database.
- `ATLAS_INPUT_DIR` - (Optional) The URL of the migration directory to apply. For example: `atlas://dir-name` for cloud-based directories or `file://migrations` for local ones.
- `ATLAS_INPUT_DRY_RUN` - (Optional) Print SQL without executing it. Either "true" or "false".
- `ATLAS_INPUT_REVISIONS_SCHEMA` - (Optional) The name of the schema containing the revisions table.
- `ATLAS_INPUT_URL` - (Optional) The URL of the target database. For example: `mysql://root:pass@localhost:3306/dev`.
- `ATLAS_INPUT_WORKING_DIRECTORY` - (Optional) Atlas working directory; defaults to the project root.
- `ATLAS_INPUT_CONFIG` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `ATLAS_INPUT_ENV` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `ATLAS_INPUT_VARS` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
Outputs
The outputs are written into the `.atlas-action/outputs.sh` file, which can be loaded in a subsequent step using the `source` command.
- `ATLAS_OUTPUT_MIGRATE_APPLY_APPLIED_COUNT` - The number of migrations that were applied.
- `ATLAS_OUTPUT_MIGRATE_APPLY_CURRENT` - The current version of the database (before applying migrations).
- `ATLAS_OUTPUT_MIGRATE_APPLY_PENDING_COUNT` - The number of migrations that will be applied.
- `ATLAS_OUTPUT_MIGRATE_APPLY_TARGET` - The target version of the database.
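After sourcing the outputs file, later script lines in the same step can read these variables directly. A minimal sketch (the `echo` line is illustrative only):

```yaml
script:
  - name: "Migrate Apply"
    pipe: docker://arigaio/atlas-action:master
    variables:
      ATLAS_ACTION: "migrate/apply" # Required
      ATLAS_INPUT_URL: ${DATABASE_URL}
      ATLAS_INPUT_DIR: "file://migrations"
  # Load the outputs into the step's shell, then use them.
  - source .atlas-action/outputs.sh
  - echo "Applied $ATLAS_OUTPUT_MIGRATE_APPLY_APPLIED_COUNT migrations; now at version $ATLAS_OUTPUT_MIGRATE_APPLY_TARGET"
```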
migrate/down
Reverts deployed migration files from a target database using migrate down
Inputs
- `ATLAS_ACTION` - (Required) Always `migrate/down`.
- `ATLAS_TOKEN` - (Optional) Token used to authenticate with Atlas Cloud.
- `BITBUCKET_ACCESS_TOKEN` - (Optional) Bitbucket access token used to post comments on the PR.
- `ATLAS_INPUT_AMOUNT` - (Optional) The number of applied migrations to revert. Mutually exclusive with `to-tag` and `to-version`.
- `ATLAS_INPUT_DIR` - (Optional) The URL of the migration directory. For example: `atlas://dir-name` for cloud-based directories or `file://migrations` for local ones.
- `ATLAS_INPUT_REVISIONS_SCHEMA` - (Optional) The name of the schema containing the revisions table.
- `ATLAS_INPUT_TO_TAG` - (Optional) The tag to revert to. Mutually exclusive with `amount` and `to-version`.
- `ATLAS_INPUT_TO_VERSION` - (Optional) The version to revert to. Mutually exclusive with `amount` and `to-tag`.
- `ATLAS_INPUT_URL` - (Optional) The URL of the target database. For example: `mysql://root:pass@localhost:3306/dev`.
- `ATLAS_INPUT_WAIT_INTERVAL` - (Optional) Time in seconds between migrate down attempts.
- `ATLAS_INPUT_WAIT_TIMEOUT` - (Optional) Time after which no further retry attempt is made and the action exits.
- `ATLAS_INPUT_WORKING_DIRECTORY` - (Optional) Atlas working directory; defaults to the project root.
- `ATLAS_INPUT_CONFIG` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `ATLAS_INPUT_ENV` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `ATLAS_INPUT_VARS` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `ATLAS_INPUT_DEV_URL` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The outputs are written into the `.atlas-action/outputs.sh` file, which can be loaded in a subsequent step using the `source` command.
- `ATLAS_OUTPUT_MIGRATE_DOWN_CURRENT` - The current version of the database (before reverting migrations).
- `ATLAS_OUTPUT_MIGRATE_DOWN_PLANNED_COUNT` - The number of migrations that will be reverted.
- `ATLAS_OUTPUT_MIGRATE_DOWN_REVERTED_COUNT` - The number of migrations that were reverted.
- `ATLAS_OUTPUT_MIGRATE_DOWN_TARGET` - The target version of the database.
- `ATLAS_OUTPUT_MIGRATE_DOWN_URL` - If given, the URL for reviewing the revert plan.
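This reference ships no usage example for migrate/down; the following is a minimal sketch assembled from the inputs above (`DATABASE_URL` is a placeholder secret, and reverting a single migration via `ATLAS_INPUT_AMOUNT` is just one of the mutually exclusive targeting options):

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Reverts applied migrations on a target database"
          script:
            - name: "Migrate Down"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/down" # Required
                ATLAS_INPUT_URL: ${DATABASE_URL}
                ATLAS_INPUT_DIR: "file://migrations"
                ATLAS_INPUT_AMOUNT: "1" # Revert the most recently applied migration
            - source .atlas-action/outputs.sh
```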
migrate/lint
Verify migration safety using migration linting.
Usage
Add a `bitbucket-pipelines.yml` file to your repo with the following contents:
MySQL:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "CI for database schema changes with Atlas"
          script:
            - name: "Migrate Lint"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/lint" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev"
            - source .atlas-action/outputs.sh
```
Postgres:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "CI for database schema changes with Atlas"
          script:
            - name: "Migrate Lint"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/lint" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://postgres/15/dev?search_path=public"
            - source .atlas-action/outputs.sh
```
MariaDB:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "CI for database schema changes with Atlas"
          script:
            - name: "Migrate Lint"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/lint" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://maria/latest/schema"
            - source .atlas-action/outputs.sh
```
SQL Server:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "CI for database schema changes with Atlas"
          script:
            - name: "Migrate Lint"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/lint" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://sqlserver/2022-latest?mode=schema"
            - source .atlas-action/outputs.sh
```
ClickHouse:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "CI for database schema changes with Atlas"
          script:
            - name: "Migrate Lint"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/lint" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://clickhouse/23.11/dev"
            - source .atlas-action/outputs.sh
```
SQLite:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "CI for database schema changes with Atlas"
          script:
            - name: "Migrate Lint"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/lint" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "sqlite://db?mode=memory"
            - source .atlas-action/outputs.sh
```
Inputs
- `ATLAS_ACTION` - (Required) Always `migrate/lint`.
- `ATLAS_TOKEN` - (Optional) Token used to authenticate with Atlas Cloud.
- `BITBUCKET_ACCESS_TOKEN` - (Optional) Bitbucket access token used to post comments on the PR.
- `ATLAS_INPUT_DIR` - (Optional) The URL of the migration directory to lint. For example: `file://migrations`. Read more about Atlas URLs.
- `ATLAS_INPUT_DIR_NAME` - The name (slug) of the project in Atlas Cloud.
- `ATLAS_INPUT_TAG` - (Optional) The tag of migrations to use as the base for linting. By default, the `latest` tag is used.
- `ATLAS_INPUT_WORKING_DIRECTORY` - (Optional) Atlas working directory; defaults to the project root.
- `ATLAS_INPUT_CONFIG` - (Optional) The path to the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `ATLAS_INPUT_ENV` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `ATLAS_INPUT_VARS` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `ATLAS_INPUT_DEV_URL` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The outputs are written into the `.atlas-action/outputs.sh` file, which can be loaded in a subsequent step using the `source` command.
- `ATLAS_OUTPUT_MIGRATE_LINT_URL` - The URL of the CI report in Atlas Cloud, containing an ERD visualization and analysis of the schema migrations.
migrate/push
Push the current version of your migration directory to the schema registry
Usage
Add a `bitbucket-pipelines.yml` file to your repo with the following contents:
MySQL:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Push the current version of your migration directory to Atlas Cloud."
          script:
            - name: "Migrate Push"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/push" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev"
            - source .atlas-action/outputs.sh
```
Postgres:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Push the current version of your migration directory to Atlas Cloud."
          script:
            - name: "Migrate Push"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/push" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://postgres/15/dev?search_path=public"
            - source .atlas-action/outputs.sh
```
MariaDB:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Push the current version of your migration directory to Atlas Cloud."
          script:
            - name: "Migrate Push"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/push" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://maria/latest/schema"
            - source .atlas-action/outputs.sh
```
SQL Server:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Push the current version of your migration directory to Atlas Cloud."
          script:
            - name: "Migrate Push"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/push" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://sqlserver/2022-latest?mode=schema"
            - source .atlas-action/outputs.sh
```
ClickHouse:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Push the current version of your migration directory to Atlas Cloud."
          script:
            - name: "Migrate Push"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/push" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://clickhouse/23.11/dev"
            - source .atlas-action/outputs.sh
```
SQLite:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Push the current version of your migration directory to Atlas Cloud."
          script:
            - name: "Migrate Push"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/push" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "sqlite://db?mode=memory"
            - source .atlas-action/outputs.sh
```
Inputs
- `ATLAS_ACTION` - (Required) Always `migrate/push`.
- `ATLAS_TOKEN` - (Optional) Token used to authenticate with Atlas Cloud.
- `BITBUCKET_ACCESS_TOKEN` - (Optional) Bitbucket access token used to post comments on the PR.
- `ATLAS_INPUT_DIR` - (Optional) The URL of the migration directory to push. For example: `file://migrations`. Read more about Atlas URLs.
- `ATLAS_INPUT_DIR_NAME` - (Optional) The name (slug) of the project in Atlas Cloud.
- `ATLAS_INPUT_LATEST` - (Optional) If true, also push to the "latest" tag.
- `ATLAS_INPUT_TAG` - (Optional) The tag to apply to the pushed migration directory. By default, the current git commit hash is used.
- `ATLAS_INPUT_WORKING_DIRECTORY` - (Optional) Atlas working directory; defaults to the project root.
- `ATLAS_INPUT_CONFIG` - (Optional) The path to the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `ATLAS_INPUT_ENV` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `ATLAS_INPUT_VARS` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `ATLAS_INPUT_DEV_URL` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
migrate/test
Run migration testing in CI.
Inputs
- `ATLAS_ACTION` - (Required) Always `migrate/test`.
- `ATLAS_TOKEN` - (Optional) Token used to authenticate with Atlas Cloud.
- `BITBUCKET_ACCESS_TOKEN` - (Optional) Bitbucket access token used to post comments on the PR.
- `ATLAS_INPUT_DIR` - (Optional) The URL of the migration directory to test. For example: `atlas://dir-name` for cloud-based directories or `file://migrations` for local ones.
- `ATLAS_INPUT_REVISIONS_SCHEMA` - (Optional) The name of the schema containing the revisions table.
- `ATLAS_INPUT_RUN` - (Optional) Filter tests to run by regexp. For example, `^test_.*` will only run tests that start with `test_`. By default, all tests are run.
- `ATLAS_INPUT_WORKING_DIRECTORY` - (Optional) Atlas working directory; defaults to the project root.
- `ATLAS_INPUT_CONFIG` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `ATLAS_INPUT_ENV` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `ATLAS_INPUT_VARS` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `ATLAS_INPUT_DEV_URL` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
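No usage example is shipped for migrate/test; the following is a minimal sketch assembled from the inputs above, assuming the migration directory and its tests live in the repository (`docker://mysql/8/dev` mirrors the dev-database URLs used elsewhere on this page):

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Run migration tests with Atlas"
          script:
            - name: "Migrate Test"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/test" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                ATLAS_INPUT_DIR: "file://migrations"
                ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev"
                ATLAS_INPUT_RUN: "^test_.*" # Optional: run only tests starting with test_
```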
monitor/schema
Runs schema monitoring for a target database and pushes the schema to the schema registry.
Inputs
- `ATLAS_ACTION` - (Required) Always `monitor/schema`.
- `ATLAS_TOKEN` - (Optional) Token used to authenticate with Atlas Cloud.
- `BITBUCKET_ACCESS_TOKEN` - (Optional) Bitbucket access token used to post comments on the PR.
- `ATLAS_INPUT_CLOUD_TOKEN` - The token used to connect to Atlas Cloud (should be passed as a secret).
- `ATLAS_INPUT_EXCLUDE` - (Optional) List of patterns to exclude from inspection. See: https://atlasgo.io/declarative/inspect#exclude-schemas
- `ATLAS_INPUT_SCHEMAS` - (Optional) List of database schemas to include (by default, all schemas are included). See: https://atlasgo.io/declarative/inspect#inspect-multiple-schemas
- `ATLAS_INPUT_SLUG` - (Optional) Unique identifier for the database server.
- `ATLAS_INPUT_URL` - URL of the database to sync.
Outputs
The outputs are written into the `.atlas-action/outputs.sh` file, which can be loaded in a subsequent step using the `source` command.
- `ATLAS_OUTPUT_MONITOR_SCHEMA_URL` - URL of the schema of the database inside Atlas Cloud.
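No usage example is shipped for monitor/schema. Since monitoring typically runs periodically, a custom pipeline that can be attached to a Pipelines schedule is a natural fit; a sketch, assuming `ATLAS_CLOUD_TOKEN` and `DATABASE_URL` are configured as secured repository variables:

```yaml
image: atlassian/default-image:3
pipelines:
  custom:
    monitor-schema: # Run manually or attach to a Pipelines schedule
      - step:
          name: "Sync the database schema to Atlas Cloud"
          script:
            - name: "Monitor Schema"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "monitor/schema" # Required
                ATLAS_INPUT_CLOUD_TOKEN: ${ATLAS_CLOUD_TOKEN} # Pass as a secured variable
                ATLAS_INPUT_URL: ${DATABASE_URL}
            - source .atlas-action/outputs.sh
```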
schema/apply
Apply the desired schema to the target database using declarative migrations.
Usage
Add a `bitbucket-pipelines.yml` file to your repo with the following contents:
```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Applies schema changes to a target database"
          script:
            - name: "Schema Apply"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/apply" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN} # Needed only for deploying a schema from Atlas Cloud.
            - source .atlas-action/outputs.sh
```
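The example above assumes the target and desired state come from an Atlas configuration file; both can also be passed explicitly via inputs. A sketch of the `variables` block under that assumption (`schema.hcl` and `DATABASE_URL` are placeholders):

```yaml
variables:
  ATLAS_ACTION: "schema/apply" # Required
  ATLAS_INPUT_URL: ${DATABASE_URL}            # Target database
  ATLAS_INPUT_TO: "file://schema.hcl"         # Desired schema state
  ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev" # Dev-database used for diffing
  ATLAS_INPUT_AUTO_APPROVE: "true"            # Apply without manual approval
```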
Inputs
- `ATLAS_ACTION` - (Required) Always `schema/apply`.
- `ATLAS_TOKEN` - (Optional) Token used to authenticate with Atlas Cloud.
- `BITBUCKET_ACCESS_TOKEN` - (Optional) Bitbucket access token used to post comments on the PR.
- `ATLAS_INPUT_AUTO_APPROVE` - (Optional) Automatically approve and apply changes. Either "true" or "false".
- `ATLAS_INPUT_DRY_RUN` - (Optional) Print SQL without executing it. Either "true" or "false".
- `ATLAS_INPUT_PLAN` - (Optional) The plan to apply. For example, `atlas://<schema>/plans/<id>`.
- `ATLAS_INPUT_SCHEMA` - (Optional) List of database schema(s). For example: `public`.
- `ATLAS_INPUT_TO` - (Optional) URL(s) of the desired schema state.
- `ATLAS_INPUT_URL` - (Optional) The URL of the target database to apply changes to. For example: `mysql://root:pass@localhost:3306/prod`.
- `ATLAS_INPUT_WORKING_DIRECTORY` - (Optional) Atlas working directory; defaults to the project root.
- `ATLAS_INPUT_CONFIG` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `ATLAS_INPUT_ENV` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `ATLAS_INPUT_VARS` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `ATLAS_INPUT_DEV_URL` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The outputs are written into the `.atlas-action/outputs.sh` file, which can be loaded in a subsequent step using the `source` command.
- `ATLAS_OUTPUT_SCHEMA_APPLY_ERROR` - The error message if the action fails.
schema/plan
Plan a declarative migration to move from the current state to the desired state using schema plan
Usage
Add a `bitbucket-pipelines.yml` file to your repo with the following contents:
MySQL:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Plan a declarative migration to move from the current state to the desired state"
          script:
            - name: "Schema Plan"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/plan" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev"
            - source .atlas-action/outputs.sh
```
Postgres:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Plan a declarative migration to move from the current state to the desired state"
          script:
            - name: "Schema Plan"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/plan" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DEV_URL: "docker://postgres/15/dev?search_path=public"
            - source .atlas-action/outputs.sh
```
MariaDB:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Plan a declarative migration to move from the current state to the desired state"
          script:
            - name: "Schema Plan"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/plan" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DEV_URL: "docker://maria/latest/schema"
            - source .atlas-action/outputs.sh
```
SQL Server:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Plan a declarative migration to move from the current state to the desired state"
          script:
            - name: "Schema Plan"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/plan" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DEV_URL: "docker://sqlserver/2022-latest?mode=schema"
            - source .atlas-action/outputs.sh
```
ClickHouse:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Plan a declarative migration to move from the current state to the desired state"
          script:
            - name: "Schema Plan"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/plan" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DEV_URL: "docker://clickhouse/23.11/dev"
            - source .atlas-action/outputs.sh
```
SQLite:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Plan a declarative migration to move from the current state to the desired state"
          script:
            - name: "Schema Plan"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/plan" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DEV_URL: "sqlite://db?mode=memory"
            - source .atlas-action/outputs.sh
```
Inputs
- `ATLAS_ACTION` - (Required) Always `schema/plan`.
- `ATLAS_TOKEN` - (Optional) Token used to authenticate with Atlas Cloud.
- `BITBUCKET_ACCESS_TOKEN` - (Optional) Bitbucket access token used to post comments on the PR.
- `ATLAS_INPUT_FROM` - (Optional) URL(s) of the current schema state.
- `ATLAS_INPUT_NAME` - (Optional) The name of the plan. By default, Atlas will generate a name based on the schema changes.
- `ATLAS_INPUT_SCHEMA` - (Optional) List of database schema(s). For example: `public`.
- `ATLAS_INPUT_SCHEMA_NAME` - (Optional) The name (slug) of the project in Atlas Cloud.
- `ATLAS_INPUT_TO` - (Optional) URL(s) of the desired schema state.
- `ATLAS_INPUT_WORKING_DIRECTORY` - (Optional) Atlas working directory; defaults to the project root.
- `ATLAS_INPUT_CONFIG` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `ATLAS_INPUT_ENV` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `ATLAS_INPUT_VARS` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `ATLAS_INPUT_DEV_URL` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The outputs are written into the `.atlas-action/outputs.sh` file, which can be loaded in a subsequent step using the `source` command.
- `ATLAS_OUTPUT_SCHEMA_PLAN_LINK` - Link to the schema plan on Atlas.
- `ATLAS_OUTPUT_SCHEMA_PLAN_PLAN` - The plan to be applied or generated (e.g. `atlas://<schema>/plans/<id>`).
- `ATLAS_OUTPUT_SCHEMA_PLAN_STATUS` - The status of the plan. For example, `PENDING` or `APPROVED`.
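The `PLAN` output pairs naturally with the `ATLAS_INPUT_PLAN` input of schema/apply. A sketch of chaining the two pipes in one step; whether the sourced variable expands inside a later pipe's `variables` block depends on your Pipelines setup, so treat this as illustrative only:

```yaml
script:
  - name: "Schema Plan"
    pipe: docker://arigaio/atlas-action:master
    variables:
      ATLAS_ACTION: "schema/plan" # Required
      ATLAS_TOKEN: ${ATLAS_TOKEN}
      ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev"
  # Load the plan outputs, then hand the plan URL to schema/apply.
  - source .atlas-action/outputs.sh
  - name: "Schema Apply"
    pipe: docker://arigaio/atlas-action:master
    variables:
      ATLAS_ACTION: "schema/apply" # Required
      ATLAS_TOKEN: ${ATLAS_TOKEN}
      ATLAS_INPUT_PLAN: ${ATLAS_OUTPUT_SCHEMA_PLAN_PLAN}
```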
schema/plan/approve
Approve a migration plan by its URL
Usage
Add a `bitbucket-pipelines.yml` file to your repo with the following contents:
```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Approve a migration plan by its URL"
          script:
            - name: "Schema Plan Approve"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/plan/approve" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                ATLAS_INPUT_ENV: "ci"
            - source .atlas-action/outputs.sh
```
Inputs
- `ATLAS_ACTION` - (Required) Always `schema/plan/approve`.
- `ATLAS_TOKEN` - (Optional) Token used to authenticate with Atlas Cloud.
- `BITBUCKET_ACCESS_TOKEN` - (Optional) Bitbucket access token used to post comments on the PR.
- `ATLAS_INPUT_FROM` - (Optional) URL(s) of the current schema state.
- `ATLAS_INPUT_PLAN` - (Optional) The URL of the plan to be approved. For example, `atlas://<schema>/plans/<id>`. If not provided, Atlas will search the registry for a plan corresponding to the given schema transition and approve it (typically, this plan is created during the PR stage). If multiple plans are found, an error will be thrown.
- `ATLAS_INPUT_SCHEMA_NAME` - (Optional) The name (slug) of the project in Atlas Cloud.
- `ATLAS_INPUT_TO` - (Optional) URL(s) of the desired schema state.
- `ATLAS_INPUT_WORKING_DIRECTORY` - (Optional) Atlas working directory; defaults to the project root.
- `ATLAS_INPUT_CONFIG` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `ATLAS_INPUT_ENV` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `ATLAS_INPUT_VARS` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `ATLAS_INPUT_DEV_URL` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The outputs are written into the `.atlas-action/outputs.sh` file, which can be loaded in a subsequent step using the `source` command.
- `ATLAS_OUTPUT_SCHEMA_PLAN_APPROVE_LINK` - Link to the schema plan on Atlas.
- `ATLAS_OUTPUT_SCHEMA_PLAN_APPROVE_PLAN` - The plan to be applied or generated (e.g. `atlas://<schema>/plans/<id>`).
- `ATLAS_OUTPUT_SCHEMA_PLAN_APPROVE_STATUS` - The status of the plan (e.g. `PENDING`, `APPROVED`).
schema/push
Push a schema version with an optional tag to Atlas
Usage
Add a `bitbucket-pipelines.yml` file to your repo with the following contents:
```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Push a schema version with an optional tag to Atlas"
          script:
            - name: "Schema Push"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/push" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                ATLAS_INPUT_ENV: "ci"
                ATLAS_INPUT_LATEST: "true"
            - source .atlas-action/outputs.sh
```
Inputs
- `ATLAS_ACTION` - (Required) Always `schema/push`.
- `ATLAS_TOKEN` - (Optional) Token used to authenticate with Atlas Cloud.
- `BITBUCKET_ACCESS_TOKEN` - (Optional) Bitbucket access token used to post comments on the PR.
- `ATLAS_INPUT_DESCRIPTION` - (Optional) The description of the schema.
- `ATLAS_INPUT_LATEST` - (Optional) If true, also push to the "latest" tag.
- `ATLAS_INPUT_SCHEMA` - (Optional) List of database schema(s). For example: `public`.
- `ATLAS_INPUT_SCHEMA_NAME` - (Optional) The name (slug) of the schema repository in Atlas Registry. Read more on the Atlas website: https://atlasgo.io/registry.
- `ATLAS_INPUT_TAG` - (Optional) The tag to apply to the pushed schema. By default, the current git commit hash is used.
- `ATLAS_INPUT_URL` - (Optional) Desired schema URL(s) to push. For example: `file://schema.lt.hcl`.
- `ATLAS_INPUT_VERSION` - (Optional) The version of the schema.
- `ATLAS_INPUT_WORKING_DIRECTORY` - (Optional) Atlas working directory; defaults to the project root.
- `ATLAS_INPUT_CONFIG` - (Optional) The path to the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `ATLAS_INPUT_ENV` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `ATLAS_INPUT_VARS` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `ATLAS_INPUT_DEV_URL` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
Outputs
The outputs are written into the `.atlas-action/outputs.sh` file, which can be loaded in a subsequent step using the `source` command.
- `ATLAS_OUTPUT_SCHEMA_PUSH_LINK` - Link to the schema version on Atlas.
- `ATLAS_OUTPUT_SCHEMA_PUSH_SLUG` - The slug of the pushed schema version.
- `ATLAS_OUTPUT_SCHEMA_PUSH_URL` - The URL of the pushed schema version.
schema/test
Run schema tests against the desired schema
Inputs
- `ATLAS_ACTION` - (Required) Always `schema/test`.
- `ATLAS_TOKEN` - (Optional) Token used to authenticate with Atlas Cloud.
- `BITBUCKET_ACCESS_TOKEN` - (Optional) Bitbucket access token used to post comments on the PR.
- `ATLAS_INPUT_RUN` - (Optional) Filter tests to run by regexp. For example, `^test_.*` will only run tests that start with `test_`. By default, all tests are run.
- `ATLAS_INPUT_URL` - (Optional) The desired schema URL(s) to test.
- `ATLAS_INPUT_WORKING_DIRECTORY` - (Optional) Atlas working directory; defaults to the project root.
- `ATLAS_INPUT_CONFIG` - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named `atlas.hcl` in the current directory. For example, `file://config/atlas.hcl`. Learn more about Atlas configuration files.
- `ATLAS_INPUT_ENV` - (Optional) The environment to use from the Atlas configuration file. For example, `dev`.
- `ATLAS_INPUT_VARS` - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, `{"var1": "value1", "var2": "value2"}`.
- `ATLAS_INPUT_DEV_URL` - (Optional) The URL of the dev-database to use for analysis. For example: `mysql://root:pass@localhost:3306/dev`. Read more about dev-databases.
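No usage example is shipped for schema/test; a minimal sketch assembled from the inputs above, assuming the desired schema and its tests are defined in a local `schema.hcl` file (a placeholder path):

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Run schema tests against the desired schema"
          script:
            - name: "Schema Test"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/test" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                ATLAS_INPUT_URL: "file://schema.hcl" # Placeholder: your desired schema
                ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev"
```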