
Bitbucket Pipes

Atlas provides seamless integration with Bitbucket Pipelines, allowing you to manage and apply database migrations directly from your Bitbucket repository. By leveraging Bitbucket Pipelines, you can automate the deployment of migration directories to your target databases, ensuring that your database schema is always up-to-date with your application code.

This guide will walk you through the steps to set up and use Atlas-Action with Bitbucket Pipelines, enabling you to deploy migration directories from your git repository effortlessly.

migrate/apply

Run migrations on a target database using migrate apply.

Usage

Add bitbucket-pipelines.yml to your repo with the following contents:

Deploy a directory from the filesystem

image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Applies a migration directory to a target database"
          script:
            - name: "Migrate Apply"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/apply" # Required
                ATLAS_INPUT_URL: ${DATABASE_URL}
                ATLAS_INPUT_DIR: "file://migrations"
            - source .atlas-action/outputs.sh

Deploy a directory from the schema registry

image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Applies a migration directory to a target database"
          script:
            - name: "Migrate Apply"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/apply" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                ATLAS_INPUT_URL: ${DATABASE_URL}
                ATLAS_INPUT_DIR: "atlas://my-project"
            - source .atlas-action/outputs.sh

Inputs

  • ATLAS_ACTION - (Required) Always migrate/apply.
  • ATLAS_TOKEN - (Optional) Token used to authenticate with Atlas Cloud.
  • BITBUCKET_ACTION_TOKEN - (Optional) Bitbucket access token used to post a comment on the PR.
  • ATLAS_INPUT_ALLOW_DIRTY - (Optional) Allow starting to work on a non-clean database.
  • ATLAS_INPUT_DIR - (Optional) The URL of the migration directory to apply. For example: atlas://dir-name for cloud-based directories or file://migrations for local ones.
  • ATLAS_INPUT_DRY_RUN - (Optional) Print SQL without executing it. Either "true" or "false".
  • ATLAS_INPUT_REVISIONS_SCHEMA - (Optional) The name of the schema containing the revisions table.
  • ATLAS_INPUT_URL - (Optional) The URL of the target database. For example: mysql://root:pass@localhost:3306/dev.
  • ATLAS_INPUT_WORKING_DIRECTORY - (Optional) The Atlas working directory. Defaults to the project root.
  • ATLAS_INPUT_CONFIG - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named atlas.hcl in the current directory. For example, file://config/atlas.hcl. Learn more about Atlas configuration files.
  • ATLAS_INPUT_ENV - (Optional) The environment to use from the Atlas configuration file. For example, dev.
  • ATLAS_INPUT_VARS - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, {"var1": "value1", "var2": "value2"}.

Outputs

The outputs are written to .atlas-action/outputs.sh; load them in the next step using the source command.

  • ATLAS_OUTPUT_MIGRATE_APPLY_APPLIED_COUNT - The number of migrations that were applied.
  • ATLAS_OUTPUT_MIGRATE_APPLY_CURRENT - The current version of the database (before applying migrations).
  • ATLAS_OUTPUT_MIGRATE_APPLY_PENDING_COUNT - The number of migrations that will be applied.
  • ATLAS_OUTPUT_MIGRATE_APPLY_TARGET - The target version of the database.
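After sourcing the outputs file, these variables can be consumed by subsequent script lines in the same step. A minimal sketch (the final echo line is illustrative, and DATABASE_URL is a placeholder for your configured secret):

```yaml
          script:
            - name: "Migrate Apply"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/apply" # Required
                ATLAS_INPUT_URL: ${DATABASE_URL}
                ATLAS_INPUT_DIR: "file://migrations"
            # Load the outputs written by the pipe, then use them
            - source .atlas-action/outputs.sh
            - echo "Applied ${ATLAS_OUTPUT_MIGRATE_APPLY_APPLIED_COUNT} migrations, now at ${ATLAS_OUTPUT_MIGRATE_APPLY_TARGET}"
```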

migrate/down

Revert deployed migration files from a target database using migrate down.
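Usage

This action ships without a bundled example; the following is a minimal sketch modeled on the migrate/apply examples above. Reverting a single migration via ATLAS_INPUT_AMOUNT is an illustrative choice, and DATABASE_URL is a placeholder:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Reverts applied migrations on a target database"
          script:
            - name: "Migrate Down"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/down" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                ATLAS_INPUT_URL: ${DATABASE_URL}
                ATLAS_INPUT_DIR: "file://migrations"
                ATLAS_INPUT_AMOUNT: "1" # Revert only the most recent migration
            - source .atlas-action/outputs.sh
```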

Inputs

  • ATLAS_ACTION - (Required) Always migrate/down.
  • ATLAS_TOKEN - (Optional) Token used to authenticate with Atlas Cloud.
  • BITBUCKET_ACTION_TOKEN - (Optional) Bitbucket access token used to post a comment on the PR.
  • ATLAS_INPUT_AMOUNT - (Optional) The number of applied migrations to revert. Mutually exclusive with to-tag and to-version.
  • ATLAS_INPUT_DIR - (Optional) The URL of the migration directory to apply. For example: atlas://dir-name for cloud-based directories or file://migrations for local ones.
  • ATLAS_INPUT_REVISIONS_SCHEMA - (Optional) The name of the schema containing the revisions table.
  • ATLAS_INPUT_TO_TAG - (Optional) The tag to revert to. Mutually exclusive with amount and to-version.
  • ATLAS_INPUT_TO_VERSION - (Optional) The version to revert to. Mutually exclusive with amount and to-tag.
  • ATLAS_INPUT_URL - (Optional) The URL of the target database. For example: mysql://root:pass@localhost:3306/dev.
  • ATLAS_INPUT_WAIT_INTERVAL - (Optional) Time in seconds between different migrate down attempts.
  • ATLAS_INPUT_WAIT_TIMEOUT - (Optional) Time after which no other retry attempt is made and the action exits.
  • ATLAS_INPUT_WORKING_DIRECTORY - (Optional) The Atlas working directory. Defaults to the project root.
  • ATLAS_INPUT_CONFIG - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named atlas.hcl in the current directory. For example, file://config/atlas.hcl. Learn more about Atlas configuration files.
  • ATLAS_INPUT_ENV - (Optional) The environment to use from the Atlas configuration file. For example, dev.
  • ATLAS_INPUT_VARS - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, {"var1": "value1", "var2": "value2"}.
  • ATLAS_INPUT_DEV_URL - (Optional) The URL of the dev-database to use for analysis. For example: mysql://root:pass@localhost:3306/dev. Read more about dev-databases.

Outputs

The outputs are written to .atlas-action/outputs.sh; load them in the next step using the source command.

  • ATLAS_OUTPUT_MIGRATE_DOWN_CURRENT - The current version of the database (before reverting migrations).
  • ATLAS_OUTPUT_MIGRATE_DOWN_PLANNED_COUNT - The number of migrations that will be reverted.
  • ATLAS_OUTPUT_MIGRATE_DOWN_REVERTED_COUNT - The number of migrations that were reverted.
  • ATLAS_OUTPUT_MIGRATE_DOWN_TARGET - The target version of the database.
  • ATLAS_OUTPUT_MIGRATE_DOWN_URL - If given, the URL for reviewing the revert plan.

migrate/lint

Verify migration safety using migration linting.

Usage

Add bitbucket-pipelines.yml to your repo with the following contents:

image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "CI for database schema changes with Atlas"
          script:
            - name: "Migrate Lint"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/lint" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev"
            - source .atlas-action/outputs.sh

Inputs

  • ATLAS_ACTION - (Required) Always migrate/lint.
  • ATLAS_TOKEN - (Optional) Token used to authenticate with Atlas Cloud.
  • BITBUCKET_ACTION_TOKEN - (Optional) Bitbucket access token used to post a comment on the PR.
  • ATLAS_INPUT_DIR - (Optional) The URL of the migration directory to lint. For example: file://migrations. Read more about Atlas URLs.
  • ATLAS_INPUT_DIR_NAME - The name (slug) of the project in Atlas Cloud.
  • ATLAS_INPUT_TAG - (Optional) The tag of migrations to use as the base for linting. By default, the latest tag is used.
  • ATLAS_INPUT_WORKING_DIRECTORY - (Optional) The Atlas working directory. Defaults to the project root.
  • ATLAS_INPUT_CONFIG - (Optional) The path to the Atlas configuration file. By default, Atlas will look for a file named atlas.hcl in the current directory. For example, file://config/atlas.hcl. Learn more about Atlas configuration files.
  • ATLAS_INPUT_ENV - (Optional) The environment to use from the Atlas configuration file. For example, dev.
  • ATLAS_INPUT_VARS - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, {"var1": "value1", "var2": "value2"}.
  • ATLAS_INPUT_DEV_URL - (Optional) The URL of the dev-database to use for analysis. For example: mysql://root:pass@localhost:3306/dev. Read more about dev-databases.

Outputs

The outputs are written to .atlas-action/outputs.sh; load them in the next step using the source command.

  • ATLAS_OUTPUT_MIGRATE_LINT_URL - The URL of the CI report in Atlas Cloud, containing an ERD visualization and analysis of the schema migrations.

migrate/push

Push the current version of your migration directory to the schema registry.

Usage

Add bitbucket-pipelines.yml to your repo with the following contents:

image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Push the current version of your migration directory to Atlas Cloud."
          script:
            - name: "Migrate Push"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/push" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DIR_NAME: "my-project"
                ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev"
            - source .atlas-action/outputs.sh

Inputs

  • ATLAS_ACTION - (Required) Always migrate/push.
  • ATLAS_TOKEN - (Optional) Token used to authenticate with Atlas Cloud.
  • BITBUCKET_ACTION_TOKEN - (Optional) Bitbucket access token used to post a comment on the PR.
  • ATLAS_INPUT_DIR - (Optional) The URL of the migration directory to push. For example: file://migrations. Read more about Atlas URLs.
  • ATLAS_INPUT_DIR_NAME - (Optional) The name (slug) of the project in Atlas Cloud.
  • ATLAS_INPUT_LATEST - (Optional) If true, also push to the "latest" tag.
  • ATLAS_INPUT_TAG - (Optional) The tag to apply to the pushed migration directory. By default, the current git commit hash is used.
  • ATLAS_INPUT_WORKING_DIRECTORY - (Optional) The Atlas working directory. Defaults to the project root.
  • ATLAS_INPUT_CONFIG - (Optional) The path to the Atlas configuration file. By default, Atlas will look for a file named atlas.hcl in the current directory. For example, file://config/atlas.hcl. Learn more about Atlas configuration files.
  • ATLAS_INPUT_ENV - (Optional) The environment to use from the Atlas configuration file. For example, dev.
  • ATLAS_INPUT_VARS - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, {"var1": "value1", "var2": "value2"}.
  • ATLAS_INPUT_DEV_URL - (Optional) The URL of the dev-database to use for analysis. For example: mysql://root:pass@localhost:3306/dev. Read more about dev-databases.

migrate/test

Run migration testing in CI.
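Usage

This action ships without a bundled example; the following is a minimal sketch modeled on the other examples in this guide. Testing a local migration directory against an ephemeral dev database is an illustrative assumption:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Run migration tests with Atlas"
          script:
            - name: "Migrate Test"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "migrate/test" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                ATLAS_INPUT_DIR: "file://migrations"
                ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev"
```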

Inputs

  • ATLAS_ACTION - (Required) Always migrate/test.
  • ATLAS_TOKEN - (Optional) Token used to authenticate with Atlas Cloud.
  • BITBUCKET_ACTION_TOKEN - (Optional) Bitbucket access token used to post a comment on the PR.
  • ATLAS_INPUT_DIR - (Optional) The URL of the migration directory to apply. For example: atlas://dir-name for cloud-based directories or file://migrations for local ones.
  • ATLAS_INPUT_REVISIONS_SCHEMA - (Optional) The name of the schema containing the revisions table.
  • ATLAS_INPUT_RUN - (Optional) Filter tests to run by regexp. For example, ^test_.* will only run tests that start with test_. Default is to run all tests.
  • ATLAS_INPUT_WORKING_DIRECTORY - (Optional) The Atlas working directory. Defaults to the project root.
  • ATLAS_INPUT_CONFIG - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named atlas.hcl in the current directory. For example, file://config/atlas.hcl. Learn more about Atlas configuration files.
  • ATLAS_INPUT_ENV - (Optional) The environment to use from the Atlas configuration file. For example, dev.
  • ATLAS_INPUT_VARS - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, {"var1": "value1", "var2": "value2"}.
  • ATLAS_INPUT_DEV_URL - (Optional) The URL of the dev-database to use for analysis. For example: mysql://root:pass@localhost:3306/dev. Read more about dev-databases.

monitor/schema

Run schema monitoring for a target database and push the schema to the schema registry.
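Usage

This action ships without a bundled example; the following is a minimal sketch modeled on the other examples in this guide. The slug value is hypothetical, DATABASE_URL is a placeholder, and in practice you would typically run this on a schedule rather than on pushes to master:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Sync the database schema to Atlas Cloud"
          script:
            - name: "Monitor Schema"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "monitor/schema" # Required
                ATLAS_INPUT_CLOUD_TOKEN: ${ATLAS_CLOUD_TOKEN}
                ATLAS_INPUT_URL: ${DATABASE_URL}
                ATLAS_INPUT_SLUG: "my-database" # Hypothetical server identifier
            - source .atlas-action/outputs.sh
```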

Inputs

  • ATLAS_ACTION - (Required) Always monitor/schema.
  • ATLAS_TOKEN - (Optional) Token used to authenticate with Atlas Cloud.
  • BITBUCKET_ACTION_TOKEN - (Optional) Bitbucket access token used to post a comment on the PR.
  • ATLAS_INPUT_CLOUD_TOKEN - The token used to connect to Atlas Cloud (should be passed as a secret).
  • ATLAS_INPUT_EXCLUDE - (Optional) List of patterns to exclude from inspection. See https://atlasgo.io/declarative/inspect#exclude-schemas.
  • ATLAS_INPUT_SCHEMAS - (Optional) List of database schemas to include (all schemas by default). See https://atlasgo.io/declarative/inspect#inspect-multiple-schemas.
  • ATLAS_INPUT_SLUG - (Optional) A unique identifier for the database server.
  • ATLAS_INPUT_URL - URL of the database to sync.

Outputs

The outputs are written to .atlas-action/outputs.sh; load them in the next step using the source command.

  • ATLAS_OUTPUT_MONITOR_SCHEMA_URL - URL of the schema of the database inside Atlas Cloud.

schema/apply

Apply the desired schema to the target database using declarative migrations.

Usage

Add bitbucket-pipelines.yml to your repo with the following contents:

image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Applies schema changes to a target database"
          script:
            - name: "Schema Apply"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/apply" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN} # Needed only for deploying a schema from Atlas Cloud.
            - source .atlas-action/outputs.sh

Inputs

  • ATLAS_ACTION - (Required) Always schema/apply.
  • ATLAS_TOKEN - (Optional) Token used to authenticate with Atlas Cloud.
  • BITBUCKET_ACTION_TOKEN - (Optional) Bitbucket access token used to post a comment on the PR.
  • ATLAS_INPUT_AUTO_APPROVE - (Optional) Automatically approve and apply changes. Either "true" or "false".
  • ATLAS_INPUT_DRY_RUN - (Optional) Print SQL without executing it. Either "true" or "false".
  • ATLAS_INPUT_PLAN - (Optional) The plan to apply. For example, atlas://<schema>/plans/<id>.
  • ATLAS_INPUT_SCHEMA - (Optional) List of database schema(s). For example: public.
  • ATLAS_INPUT_TO - (Optional) URL(s) of the desired schema state.
  • ATLAS_INPUT_URL - (Optional) The URL of the target database to apply changes to. For example: mysql://root:pass@localhost:3306/prod.
  • ATLAS_INPUT_WORKING_DIRECTORY - (Optional) The Atlas working directory. Defaults to the project root.
  • ATLAS_INPUT_CONFIG - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named atlas.hcl in the current directory. For example, file://config/atlas.hcl. Learn more about Atlas configuration files.
  • ATLAS_INPUT_ENV - (Optional) The environment to use from the Atlas configuration file. For example, dev.
  • ATLAS_INPUT_VARS - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, {"var1": "value1", "var2": "value2"}.
  • ATLAS_INPUT_DEV_URL - (Optional) The URL of the dev-database to use for analysis. For example: mysql://root:pass@localhost:3306/dev. Read more about dev-databases.

Outputs

The outputs are written to .atlas-action/outputs.sh; load them in the next step using the source command.

  • ATLAS_OUTPUT_SCHEMA_APPLY_ERROR - The error message if the action fails.

schema/plan

Plan a declarative migration to move from the current state to the desired state using schema plan.

Usage

Add bitbucket-pipelines.yml to your repo with the following contents:

image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Plan a declarative migration to move from the current state to the desired state"
          script:
            - name: "Schema Plan"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/plan" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                BITBUCKET_ACCESS_TOKEN: ${BITBUCKET_ACCESS_TOKEN}
                ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev"
            - source .atlas-action/outputs.sh

Inputs

  • ATLAS_ACTION - (Required) Always schema/plan.
  • ATLAS_TOKEN - (Optional) Token used to authenticate with Atlas Cloud.
  • BITBUCKET_ACTION_TOKEN - (Optional) Bitbucket access token used to post a comment on the PR.
  • ATLAS_INPUT_FROM - (Optional) URL(s) of the current schema state.
  • ATLAS_INPUT_NAME - (Optional) The name of the plan. By default, Atlas will generate a name based on the schema changes.
  • ATLAS_INPUT_SCHEMA - (Optional) List of database schema(s). For example: public.
  • ATLAS_INPUT_SCHEMA_NAME - (Optional) The name (slug) of the project in Atlas Cloud.
  • ATLAS_INPUT_TO - (Optional) URL(s) of the desired schema state.
  • ATLAS_INPUT_WORKING_DIRECTORY - (Optional) The Atlas working directory. Defaults to the project root.
  • ATLAS_INPUT_CONFIG - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named atlas.hcl in the current directory. For example, file://config/atlas.hcl. Learn more about Atlas configuration files.
  • ATLAS_INPUT_ENV - (Optional) The environment to use from the Atlas configuration file. For example, dev.
  • ATLAS_INPUT_VARS - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, {"var1": "value1", "var2": "value2"}.
  • ATLAS_INPUT_DEV_URL - (Optional) The URL of the dev-database to use for analysis. For example: mysql://root:pass@localhost:3306/dev. Read more about dev-databases.

Outputs

The outputs are written to .atlas-action/outputs.sh; load them in the next step using the source command.

  • ATLAS_OUTPUT_SCHEMA_PLAN_LINK - Link to the schema plan on Atlas.
  • ATLAS_OUTPUT_SCHEMA_PLAN_PLAN - The plan to be applied or generated. For example, atlas://<schema>/plans/<id>.
  • ATLAS_OUTPUT_SCHEMA_PLAN_STATUS - The status of the plan. For example, PENDING or APPROVED.

schema/plan/approve

Approve a migration plan by its URL.

Usage

Add bitbucket-pipelines.yml to your repo with the following contents:

image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Approve a migration plan by its URL"
          script:
            - name: "Schema Plan Approve"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/plan/approve" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                ATLAS_INPUT_ENV: "ci"
            - source .atlas-action/outputs.sh

Inputs

  • ATLAS_ACTION - (Required) Always schema/plan/approve.
  • ATLAS_TOKEN - (Optional) Token used to authenticate with Atlas Cloud.
  • BITBUCKET_ACTION_TOKEN - (Optional) Bitbucket access token used to post a comment on the PR.
  • ATLAS_INPUT_FROM - (Optional) URL(s) of the current schema state.
  • ATLAS_INPUT_PLAN - (Optional) The URL of the plan to be approved. For example, atlas://<schema>/plans/<id>. If not provided, Atlas will search the registry for a plan corresponding to the given schema transition and approve it (typically, this plan is created during the PR stage). If multiple plans are found, an error will be thrown.
  • ATLAS_INPUT_SCHEMA_NAME - (Optional) The name (slug) of the project in Atlas Cloud.
  • ATLAS_INPUT_TO - (Optional) URL(s) of the desired schema state.
  • ATLAS_INPUT_WORKING_DIRECTORY - (Optional) The Atlas working directory. Defaults to the project root.
  • ATLAS_INPUT_CONFIG - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named atlas.hcl in the current directory. For example, file://config/atlas.hcl. Learn more about Atlas configuration files.
  • ATLAS_INPUT_ENV - (Optional) The environment to use from the Atlas configuration file. For example, dev.
  • ATLAS_INPUT_VARS - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, {"var1": "value1", "var2": "value2"}.
  • ATLAS_INPUT_DEV_URL - (Optional) The URL of the dev-database to use for analysis. For example: mysql://root:pass@localhost:3306/dev. Read more about dev-databases.

Outputs

The outputs are written to .atlas-action/outputs.sh; load them in the next step using the source command.

  • ATLAS_OUTPUT_SCHEMA_PLAN_APPROVE_LINK - Link to the schema plan on Atlas.
  • ATLAS_OUTPUT_SCHEMA_PLAN_APPROVE_PLAN - The plan to be applied or generated. For example, atlas://<schema>/plans/<id>.
  • ATLAS_OUTPUT_SCHEMA_PLAN_APPROVE_STATUS - The status of the plan. For example, PENDING or APPROVED.

schema/push

Push a schema version with an optional tag to Atlas.

Usage

Add bitbucket-pipelines.yml to your repo with the following contents:

image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Push a schema version with an optional tag to Atlas"
          script:
            - name: "Schema Push"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/push" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                ATLAS_INPUT_ENV: "ci"
                ATLAS_INPUT_LATEST: "true"
            - source .atlas-action/outputs.sh

Inputs

  • ATLAS_ACTION - (Required) Always schema/push.
  • ATLAS_TOKEN - (Optional) Token used to authenticate with Atlas Cloud.
  • BITBUCKET_ACTION_TOKEN - (Optional) Bitbucket access token used to post a comment on the PR.
  • ATLAS_INPUT_DESCRIPTION - (Optional) The description of the schema.
  • ATLAS_INPUT_LATEST - (Optional) If true, also push to the "latest" tag.
  • ATLAS_INPUT_SCHEMA - (Optional) List of database schema(s). For example: public.
  • ATLAS_INPUT_SCHEMA_NAME - (Optional) The name (slug) of the schema repository in Atlas Registry. Read more on the Atlas website: https://atlasgo.io/registry.
  • ATLAS_INPUT_TAG - (Optional) The tag to apply to the pushed schema. By default, the current git commit hash is used.
  • ATLAS_INPUT_URL - (Optional) Desired schema URL(s) to push. For example: file://schema.lt.hcl.
  • ATLAS_INPUT_VERSION - (Optional) The version of the schema.
  • ATLAS_INPUT_WORKING_DIRECTORY - (Optional) The Atlas working directory. Defaults to the project root.
  • ATLAS_INPUT_CONFIG - (Optional) The path to the Atlas configuration file. By default, Atlas will look for a file named atlas.hcl in the current directory. For example, file://config/atlas.hcl. Learn more about Atlas configuration files.
  • ATLAS_INPUT_ENV - (Optional) The environment to use from the Atlas configuration file. For example, dev.
  • ATLAS_INPUT_VARS - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, {"var1": "value1", "var2": "value2"}.
  • ATLAS_INPUT_DEV_URL - (Optional) The URL of the dev-database to use for analysis. For example: mysql://root:pass@localhost:3306/dev. Read more about dev-databases.

Outputs

The outputs are written to .atlas-action/outputs.sh; load them in the next step using the source command.

  • ATLAS_OUTPUT_SCHEMA_PUSH_LINK - Link to the schema version on Atlas.
  • ATLAS_OUTPUT_SCHEMA_PUSH_SLUG - The slug of the pushed schema version.
  • ATLAS_OUTPUT_SCHEMA_PUSH_URL - The URL of the pushed schema version.

schema/test

Run schema tests against the desired schema.
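Usage

This action ships without a bundled example; the following is a minimal sketch modeled on the other examples in this guide. The schema file name (schema.hcl) and the dev database are illustrative assumptions:

```yaml
image: atlassian/default-image:3
pipelines:
  branches:
    master:
      - step:
          name: "Run schema tests with Atlas"
          script:
            - name: "Schema Test"
              pipe: docker://arigaio/atlas-action:master
              variables:
                ATLAS_ACTION: "schema/test" # Required
                ATLAS_TOKEN: ${ATLAS_TOKEN}
                ATLAS_INPUT_URL: "file://schema.hcl" # Hypothetical schema file
                ATLAS_INPUT_DEV_URL: "docker://mysql/8/dev"
```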

Inputs

  • ATLAS_ACTION - (Required) Always schema/test.
  • ATLAS_TOKEN - (Optional) Token used to authenticate with Atlas Cloud.
  • BITBUCKET_ACTION_TOKEN - (Optional) Bitbucket access token used to post a comment on the PR.
  • ATLAS_INPUT_RUN - (Optional) Filter tests to run by regexp. For example, ^test_.* will only run tests that start with test_. Default is to run all tests.
  • ATLAS_INPUT_URL - (Optional) The desired schema URL(s) to test.
  • ATLAS_INPUT_WORKING_DIRECTORY - (Optional) The Atlas working directory. Defaults to the project root.
  • ATLAS_INPUT_CONFIG - (Optional) The URL of the Atlas configuration file. By default, Atlas will look for a file named atlas.hcl in the current directory. For example, file://config/atlas.hcl. Learn more about Atlas configuration files.
  • ATLAS_INPUT_ENV - (Optional) The environment to use from the Atlas configuration file. For example, dev.
  • ATLAS_INPUT_VARS - (Optional) A JSON object containing variables to be used in the Atlas configuration file. For example, {"var1": "value1", "var2": "value2"}.
  • ATLAS_INPUT_DEV_URL - (Optional) The URL of the dev-database to use for analysis. For example: mysql://root:pass@localhost:3306/dev. Read more about dev-databases.