
PostgreSQL: User Mapping Support

Atlas now supports PostgreSQL user mappings for foreign servers.

A user mapping defines per-user credentials for a foreign server, enabling secure access to remote databases through Foreign Data Wrappers (FDW).

Defining User Mappings

The user_mapping block maps a local user to credentials on a foreign server. Options like user and password are passed to the FDW driver:

extension "postgres_fdw" {
  schema = schema.public
}

server "s1" {
  fdw = extension.postgres_fdw
  options = {
    dbname = "remote_db"
    host   = "localhost"
    port   = "5432"
  }
}

user_mapping {
  server = server.s1
  user   = "postgres"
  options = {
    user     = "remote_user"
    password = var.remote_password # passwords should come from variables/secret stores
  }
}
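Once applied, the mapping can be inspected from psql through PostgreSQL's pg_user_mappings system view. A sketch (note that umoptions is shown only to superusers and to the mapped user itself; other users see NULL):

```sql
-- List user mappings defined on server "s1"
SELECT srvname, usename, umoptions
FROM pg_user_mappings
WHERE srvname = 's1';
```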

PUBLIC and Role References

User mappings support keyword users like PUBLIC (applies to all users) as well as references to defined roles:

user_mapping {
  server = server.s1
  user   = PUBLIC
}

role "dba" {
}

user_mapping {
  server = server.s1
  user   = role.dba
  options = {
    user = "remote_admin"
  }
}
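For reference, PostgreSQL's CREATE USER MAPPING syntax accepts the PUBLIC keyword and role names directly, so the blocks above correspond to statements along these lines (illustrative; the role must exist before the mapping that references it):

```sql
CREATE ROLE "dba";
CREATE USER MAPPING FOR PUBLIC SERVER "s1";
CREATE USER MAPPING FOR "dba" SERVER "s1" OPTIONS (user 'remote_admin');
```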

Generated SQL

Atlas generates the appropriate CREATE USER MAPPING statements and handles the full lifecycle, including ALTER and DROP:

-- Create extension "postgres_fdw"
CREATE EXTENSION "postgres_fdw" WITH SCHEMA "public";
-- Create server "s1"
CREATE SERVER "s1" FOREIGN DATA WRAPPER "postgres_fdw" OPTIONS (dbname 'remote_db', host 'localhost', port '5432');
-- Create user mapping for "postgres" server "s1"
CREATE USER MAPPING FOR "postgres" SERVER "s1" OPTIONS (user 'remote_user');
-- Create user mapping for "PUBLIC" server "s1"
CREATE USER MAPPING FOR PUBLIC SERVER "s1";
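The ALTER and DROP sides of the lifecycle follow standard PostgreSQL syntax. As a sketch (not captured Atlas output), changing or removing the mappings above would produce statements of this shape:

```sql
-- Change an existing option with the SET action
-- (ADD and DROP actions are also available):
ALTER USER MAPPING FOR "postgres" SERVER "s1" OPTIONS (SET user 'other_user');
-- Remove a mapping:
DROP USER MAPPING FOR PUBLIC SERVER "s1";
```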

Password Handling

By default, passwords are omitted from generated migration files to prevent leaking credentials into version control. Automatic password management is supported only by the declarative workflow (using atlas schema apply), and only if explicitly enabled. To include passwords in migration planning, set sensitive = ALLOW in your project configuration:

env "local" {
  schema {
    src = "file://schema.pg.hcl"
    mode {
      roles     = true
      sensitive = ALLOW
    }
  }
}

Instead of hardcoding passwords in schema files, use input variables and inject them from secret stores (AWS Secrets Manager, GCP Secret Manager, HashiCorp Vault) or environment variables:

variable "remote_password" {
  type = string
}

user_mapping {
  server = server.s1
  user   = "postgres"
  options = {
    user     = "remote_user"
    password = var.remote_password
  }
}