
Multi-Environment Secret Management with HashiCorp Vault


Need to manage secrets safely across multiple environments? Here’s how with HashiCorp Vault.

Storing secrets in .env files, hardcoding them, or even using separate secret managers per environment creates security risks and operational chaos. Vault provides a unified, audited solution for secret generation, rotation, and access control across all your environments.

The Problem

Multi-Environment Secret Chaos

When managing dev, staging, and production environments, secrets management becomes a headache:

  • Different secrets per environment with no central tracking
  • Manual rotation processes prone to human error
  • No audit trail for who accessed which secrets and when
  • Secrets leaking through Git history or logs
  • Difficulty revoking compromised credentials instantly

The Solution

HashiCorp Vault Overview

Vault is a secrets management platform that provides:

  • Dynamic secrets: Generate credentials on-demand that auto-expire
  • Secret rotation: Automatically rotate credentials without downtime
  • Audit logging: Complete trail of all secret access and operations
  • Environment isolation: Separate secrets per environment with shared policies
  • Multi-auth methods: Support for AppRole, Kubernetes, IAM, JWT, and more

TL;DR

  • Use HashiCorp Vault as a centralized secret backend
  • Generate dynamic credentials that auto-expire
  • Implement automatic rotation for long-lived secrets
  • Audit all secret access across environments

Installation and Setup

Start Vault with Docker

compose.yml
services:
  vault:
    image: hashicorp/vault:latest
    container_name: vault
    ports:
      - "8200:8200"
    environment:
      VAULT_DEV_ROOT_TOKEN_ID: "root"
      VAULT_DEV_LISTEN_ADDRESS: "0.0.0.0:8200"
    cap_add:
      - IPC_LOCK
    volumes:
      - vault-data:/vault/data
    command: server -dev
volumes:
  vault-data:

Start the Vault server:

Terminal window
docker-compose up -d
export VAULT_ADDR='http://localhost:8200'
export VAULT_TOKEN='root'

CLI Examples

Basic Secret Storage

Terminal window
# Store a static secret
vault kv put secret/dev/database \
  username="dbuser" \
  password="supersecret" \
  host="db.dev.internal"

# Retrieve secret
vault kv get secret/dev/database

# Get specific field
vault kv get -field=password secret/dev/database

Organize Secrets by Environment

Terminal window
# Development secrets
vault kv put secret/dev/app-api \
  api_key="dev-key-12345" \
  api_secret="dev-secret-67890"

# Staging secrets
vault kv put secret/staging/app-api \
  api_key="staging-key-abcde" \
  api_secret="staging-secret-fghij"

# Production secrets
vault kv put secret/prod/app-api \
  api_key="prod-key-xyz123" \
  api_secret="prod-secret-abc456"
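Promoting a secret from one environment to another is just a read followed by a write against the KV v2 HTTP API. A minimal sketch using only the Python standard library (the `kv2_url` and `sync_secret` helpers are illustrative, not part of any Vault client):

```python
import json
import os
import urllib.request

VAULT_ADDR = os.getenv("VAULT_ADDR", "http://localhost:8200")

def kv2_url(mount: str, env: str, name: str) -> str:
    """KV v2 API paths insert /data/ between the mount and the secret path."""
    return f"{VAULT_ADDR}/v1/{mount}/data/{env}/{name}"

def vault_request(url: str, token: str, payload=None) -> dict:
    """GET when payload is None, otherwise POST the JSON body."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(url, data=data, headers={"X-Vault-Token": token})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def sync_secret(token: str, name: str, src_env: str, dst_env: str) -> None:
    """Copy one secret between environments, e.g. promote staging -> prod."""
    current = vault_request(kv2_url("secret", src_env, name), token)["data"]["data"]
    vault_request(kv2_url("secret", dst_env, name), token, {"data": current})

# sync_secret(os.environ["VAULT_TOKEN"], "app-api", "staging", "prod")
```

Because the copy happens through Vault, the write is audited like any other secret change.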

Programming Examples

Python: Fetch Secrets

fetch_vault_secrets.py
import os
from typing import Any, Dict, Optional

import hvac

class VaultClient:
    def __init__(self, vault_addr: str = "http://localhost:8200", token: Optional[str] = None):
        self.client = hvac.Client(url=vault_addr, token=token or os.getenv("VAULT_TOKEN"))

    def get_secret(self, path: str, field: Optional[str] = None) -> Any:
        """Fetch a secret from the KV v2 engine (path is relative to the mount)."""
        response = self.client.secrets.kv.v2.read_secret_version(path=path)
        data = response["data"]["data"]
        if field:
            return data.get(field)
        return data

    def get_database_creds(self, environment: str) -> Dict[str, str]:
        """Get database credentials for a specific environment."""
        return self.get_secret(f"{environment}/database")

    def get_api_keys(self, environment: str) -> Dict[str, str]:
        """Get API keys for a specific environment."""
        return self.get_secret(f"{environment}/app-api")

# Usage
if __name__ == "__main__":
    vault = VaultClient()

    # Get dev database credentials
    db_creds = vault.get_database_creds("dev")
    print(f"Database: {db_creds['host']}")
    print(f"User: {db_creds['username']}")

    # Get a specific field (hvac KV v2 paths omit the secret/ mount prefix)
    api_key = vault.get_secret("prod/app-api", field="api_key")
    print(f"API Key: {api_key}")

Node.js: Fetch Secrets

vaultClient.ts
import VaultClient = require('node-vault');

class SecretManager {
  private vault: ReturnType<typeof VaultClient>;

  constructor(address: string = 'http://localhost:8200', token?: string) {
    this.vault = VaultClient({
      apiVersion: 'v1',
      endpoint: address,
      token: token || process.env.VAULT_TOKEN,
    });
  }

  async getSecret(path: string, field?: string): Promise<any> {
    try {
      const response = await this.vault.read(path);
      // KV v2 nests the payload under data.data
      const data = response.data.data;
      return field ? data[field] : data;
    } catch (error) {
      console.error(`Failed to retrieve secret from ${path}:`, error);
      throw error;
    }
  }

  async getDatabaseConfig(environment: string): Promise<any> {
    // KV v2 API paths insert /data/ after the mount point
    return this.getSecret(`secret/data/${environment}/database`);
  }

  async getApiCredentials(environment: string): Promise<any> {
    return this.getSecret(`secret/data/${environment}/app-api`);
  }
}

// Usage
(async () => {
  const manager = new SecretManager('http://localhost:8200', 'root');

  const dbConfig = await manager.getDatabaseConfig('dev');
  console.log(`Connecting to: ${dbConfig.host}`);

  const apiCreds = await manager.getApiCredentials('prod');
  console.log(`API Key: ${apiCreds.api_key}`);
})();

Dynamic Database Credentials

Configure Database Secret Engine

Terminal window
# Enable database secrets engine
vault secrets enable database

# Configure PostgreSQL connection (Vault substitutes the templated
# {{username}}/{{password}} from the username and password fields)
vault write database/config/postgresql \
  plugin_name=postgresql-database-plugin \
  allowed_roles="readonly,readwrite" \
  connection_url="postgresql://{{username}}:{{password}}@db.prod.internal:5432/appdb" \
  username="vault_admin" \
  password="vault_admin_password"

# Create readonly role
vault write database/roles/readonly \
  db_name=postgresql \
  creation_statements="CREATE ROLE \"{{name}}\" WITH LOGIN PASSWORD '{{password}}' VALID UNTIL '{{expiration}}'; GRANT SELECT ON ALL TABLES IN SCHEMA public TO \"{{name}}\";" \
  default_ttl="1h" \
  max_ttl="24h"

# Create readwrite role
vault write database/roles/readwrite \
  db_name=postgresql \
  creation_statements="CREATE ROLE \"{{name}}\" WITH LOGIN PASSWORD '{{password}}' VALID UNTIL '{{expiration}}'; GRANT SELECT, INSERT, UPDATE ON ALL TABLES IN SCHEMA public TO \"{{name}}\";" \
  default_ttl="6h" \
  max_ttl="24h"

Retrieve Dynamic Credentials

Terminal window
# Get temporary readonly credentials
vault read database/creds/readonly

# Output:
# Key               Value
# ---               -----
# lease_duration    1h
# lease_id          database/creds/readonly/...
# password          a-temporary-password
# username          v-token-readonly-xxxxxxxx

# Get temporary readwrite credentials
vault read database/creds/readwrite

Python: Use Dynamic Credentials

dynamic_db_access.py
import contextlib
import os
from typing import Generator

import hvac
import psycopg2

class DynamicDatabaseAccess:
    def __init__(self, vault_addr: str, vault_token: str):
        self.vault = hvac.Client(url=vault_addr, token=vault_token)
        self.db_host = os.getenv('DB_HOST', 'db.prod.internal')
        self.db_port = os.getenv('DB_PORT', '5432')
        self.db_name = os.getenv('DB_NAME', 'appdb')

    def get_dynamic_credentials(self, role: str) -> dict:
        """Fetch temporary database credentials from Vault."""
        response = self.vault.secrets.database.generate_credentials(name=role)
        return response['data']

    @contextlib.contextmanager
    def get_connection(self, role: str = 'readonly') -> Generator:
        """Get a database connection using short-lived, Vault-issued credentials."""
        creds = self.get_dynamic_credentials(role)
        conn = psycopg2.connect(
            host=self.db_host,
            port=self.db_port,
            database=self.db_name,
            user=creds['username'],
            password=creds['password'],
        )
        try:
            yield conn
        finally:
            conn.close()

# Usage
db_access = DynamicDatabaseAccess('http://localhost:8200', 'root')
with db_access.get_connection(role='readonly') as conn:
    cursor = conn.cursor()
    cursor.execute("SELECT COUNT(*) FROM users")
    print(f"Total users: {cursor.fetchone()[0]}")
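Dynamic credentials expire with their lease, so long-running workers need to renew the lease before the TTL runs out. A hedged sketch against the `sys/leases/renew` endpoint, using only the standard library (the helper names are illustrative):

```python
import json
import os
import urllib.request

VAULT_ADDR = os.getenv("VAULT_ADDR", "http://localhost:8200")

def renew_payload(lease_id: str, increment_seconds: int) -> dict:
    """Body for POST /v1/sys/leases/renew."""
    return {"lease_id": lease_id, "increment": increment_seconds}

def renew_lease(token: str, lease_id: str, increment_seconds: int = 3600) -> int:
    """Renew a dynamic-credential lease; returns the granted lease duration in seconds."""
    body = json.dumps(renew_payload(lease_id, increment_seconds)).encode()
    req = urllib.request.Request(
        f"{VAULT_ADDR}/v1/sys/leases/renew",
        data=body,
        headers={"X-Vault-Token": token},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["lease_duration"]

# renew_lease(os.environ["VAULT_TOKEN"], "database/creds/readonly/abc123")
```

Renewal is capped by the role's `max_ttl`; past that point the application must fetch fresh credentials instead.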

Policy-Based Access Control

Create Policies for Different Roles

policies/dev-team.hcl
# Read-only access to dev/staging secrets
path "secret/data/dev/*" {
  capabilities = ["read", "list"]
}

path "secret/data/staging/*" {
  capabilities = ["read", "list"]
}

# Deny access to production
path "secret/data/prod/*" {
  capabilities = ["deny"]
}

# Allow token self-renewal
path "auth/token/renew-self" {
  capabilities = ["update"]
}

policies/prod-team.hcl
# Full access to production secrets
path "secret/data/prod/*" {
  capabilities = ["create", "read", "update", "delete", "list"]
}

# Read-only access to staging
path "secret/data/staging/*" {
  capabilities = ["read", "list"]
}

# Dynamic database credentials (the readonly/readwrite roles defined earlier)
path "database/creds/*" {
  capabilities = ["read"]
}

# Audit logging
path "sys/audit" {
  capabilities = ["read"]
}

Apply Policies

Terminal window
# Enable the userpass auth method (once)
vault auth enable userpass

# Create policies
vault policy write dev-team policies/dev-team.hcl
vault policy write prod-team policies/prod-team.hcl

# Assign to users/apps
vault write auth/userpass/users/jane password="changeme" policies="prod-team"
vault write auth/userpass/users/alice password="changeme" policies="dev-team"
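With the policies attached, it's worth verifying that the boundaries actually hold. This stdlib-only sketch logs in through userpass and probes whether a path is readable (the usernames, password, and helper names are placeholders; it assumes the userpass auth method is enabled):

```python
import json
import os
import urllib.error
import urllib.request

VAULT_ADDR = os.getenv("VAULT_ADDR", "http://localhost:8200")

def userpass_login_url(username: str) -> str:
    """Login endpoint for the userpass auth method."""
    return f"{VAULT_ADDR}/v1/auth/userpass/login/{username}"

def login(username: str, password: str) -> str:
    """Exchange a username/password for a Vault client token."""
    body = json.dumps({"password": password}).encode()
    req = urllib.request.Request(userpass_login_url(username), data=body)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["auth"]["client_token"]

def can_read(token: str, api_path: str) -> bool:
    """True if the token may read the path; False when policy denies it (403)."""
    req = urllib.request.Request(f"{VAULT_ADDR}/v1/{api_path}", headers={"X-Vault-Token": token})
    try:
        urllib.request.urlopen(req)
        return True
    except urllib.error.HTTPError as err:
        if err.code == 403:
            return False
        raise

# token = login("alice", "changeme")
# can_read(token, "secret/data/dev/app-api")    # dev-team policy: allowed
# can_read(token, "secret/data/prod/app-api")   # dev-team policy: denied
```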

Kubernetes Integration

Enable Kubernetes Auth

Terminal window
# Enable Kubernetes authentication
vault auth enable kubernetes

# Configure Kubernetes auth
vault write auth/kubernetes/config \
  token_reviewer_jwt=@/var/run/secrets/kubernetes.io/serviceaccount/token \
  kubernetes_host="https://$KUBERNETES_SERVICE_HOST:$KUBERNETES_SERVICE_PORT" \
  kubernetes_ca_cert=@/var/run/secrets/kubernetes.io/serviceaccount/ca.crt

# Create role for pods
vault write auth/kubernetes/role/app-role \
  bound_service_account_names=app \
  bound_service_account_namespaces=default \
  policies=default,app-secrets \
  ttl=24h

Pod Configuration

kubernetes-deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: app
spec:
  replicas: 2
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      serviceAccountName: app
      containers:
        - name: myapp
          image: myapp:latest
          env:
            - name: VAULT_ADDR
              value: 'http://vault.vault.svc.cluster.local:8200'
            - name: VAULT_SKIP_VERIFY
              value: 'false'
          volumeMounts:
            - name: vault-token
              mountPath: /vault/secrets
      volumes:
        - name: vault-token
          projected:
            sources:
              - serviceAccountToken:
                  audience: vault
                  expirationSeconds: 3600
                  path: token
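Inside the pod, the application exchanges the projected service account JWT for a Vault token via `auth/kubernetes/login`. A minimal sketch using only the standard library (the role name and token path match the examples above; treat this as a starting point, not a hardened client):

```python
import json
import os
import urllib.request

VAULT_ADDR = os.getenv("VAULT_ADDR", "http://vault.vault.svc.cluster.local:8200")
JWT_PATH = "/vault/secrets/token"  # where the projected token is mounted

def k8s_login_payload(role: str, jwt: str) -> dict:
    """Body for POST /v1/auth/kubernetes/login."""
    return {"role": role, "jwt": jwt}

def kubernetes_login(role: str, jwt_path: str = JWT_PATH) -> str:
    """Trade the service account JWT for a Vault client token."""
    with open(jwt_path) as fh:
        jwt = fh.read().strip()
    body = json.dumps(k8s_login_payload(role, jwt)).encode()
    req = urllib.request.Request(f"{VAULT_ADDR}/v1/auth/kubernetes/login", data=body)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["auth"]["client_token"]

# vault_token = kubernetes_login("app-role")
```

Because the projected token expires after an hour, re-run the login (or renew the Vault token) rather than caching it forever.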

Audit Logging

Enable Audit Logging

Terminal window
# Enable file audit backend
vault audit enable file file_path=/vault/logs/audit.log
# Enable syslog audit backend
vault audit enable syslog tag="vault"
# View audit logs
tail -f /vault/logs/audit.log | jq '.'

Query Audit Logs

Terminal window
# Show all secret reads
cat /vault/logs/audit.log | jq 'select(.type == "request" and .request.operation == "read")'

# Show secret modifications
cat /vault/logs/audit.log | jq 'select(.type == "request" and .request.operation == "write")'

# Show failed requests (including auth failures)
cat /vault/logs/audit.log | jq 'select(.type == "response" and .error != null)'

Best Practices

1. Use AppRole for Applications

Terminal window
# Enable AppRole auth
vault auth enable approle

# Generate application role
vault write auth/approle/role/my-app \
  token_ttl=1h \
  token_max_ttl=4h \
  policies="app-secrets"

# Get role ID
vault read auth/approle/role/my-app/role-id

# Generate secret ID
vault write -f auth/approle/role/my-app/secret-id
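At startup the application presents its role ID and secret ID to `auth/approle/login` and receives a short-lived token. A hedged, stdlib-only sketch (the environment variable names are placeholders for however you deliver the two IDs):

```python
import json
import os
import urllib.request

VAULT_ADDR = os.getenv("VAULT_ADDR", "http://localhost:8200")

def approle_payload(role_id: str, secret_id: str) -> dict:
    """Body for POST /v1/auth/approle/login."""
    return {"role_id": role_id, "secret_id": secret_id}

def approle_login(role_id: str, secret_id: str) -> str:
    """Authenticate via AppRole and return the issued client token."""
    body = json.dumps(approle_payload(role_id, secret_id)).encode()
    req = urllib.request.Request(f"{VAULT_ADDR}/v1/auth/approle/login", data=body)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["auth"]["client_token"]

# token = approle_login(os.environ["APP_ROLE_ID"], os.environ["APP_SECRET_ID"])
```

Deliver the role ID and secret ID through separate channels (e.g. baked image vs. orchestrator) so neither alone grants access.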

2. Rotate Secrets Regularly

Terminal window
# Create a static role whose password Vault rotates every 30 days
vault write database/static-roles/app-user \
  db_name=postgresql \
  username="app_user" \
  rotation_statements="ALTER ROLE \"{{name}}\" WITH PASSWORD '{{password}}';" \
  rotation_period=720h

# Fetch the current (automatically rotated) password
vault read database/static-creds/app-user

3. Implement Least Privilege

# Only grant necessary capabilities
path "secret/data/myapp/*" {
  capabilities = ["read"]  # not "update" or "delete"
}
