Storing default environment variables in Vault instead of env files in docker-compose for standard services
I have a docker-compose stack which uses standard software containers like:
- InfluxDB
- MariaDB
- Node-Red
running on an industrial single-board computer (which may not be connected to the internet).
For the initial setup (bringing the stack up), I pass some standard credentials, such as admin credentials, via their environment variable files, e.g. influxdb.env, mariadb.env, etc.
A typical example of a docker-compose.yml here is:
```yaml
services:
  influxdb:
    image: influxdb:2.0
    env_file:
      - influxdb.env
  nodered:
    image: nodered/node-red:2.2.2
    env_file:
      - node-red.env
```
An example of influxdb.env could be:
```
INFLUXDB_ADMIN_USER=admin
INFLUXDB_ADMIN_PASSWORD=password!#$2
# other env vars that might be crucial for initial stack boot-up
```
These files live on disk and are therefore vulnerable. I wish to understand whether HashiCorp Vault can provide a plausible solution where such credentials (secrets) are stored as key-value pairs and made available to the docker-compose services at runtime.
I understand one bottleneck: since I am using standard, ready-to-use containers, they may not have Vault integration. However, can I still use Vault to store the env vars and let the services access them at runtime? Or do I have to write sidecars for these containers and then let them accept these env var values?
Solution 1:[1]
You have a few constraints to work with here:
- Not storing secrets permanently on disk
- The `docker-compose` command line
- Vault's output format
Docker Compose can read its environment variables from a file. I suggest that you create that file and provide it to docker-compose with the `--env-file` parameter.
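For reference, `--env-file` expects plain `NAME=VALUE` lines. A minimal sketch of the flow, with the `vault` CLI mocked so it runs without a Vault server (the secret path `secret/influxdb` and field name are made-up examples; in production, delete the mock and log in to Vault first):

```shell
# Mock of the vault CLI so this sketch runs offline; the real call would be
# `vault kv get -field=admin-password secret/influxdb`.
vault() { echo "s3cret-from-vault"; }

# Write one NAME=VALUE line per secret into the env file:
echo "INFLUXDB_ADMIN_PASSWORD=$(vault kv get -field=admin-password secret/influxdb)" > runtime.env

cat runtime.env
# then hand the file to compose:
# docker-compose --env-file runtime.env up -d
```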
I can think of two approaches to write that file:
- Write the output of multiple `vault kv get` calls to a file, in `NAME=VALUE` format
- Use Vault Agent's template engine
The first option is quite straightforward. Call a function that outputs the secrets and redirect its output to a file:
```bash
#!/bin/bash

# Print one NAME=VALUE line for a secret fetched from Vault.
function write_vault_secret_to_env_file() {
  local ENVIRONMENT_VARIABLE_NAME=$1
  local SECRET_PATH=$2
  local SECRET_NAME=$3
  echo "$ENVIRONMENT_VARIABLE_NAME=$(vault kv get -field="$SECRET_NAME" "$SECRET_PATH")"
}

write_vault_secret_to_env_file FIRST_ENVIRONMENT_VAR secret/my-path/things first-secret >> my-env-file.sh
write_vault_secret_to_env_file SECOND_ENVIRONMENT_VAR secret/my-path/stuff second-secret >> my-env-file.sh
```
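One caveat worth checking against the Compose documentation: as I understand it, `--env-file` feeds variable *substitution* in docker-compose.yml rather than injecting variables into the containers directly, so the compose file still has to reference them, e.g. (variable name taken from the script above):

```yaml
services:
  influxdb:
    image: influxdb:2.0
    environment:
      # substituted from the --env-file at docker-compose time
      - INFLUXDB_ADMIN_USER=${FIRST_ENVIRONMENT_VAR}
```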
Vault Agent's template engine is much more powerful, but it is more complex to set up.
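A minimal sketch of a Vault Agent `template` stanza for the second option (the secret path, field names, and destination are assumptions, not from the question; this presumes a KV v2 mount):

```hcl
# Renders an env file for docker-compose from a KV v2 secret.
template {
  destination = "/opt/stack/runtime.env"
  contents    = <<-EOT
    {{ with secret "secret/data/influxdb" -}}
    INFLUXDB_ADMIN_USER={{ .Data.data.admin_user }}
    INFLUXDB_ADMIN_PASSWORD={{ .Data.data.admin_password }}
    {{- end }}
  EOT
}
```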
Another suggestion would be to use Vault's dynamic secrets for databases (InfluxDB is supported). But you need to give Vault DBA privileges in your database. If you create the database from scratch every time, you could set the DBA password to a known value such as dba-root, give Vault that password, and instruct it to rotate it for you.
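Roughly, the dynamic-secrets setup looks like this (these commands need a running Vault with network access to InfluxDB; the host, role, and password values are illustrative):

```shell
vault secrets enable database

vault write database/config/influxdb \
    plugin_name=influxdb-database-plugin \
    host=influxdb port=8086 \
    username=dba-root password='initial-dba-password' \
    allowed_roles=compose-role

# Let Vault take ownership of the DBA password and rotate it:
vault write -f database/rotate-root/influxdb
```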
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
