Is it possible to persist .env values in a .whl file when it is installed on a Databricks cluster? I'd prefer to keep all the values in the library (.whl).
I have created a project in PyCharm. This project has a .py file with functions, an __init__.py, and a .env file with my secret values.
I need to be able to run this in an Azure Databricks environment. The code works from the IDE terminal but not after installing the .whl library on a Databricks cluster. I can import the .py with the functions, but when they are called they fail to pull the values from .env.
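For reference, the loading code inside the package looks roughly like this (a minimal sketch assuming python-dotenv; the key name is a placeholder). My understanding is that the .env file isn't packaged into the wheel by default, so on the cluster there is nothing for load_dotenv() to find:

# functions.py
import os
from dotenv import load_dotenv

# Reads key=value pairs from a .env file into os.environ.
# This works locally because the .env sits in the project directory;
# a default wheel build does not include non-Python files like .env.
load_dotenv()

def fn_dothis():
    secret = os.environ["MY_SECRET"]  # placeholder key name
    ...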
I guess I don't fully understand the scope of the .env environment variables. I want our developers to be able to just import and call a function without any other code. Databricks via dbutils does work, as we have that set up to connect to Azure Key Vault, but I don't want developers to have to write that code if I can hide those values in my Python project. There will be several environment variables (secrets), so the calling code should stay as clean as it is below. I'd like to keep it all behind the scenes if possible.
# Databricks notebook (PySpark) code
from name.functions import *

# call the function in functions.py
fn_dothis()
I realize I could do it with something like this, but ...
# Databricks notebook (PySpark) code
from name.functions import *

secret = dbutils.secrets.get(scope="xx", key="zz")

# call the function in functions.py
fn_dothis(secret)
# authenticates now, having been passed the secret value
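One alternative I've considered is wrapping the dbutils lookup inside the library itself, so developers never touch it. A sketch, assuming the wheel only ever runs on a Databricks cluster where pyspark.dbutils is available (the scope and key are the same placeholders as above):

# functions.py
from pyspark.sql import SparkSession
from pyspark.dbutils import DBUtils

def _get_secret(key):
    # Build a DBUtils handle from the active Spark session.
    # This works inside a library on a Databricks cluster,
    # but not from a plain local IDE run.
    spark = SparkSession.builder.getOrCreate()
    dbutils = DBUtils(spark)
    return dbutils.secrets.get(scope="xx", key=key)

def fn_dothis():
    secret = _get_secret("zz")
    ...

That would keep the notebook side down to the clean import-and-call I want, at the cost of the library being Databricks-only.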
It also works if I hard-code the values in the .py file, which of course I don't want to do. How do I get the environment variables included in the .whl library? Is that possible?
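What I was imagining is bundling the .env into the wheel as package data and loading it relative to the installed package rather than the working directory. A sketch of what I mean, assuming setuptools and python-dotenv (the .env would have to live inside the name/ package directory):

# setup.py
from setuptools import setup, find_packages

setup(
    name="name",
    version="0.1.0",
    packages=find_packages(),
    # ship the .env file inside the built wheel
    package_data={"name": [".env"]},
)

# functions.py
import os
from dotenv import load_dotenv

# Resolve the .env relative to the installed package, not the cwd,
# so it is found wherever the wheel is installed.
load_dotenv(os.path.join(os.path.dirname(__file__), ".env"))

I realize this means the secret values ship in plain text inside the .whl, so maybe that alone rules it out, but is this the right mechanism, or is there a better way?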