Define environment variable in Databricks init script

I want to define an environment variable in a Databricks init script and then read it in a PySpark notebook. I wrote this:

    dbutils.fs.put("/databricks/scripts/initscript.sh","""
#!/bin/bash
export env="dev"
pip install pretty-html-table==0.9.14
""", True)

The PySpark code:

    import os
    environment = os.getenv("env")

The PySpark notebook is not able to read the environment variable properly; when I use the value, it gives:

    TypeError: can only concatenate str (not "NoneType") to str

Any idea how to fix this?
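As a side note on the error itself: `os.getenv` returns `None` when the variable is absent from the notebook's process environment, and concatenating `None` to a string raises exactly the reported `TypeError`. A minimal sketch (plain Python, not Databricks-specific) reproducing this and showing a defensive read with a default:

```python
import os

# Make sure "env" is absent, simulating the init script's export
# not reaching the notebook's Python process.
os.environ.pop("env", None)

environment = os.getenv("env")  # returns None when "env" is unset

# Concatenating None to a str reproduces the reported error.
try:
    message = "environment: " + environment
except TypeError as exc:
    print(exc)  # can only concatenate str (not "NoneType") to str

# A defensive read with a default avoids the crash (but does not
# fix the underlying visibility problem).
environment = os.getenv("env", "unknown")
print(environment)
```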



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
