PySpark to open directly in JupyterLab
I installed the following applications on Windows 10:
- Apache Spark 3.1.3
- Hadoop 3.3.2
- JupyterLab
When I execute pyspark or spark-shell from the command line, the output confirms that Apache Spark is installed and configured correctly.

When I execute pyspark from the command line, I want the JupyterLab interface to open automatically. When I set the environment variables below, Jupyter Notebook opens automatically:
```
PYSPARK_DRIVER_PYTHON = C:\Users\xxxx\AppData\Local\Programs\Python\Python39\Scripts\jupyter.exe
PYSPARK_DRIVER_PYTHON_OPTS = notebook
```
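For reference, the same configuration can also be applied to a single cmd session before launching pyspark (a minimal sketch, reusing the redacted Python path from above):

```bat
:: Point the PySpark driver at the Jupyter launcher (redacted user path as above)
set PYSPARK_DRIVER_PYTHON=C:\Users\xxxx\AppData\Local\Programs\Python\Python39\Scripts\jupyter.exe
:: "notebook" is passed to that launcher, so it starts the classic Notebook UI
set PYSPARK_DRIVER_PYTHON_OPTS=notebook

:: Launching pyspark now opens Jupyter Notebook instead of the shell REPL
pyspark
```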
I tried the settings below, but no luck:
```
PYSPARK_DRIVER_PYTHON = C:\Users\xxxx\AppData\Local\Programs\Python\Python39\Scripts\jupyter-lab.exe
PYSPARK_DRIVER_PYTHON_OPTS = lab
```
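My understanding is that Spark launches the driver as `<PYSPARK_DRIVER_PYTHON> <PYSPARK_DRIVER_PYTHON_OPTS>`, so the settings above would effectively run `jupyter-lab.exe lab`, which is probably not the intended invocation. One combination I have not confirmed, but which mirrors the working notebook setup (since `lab`, like `notebook`, is a subcommand of the `jupyter` launcher):

```bat
:: Unconfirmed sketch: keep jupyter.exe as the driver and pass "lab" as the
:: subcommand, exactly as "notebook" is passed in the working setup above
set PYSPARK_DRIVER_PYTHON=C:\Users\xxxx\AppData\Local\Programs\Python\Python39\Scripts\jupyter.exe
set PYSPARK_DRIVER_PYTHON_OPTS=lab

pyspark
```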
Which environment variables do I need to set so that JupyterLab opens directly? And how do I specify the kernel among the Jupyter kernels?
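Regarding the kernel question: as far as I know, Jupyter discovers custom kernels from a `kernel.json` placed under `%APPDATA%\jupyter\kernels\<name>\` on Windows. A hypothetical spec that wires the Spark environment into an IPython kernel could look like the following; the `SPARK_HOME` location, display name, and py4j zip file name are illustrative assumptions and must match the actual installation:

```json
{
  "argv": [
    "C:\\Users\\xxxx\\AppData\\Local\\Programs\\Python\\Python39\\python.exe",
    "-m", "ipykernel_launcher",
    "-f", "{connection_file}"
  ],
  "display_name": "PySpark (Spark 3.1.3)",
  "language": "python",
  "env": {
    "SPARK_HOME": "C:\\spark-3.1.3",
    "PYTHONPATH": "C:\\spark-3.1.3\\python;C:\\spark-3.1.3\\python\\lib\\py4j-0.10.9-src.zip"
  }
}
```

With such a spec in place, the kernel should appear in the JupyterLab launcher under its `display_name`.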