Reuse Spark Session Across Modules/Packages
We are building a reusable data framework with PySpark. We originally built one large utilities package that hosted all of the methods, but we now plan to split it into smaller, more manageable packages.
How do we share the Spark Session and the Logger object across all the packages/modules?
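One common pattern is to keep the shared objects in a single small module that every other package imports. The sketch below assumes a hypothetical module `framework/context.py`; the names `get_spark` and `get_logger` are illustrative, not from the original post. It relies on `SparkSession.builder.getOrCreate()`, which returns the already-active session on repeated calls, so every package receives the same instance.

```python
# framework/context.py -- hypothetical shared-context module (a sketch)
import logging

from pyspark.sql import SparkSession


def get_spark(app_name: str = "data-framework") -> SparkSession:
    """Return the active SparkSession, creating it on the first call."""
    # getOrCreate() hands back the existing session if one is running,
    # so every package that calls this shares the same instance.
    return SparkSession.builder.appName(app_name).getOrCreate()


def get_logger(name: str = "data-framework") -> logging.Logger:
    """Return a logger configured once per process."""
    logger = logging.getLogger(name)
    if not logger.handlers:  # guard against duplicate handlers on re-import
        handler = logging.StreamHandler()
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
        )
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger
```

Any module in any of the split-out packages can then do:

```python
from framework.context import get_spark, get_logger

spark = get_spark()
log = get_logger(__name__)
log.info("row count: %d", spark.range(10).count())
```

Because Python caches imported modules and `getOrCreate()` is idempotent, this avoids threading the session and logger through every constructor, though explicit dependency injection is a reasonable alternative if the packages must stay testable in isolation.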
Sources
Source: Stack Overflow
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.