Pyspark: Failed to initialise Spark session (Another SparkContext is being constructed)
Hi, I am pretty new to Spark. I want to use PySpark to stream data from Kafka to MongoDB, but I am not able to run pyspark: every time I run it in the terminal it gives the following error. I have deleted and reinstalled Java, Kafka, Scala, and PySpark multiple times but was unable to resolve it; I found a few suggested fixes and tried them, but could not get it resolved. If I run spark-shell in the terminal it works, though it prints a warning.

And here are the pyspark and Java versions I have right now:

If you have a solution to this, please help me with it; I have hit a wall with this error.
Solution 1:
Hey guys, if you are facing the same issue you can do what I did:
- I removed Spark, Scala, Java, and PySpark entirely
- reinstalled with `brew reinstall apache-spark`
- after that you can use `pyspark` or `spark-shell` to run it again

It worked for me because my spark-shell was also giving an error, so reinstalling Apache Spark was able to solve it.
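Beyond reinstalling, the "Another SparkContext is being constructed" error typically means a second SparkContext is being created in the same process (for example, calling `SparkContext(...)` directly while a session already exists). A minimal sketch of the usual workaround, assuming a working pyspark install, is to go through `SparkSession.builder.getOrCreate()`, which reuses any existing context instead of constructing a new one. The app name `"kafka-to-mongo"` here is just illustrative:

```python
def build_session():
    """Return a single shared SparkSession (sketch, assumes pyspark is installed).

    getOrCreate() returns the active session if one exists, so repeated calls
    never try to construct a second SparkContext.
    """
    from pyspark.sql import SparkSession  # import inside so the sketch loads without pyspark

    return (
        SparkSession.builder
        .appName("kafka-to-mongo")   # illustrative name, not from the original post
        .master("local[*]")          # local mode, as when running `pyspark` from a terminal
        .getOrCreate()
    )
```

Calling `build_session()` from every part of the job (instead of constructing `SparkContext` by hand) keeps only one context alive per JVM.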
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Aman Verma |
