Pyspark: Failed to initialise Spark session (Another SparkContext is being constructed)

Hi, I am pretty new to Spark. I want to use PySpark to stream data from Kafka to MongoDB, but I am not able to run pyspark: every time I run it in the terminal it fails with the error below. I have deleted and reinstalled Java, Kafka, Scala, and PySpark multiple times, and tried a few suggested fixes, but the error persists. If I run spark-shell in the terminal, it works, though it prints a warning.

Here is an image of the complete error:

And here are the PySpark and Java versions that I have right now:

version info

If you have a solution to this, please help me; I have hit a wall with this error.



Solution 1:[1]

Hey guys, if you are facing the same issue, you can do what I did:

  1. I removed Spark, Scala, Java, and PySpark entirely.
  2. I reinstalled Spark with `brew reinstall apache-spark`.
  3. After that you can use pyspark or spark-shell to run it again.

It worked for me: my spark-shell was also giving an error, so reinstalling Apache Spark resolved both.
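The steps above, on macOS with Homebrew, might look like the following (a sketch; the exact formula names you have installed may differ):

```shell
# Remove the old installs (only uninstall what you actually installed via brew)
brew uninstall apache-spark scala
pip uninstall pyspark

# Reinstall Spark; the Homebrew formula bundles the Scala runtime it needs
brew reinstall apache-spark

# Verify the fix with either entry point
spark-shell
pyspark
```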

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Aman Verma