Pyspark error: Java gateway process exited before sending its port number

I am using Pyspark to run some commands in a Jupyter Notebook, but it throws an error. I tried the solutions provided in this link (Pyspark: Exception: Java gateway process exited before sending the driver its port number), such as changing the path to C:\Java and uninstalling Java SDK 10 and reinstalling Java 8, but it still throws the same error.

I tried uninstalling and reinstalling pyspark, and running from the Anaconda prompt as well, but I still get the same error. I am using Python 3.7 and pyspark 2.4.0.

If I use this code, I get the error "Exception: Java gateway process exited before sending its port number":

from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext()          # fails here: the Java gateway never starts
sqlContext = SQLContext(sc)

from pyspark.mllib.linalg import Vector, Vectors
from nltk.stem.wordnet import WordNetLemmatizer
from pyspark.ml.feature import RegexTokenizer, StopWordsRemover, Word2Vec

But if I remove the SparkContext, the code runs fine; however, I need a SparkContext for my solution. The code below, without a SparkContext, does not throw any error.

from pyspark import SparkContext
from pyspark.sql import SQLContext
from pyspark.mllib.linalg import Vector, Vectors
from nltk.stem.wordnet import WordNetLemmatizer
from pyspark.ml.feature import RegexTokenizer, StopWordsRemover, Word2Vec

I would appreciate if I could get any help figuring this out. I am using Windows 10 64 bit operating system.
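As a quick diagnostic before changing anything, it is worth checking from inside the same notebook kernel whether Python can actually see a Java installation, since this gateway error generally means pyspark could not launch the `java` process. A minimal sketch (the output depends entirely on your machine):

```python
import os
import shutil

# JAVA_HOME should point at the root of a JDK installation;
# pyspark's launcher consults it when starting the Java gateway.
print("JAVA_HOME:", os.environ.get("JAVA_HOME", "<not set>"))

# Is a java executable reachable on PATH at all?
print("java on PATH:", shutil.which("java") or "<not found>")
```

If `JAVA_HOME` is unset or `java` is not found, fixing that is usually the whole battle.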

(The original question included a screenshot of the full traceback; the error text matches the exception quoted above.)



Solution 1:[1]

Type this in your bash terminal, and it should fix the problem:

export PYSPARK_SUBMIT_ARGS="--master local[2] pyspark-shell"

All this does is set the environment variable PYSPARK_SUBMIT_ARGS, which pyspark reads when it launches the Java gateway, so that Spark starts as a local pyspark-shell.
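On Windows, where `export` is not available, the same effect can be had from inside the notebook with `os.environ`. A sketch under one assumption: this must run before the SparkContext is created, because pyspark reads the variable when it starts the Java gateway (`local[2]` simply means two local worker threads).

```python
import os

# Must be set BEFORE SparkContext() is called: pyspark reads this
# variable when launching the Java gateway process.
os.environ["PYSPARK_SUBMIT_ARGS"] = "--master local[2] pyspark-shell"

# from pyspark import SparkContext
# sc = SparkContext()   # now picks up the submit args above
```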

Solution 2:[2]

Try this:

sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java8-installer

This worked for me on Linux. Since you are a Windows user and apt-get is not available there natively, this link may help: https://superuser.com/questions/947220/how-to-install-packages-apt-get-install-in-windows

Solution 3:[3]

How did you install Spark? Clearly, you are having trouble starting a Java process, which is what that error means.

You may want to install Spark again, following the instructions to the letter, wherever you found them. However, you could also use conda (Anaconda or Miniconda), in which case installing pyspark will also pull in a current Java for you:

conda install pyspark

Solution 4:[4]

I faced the same issue. I then installed JDK 8 not into Program Files but into a new, separate folder called Java, and the issue was resolved.
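The likely reason this works is that the default `C:\Program Files\Java` location contains a space, which Spark's Windows launch scripts are known to handle badly. If you relocate the JDK, point JAVA_HOME at the new space-free folder before starting Spark. A sketch, where the exact folder name is an assumption for illustration:

```python
import os

# Hypothetical space-free install location; adjust to your JDK folder.
java_home = r"C:\Java\jdk1.8.0"

# A path containing spaces is exactly what this workaround avoids.
assert " " not in java_home

# Must be set before pyspark launches the Java gateway.
os.environ["JAVA_HOME"] = java_home
```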

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 xilpex
Solution 2 ??????
Solution 3 mdurant
Solution 4 Data_guy