Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.datastax.spark#spark-cassandra-connector_2.10;2.0.3: not found]
On my CentOS 7 machine I have installed:
Spark 2.2.0
Scala 2.11.8
Java 1.8.0_144
Cassandra 3.11.0
The next step is to configure Spark to work with Cassandra via the Spark Cassandra Connector. The problem is that when I try to run
$SPARK_HOME/bin/spark-shell --packages datastax:spark-cassandra-connector:2.0.3-s_2.11
Note that I also tried this:
$SPARK_HOME/bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:2.0.3
I got :
...
::::::::::::::::::::::::::::::::::::::::::::::
:: UNRESOLVED DEPENDENCIES ::
::::::::::::::::::::::::::::::::::::::::::::::
:: com.datastax.spark#spark-cassandra-connector_2.10;2.0.3: not found
::::::::::::::::::::::::::::::::::::::::::::::
...
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.datastax.spark#spark-cassandra-connector_2.10;2.0.3: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1177)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:298)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
What am I doing wrong? I notice that the versions I'm using (for Scala, Spark, and Cassandra) don't appear in the compatibility table on the spark-cassandra-connector GitHub page.
Solution 1:[1]
Try downloading the spark-cassandra-connector_2.10-2.0.3.jar file and adding it to the $SPARK_HOME/jars folder.
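A minimal sketch of that approach is below. The Maven Central URL is an assumption (verify the artifact exists before relying on it), and since Spark 2.2.0 ships with Scala 2.11, the `_2.11` build of the connector is likely the better match for the question's setup than `_2.10`:

```shell
#!/bin/sh
# Sketch: drop the connector jar into Spark's jars folder so spark-shell
# picks it up without needing --packages to resolve it at launch.
# Assumptions: the Maven Central path below, and that the _2.11 artifact
# matches Spark 2.2.0 (which is built against Scala 2.11).
VERSION=2.0.3
SCALA=2.11
JAR="spark-cassandra-connector_${SCALA}-${VERSION}.jar"
URL="https://repo1.maven.org/maven2/com/datastax/spark/spark-cassandra-connector_${SCALA}/${VERSION}/${JAR}"
echo "Fetching $URL"

# Only download if SPARK_HOME is set, so the script fails loudly otherwise.
if [ -n "${SPARK_HOME:-}" ]; then
    curl -fL -o "$SPARK_HOME/jars/$JAR" "$URL"
else
    echo "SPARK_HOME is not set; skipping download" >&2
fi
```

After the jar is in place, a plain `$SPARK_HOME/bin/spark-shell` should start without the unresolved-dependency error, since Ivy no longer has to fetch the connector.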
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Tanvi |
