Spark - Connection Refused for local file in Scala, works for Pyspark
I am trying to read a file from my local drive. When I use spark-shell, I get:
Caused by: java.io.IOException: Failed to connect to /192.168.86.248:63393
at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:253)
at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:195)
at org.apache.spark.rpc.netty.NettyRpcEnv.downloadClient(NettyRpcEnv.scala:392)
at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$openChannel$4(NettyRpcEnv.scala:360)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1411)
at org.apache.spark.rpc.netty.NettyRpcEnv.openChannel(NettyRpcEnv.scala:359)
at org.apache.spark.repl.ExecutorClassLoader.getClassFileInputStreamFromSparkRPC(ExecutorClassLoader.scala:135)
at org.apache.spark.repl.ExecutorClassLoader.$anonfun$fetchFn$1(ExecutorClassLoader.scala:66)
at org.apache.spark.repl.ExecutorClassLoader.findClassLocally(ExecutorClassLoader.scala:176)
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:113)
... 103 more
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException: Connection refused: /192.168.86.248:63393
Caused by: java.net.ConnectException: Connection refused
What could be causing this?
My file read command:
val sysIdOwnersDF = spark.read.json("file:///Users/dev/spark-playground/owner-map.json")
To see if I could narrow this down to a configuration issue, I tried the same read with pyspark, and it succeeds:
>>> df = spark.read.json("file:///Users/dev/spark-playground/owner-map.json")
>>> df.show()
+--------------------+
| system-id-owners|
+--------------------+
|[a, aaa123, f, cc...|
|[a, aaa124, f, cc...|
+--------------------+
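The stack trace shows the executor-side `ExecutorClassLoader` failing to fetch REPL-generated class files from the driver over RPC (`getClassFileInputStreamFromSparkRPC`), a code path pyspark does not exercise, which would explain why only spark-shell fails. A common workaround when the machine's hostname resolves to an interface the RPC connection cannot reach (the `192.168.86.248` address in the trace) is to pin the driver to the loopback interface at launch. This is a hedged sketch, not a confirmed fix for this exact setup:

```shell
# Bind the driver's RPC endpoint to loopback so executors can
# connect back to it; both are standard Spark network properties.
spark-shell \
  --conf spark.driver.host=localhost \
  --conf spark.driver.bindAddress=127.0.0.1
```

If this resolves it, the underlying cause is likely hostname resolution (e.g. a VPN or firewall blocking the advertised LAN address) rather than anything about the file read itself.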
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow