Getting UNRESOLVED DEPENDENCIES error when trying to connect to Snowflake with PySpark

I am trying to connect to a Snowflake database using PySpark. Below is my code.

from pyspark.sql import SparkSession
import os

# Must be set before the SparkSession (and its JVM) is created,
# so spark-submit picks up the connector packages.
os.environ['PYSPARK_SUBMIT_ARGS'] = (
    '--packages net.snowflake:snowflake-jdbc:3.13.14,'
    'net.snowflake:spark-snowflake_2.12:2.10.0-spark_3.2 pyspark-shell'
)

# Connection options (URL, usr, paswd, etc. are defined elsewhere).
sfoptions = {
    "sfUrl": URL,
    "sfUser": usr,
    "sfPassword": paswd,
    "sfAccount": account,
    "sfDatabase": database,
    "sfSchema": schema,
    "sfWarehouse": warehouse,
    "sfRole": role,
}

query = "select count(*) from table"

spark = SparkSession.builder.appName("demo").master("local").getOrCreate()

SNOWFLAKE_SOURCE_NAME = "net.snowflake.spark.snowflake"
df = (spark.read.format(SNOWFLAKE_SOURCE_NAME)
      .options(**sfoptions)
      .option("query", query)
      .load())
df.show()

But I get the error below:

net.snowflake#snowflake-jdbc added as a dependency
net.snowflake#spark-snowflake_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-2cb3619a-01c7-4bb3-b74e-ec747c450381;1.0
    confs: [default]
You probably access the destination server through a proxy server that is not well configured.
:: resolution report :: resolve 543ms :: artifacts dl 1ms
    :: modules in use:
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    |      default     |   2   |   0   |   0   |   0   ||   0   |   0   |
    ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
    Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/net/snowflake/snowflake-jdbc/3.13.14/snowflake-jdbc-3.13.14.pom

    Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/net/snowflake/snowflake-jdbc/3.13.14/snowflake-jdbc-3.13.14.jar

    Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/net/snowflake/snowflake-jdbc/3.13.14/snowflake-jdbc-3.13.14.pom

    Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/net/snowflake/snowflake-jdbc/3.13.14/snowflake-jdbc-3.13.14.jar

        module not found: net.snowflake#snowflake-jdbc;3.13.14

        ::::::::::::::::::::::::::::::::::::::::::::::

        ::          UNRESOLVED DEPENDENCIES         ::

        ::::::::::::::::::::::::::::::::::::::::::::::

        :: net.snowflake#snowflake-jdbc;3.13.14: not found

        :: net.snowflake#spark-snowflake_2.12;2.10.0-spark_3.2: not found

        ::::::::::::::::::::::::::::::::::::::::::::::

It reports "UNRESOLVED DEPENDENCIES" because it is trying to reach "repos.spark-packages.org" and "repo1.maven.org", which are not reachable from my machine. How can I resolve this? How do I add the Snowflake and Spark connector packages from Python code?


Solution 1:[1]

This looks like a network issue: your host does not have internet access, or it reaches the internet through a proxy that is not configured for the JVM. Check with your system administrator whether outbound access to the Maven repositories can be opened up.
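If the machine does sit behind a proxy, one commonly suggested workaround is to pass the standard Java proxy system properties to the JVM that spark-submit launches, since Ivy resolves --packages inside that JVM. This is a sketch with hypothetical proxy host/port values, and behavior can vary by deploy mode:

```shell
# Assumption: corporate proxy at proxy.example.com:8080 (hypothetical values).
export PYSPARK_SUBMIT_ARGS='--driver-java-options "-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080" --packages net.snowflake:snowflake-jdbc:3.13.14,net.snowflake:spark-snowflake_2.12:2.10.0-spark_3.2 pyspark-shell'
```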

You can check with the command:

ping repo1.maven.org
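If ping is blocked (ICMP is often disabled even when HTTPS works), a quick DNS probe can be done from Python instead. This is a minimal sketch, not part of the original answer:

```python
import socket

def can_resolve(host, port=443):
    """Return True if DNS resolution for host:port succeeds."""
    try:
        socket.getaddrinfo(host, port)
        return True
    except socket.gaierror:
        return False

# Ivy tries these repositories when resolving --packages:
for host in ("repo1.maven.org", "repos.spark-packages.org"):
    print(host, "resolvable:", can_resolve(host))
```

If both hosts fail to resolve, the problem is DNS or general connectivity rather than anything in the PySpark code.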
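On the question of declaring the connectors from Python code: the PYSPARK_SUBMIT_ARGS approach in the question is valid. When the public repositories are unreachable, spark-submit also accepts --repositories (to point Ivy at an internal Artifactory/Nexus mirror) and --jars (to use pre-downloaded jar files, avoiding network resolution entirely). A hypothetical helper to assemble that string:

```python
def build_submit_args(packages, repositories=None, jars=None):
    """Assemble a PYSPARK_SUBMIT_ARGS value for pyspark (hypothetical helper)."""
    parts = []
    if packages:
        parts += ["--packages", ",".join(packages)]
    if repositories:  # e.g. an internal Artifactory/Nexus mirror
        parts += ["--repositories", ",".join(repositories)]
    if jars:  # pre-downloaded jar files, no network resolution needed
        parts += ["--jars", ",".join(jars)]
    parts.append("pyspark-shell")
    return " ".join(parts)

print(build_submit_args(
    ["net.snowflake:snowflake-jdbc:3.13.14",
     "net.snowflake:spark-snowflake_2.12:2.10.0-spark_3.2"],
))
```

The same coordinates can equivalently be supplied via SparkSession.builder.config("spark.jars.packages", ...), as long as it is set before the session is created.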

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Yilin MO