Apache Spark Connector: "No suitable driver"
Trying to connect to SQL Server using the Azure Apache Spark connector and getting the following error:

java.sql.SQLException: No suitable driver

The Databricks cluster has com.microsoft.azure:spark-mssql-connector_2.12:1.2.0 installed for Apache Spark 3.1.2, Scala 2.12, as stated in the documentation. That's the only library installed on the cluster. I went through the docs at https://github.com/microsoft/sql-spark-connector
jdbcHostname = "server name"
jdbcPort = 1433
jdbcDatabase = "database name"
Table = "tbl.name"
JDBC_URL = '"jdbc:sqlserver://{0}:{1};database={2}"'.format(jdbcHostname, jdbcPort, jdbcDatabase)
username = "user"
password = "pass"

jdbcDF = spark.read \
    .format("com.microsoft.sqlserver.jdbc.spark") \
    .option("url", JDBC_URL) \
    .option("dbtable", Table) \
    .option("user", username) \
    .option("password", password) \
    .load()
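The error is most likely caused by the URL string itself, not by the installed library: the extra double quotes inside the format string become part of the URL, so JDBC's DriverManager sees a string starting with `"` instead of `jdbc:sqlserver://` and no registered driver claims it, hence "No suitable driver". A minimal sketch of the fix, reusing the placeholder hostname and database values from the question:

```python
# Placeholder connection values, as in the question.
jdbcHostname = "server name"
jdbcPort = 1433
jdbcDatabase = "database name"

# Broken: the embedded double quotes are part of the resulting string,
# so the URL starts with '"' and DriverManager matches no driver.
broken_url = '"jdbc:sqlserver://{0}:{1};database={2}"'.format(
    jdbcHostname, jdbcPort, jdbcDatabase)

# Fixed: the string must begin with the jdbc:sqlserver:// prefix itself.
JDBC_URL = "jdbc:sqlserver://{0}:{1};database={2}".format(
    jdbcHostname, jdbcPort, jdbcDatabase)

print(broken_url)  # starts and ends with a literal double quote
print(JDBC_URL)    # jdbc:sqlserver://server name:1433;database=database name
```

With the corrected `JDBC_URL`, the rest of the `spark.read.format("com.microsoft.sqlserver.jdbc.spark")` call from the question can stay as it is.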
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
