Specifying jceks file with Spark JDBC

I am trying to connect to Oracle via the sqlContext.read.format("jdbc") method. Everything works fine, but when creating the JDBC string I have to specify the username and password for the database in the string:

val jdbcString = "jdbc:oracle:thin:USERNAME/PASSWORD@//HOSTNAME:PORT/SID"

However, I do have a jceks file on HDFS which contains the password. Is there any way I can leverage that file to connect over JDBC instead of using a plain-text password? For example, in Sqoop we can do:

sqoop import -Dhadoop.security.credential.provider.path=jceks://hdfs/data/credentials/oracle.password.jceks

Thanks.



Solution 1:[1]

This was achieved using CredentialProviderFactory.

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.alias.CredentialProviderFactory

val conf = new Configuration()
val alias = "password.alias"
val jceksPath = "jceks://hdfs/user/data/alias/MySQL.password.jceks"

// Point the Hadoop configuration at the credential provider (the jceks file)
conf.set(CredentialProviderFactory.CREDENTIAL_PROVIDER_PATH, jceksPath)

// getPassword returns Array[Char]; mkString converts it to a String
val password = conf.getPassword(alias).mkString
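The recovered password can then be passed to the Spark JDBC reader through its options rather than being embedded in the URL. A minimal sketch, assuming a hypothetical host, SID, table, and user (replace with your own values):

val jdbcUrl = "jdbc:oracle:thin:@//HOSTNAME:PORT/SID"  // placeholder connection details

val df = sqlContext.read
  .format("jdbc")
  .option("url", jdbcUrl)
  .option("dbtable", "SCHEMA.TABLE")  // placeholder table
  .option("user", "USERNAME")         // placeholder user
  .option("password", password)       // password resolved from the jceks file above
  .load()

This keeps the secret out of the connection string and out of the code; only the alias and the jceks path appear in plain text.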

Solution 2:[2]

The Oracle JDBC thin driver doesn't support extracting the password from a jceks file. It supports wallets instead (the password can be stored in a wallet).
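For reference, a wallet-based connection typically looks like the sketch below. The wallet directory path and the TNS alias are assumptions; the wallet directory must contain the auto-login wallet (cwallet.sso), and the alias must be defined in tnsnames.ora:

// Tell the thin driver where the wallet lives (set before opening the connection)
System.setProperty("oracle.net.wallet_location",
  "(SOURCE=(METHOD=FILE)(METHOD_DATA=(DIRECTORY=/path/to/wallet)))")

// No username/password in the URL; credentials are resolved from the wallet
val jdbcUrl = "jdbc:oracle:thin:@db_alias"  // db_alias is a placeholder TNS alias

With this setup the same URL can be handed to the Spark JDBC reader without a password option.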

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: (no author listed)
Solution 2: Jean de Lavarene