Spark NumberFormatException when using spark.read.jdbc to read from Hive over the JDBC protocol

The problem occurs when using spark.read.jdbc() ↓

spark = SparkSession.builder \
    .[some options...]\
    .config("spark.sql.warehouse.dir", "hdfs://ip:port/path") \
    .config("hive.metastore.uris", "thrift://ip:port") \
    .enableHiveSupport() \
    .getOrCreate()

spark.read.jdbc(url="jdbc:hive2://ip:port", table="db.table", properties={...})

java.sql.SQLException: Cannot convert column 1 to integer
java.lang.NumberFormatException: For input string: "c1"

"c1" is the first column of the table, containing simple integer values (1, 2, 3, 4).
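A plausible cause of this symptom (an assumption, not confirmed against the actual query log): Spark's generic JDBC dialect quotes column names with double quotes when it builds the SELECT it sends over the connection, while HiveQL interprets double-quoted tokens as string literals. Hive would then return the literal string "c1" for every row, and the driver's integer conversion fails with exactly this NumberFormatException. The sketch below (with a hypothetical helper name, `spark_generated_query`) illustrates the difference in the generated SQL:

```python
def spark_generated_query(table, columns, quote='"'):
    # Sketch of how a JDBC reader might build its SELECT:
    # each column name is wrapped in the dialect's quote character.
    cols = ", ".join(f"{quote}{c}{quote}" for c in columns)
    return f"SELECT {cols} FROM {table}"

# With a double-quote dialect, Hive would see a string literal:
print(spark_generated_query("db.table", ["c1"]))        # SELECT "c1" FROM db.table

# With backtick quoting (what HiveQL expects for identifiers),
# Hive would see an actual column reference:
print(spark_generated_query("db.table", ["c1"], "`"))   # SELECT `c1` FROM db.table
```

If this is the cause, the usual direction is to make Spark quote identifiers with backticks for this connection (e.g. by registering a custom JdbcDialect on the JVM side), rather than changing the table or the data.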

However, if I read the data with

spark.read.table("db.table")

↑ this works successfully.

Does anyone have suggestions for resolving this problem?



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
