Unable to access remote Hive from Glue

I am trying to access a remote Hive instance (running on-premises) from Glue with the following code:

Glue configuration:

Glue version: 2

Spark: 2.4

In the Glue job, I attached the S3 path of the JDBC driver JAR (under "Dependent jars path").

from awsglue.transforms import *
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from py4j.java_gateway import java_import
sc = SparkContext()
glueContext = GlueContext(sc)
spark = glueContext.spark_session

hive_DF = spark.read \
    .format("jdbc") \
    .option("driver", "org.apache.hive.jdbc.HiveDriver") \
    .option("url", "jdbc:hive2://bigdatamr:10000/someendpoint") \
    .option("dbtable", "tmp") \
    .option("user", "myusername") \
    .option("password", "xxxxxxx") \
    .load()
hive_DF.show()

But I am getting the following error:

Command failed with exit code 10

I checked the log file but didn't find anything related to this error.

How can I resolve this?
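One thing I considered: since exit code 10 surfaces without any stack trace in the logs, the failure might simply be that the Glue job cannot reach the HiveServer2 endpoint over the network (VPC/security-group/routing issues). A minimal TCP reachability probe, assuming the hostname `bigdatamr` and port `10000` from the JDBC URL above, can be sketched like this:

```python
import socket

def can_reach(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout.

    DNS failures and refused/timed-out connections all surface as OSError,
    so they are treated uniformly as "unreachable".
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe the HiveServer2 endpoint from the question (hostname is site-specific).
print(can_reach("bigdatamr", 10000))
```

Running a probe like this from inside the Glue job (or from an instance in the same VPC) would at least distinguish a connectivity problem from a driver/authentication problem.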



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow