Spark: creating and reading from a table fails because of Hive metastore version mismatch
When I try to create tables or read from them, the queries fail with this error:

Builtin jars can only be used when hive execution version == hive metastore version. Execution: 2.3.7 != Metastore: 1.2.1
Below is a line that reproduces the issue:
df.write.mode("overwrite").saveAsTable("table")
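
From reading the error, it looks like Spark's builtin Hive client (2.3.7) is being used against a 1.2.1 metastore. Below is the kind of session setup I would expect to be relevant, as a minimal sketch only: it assumes spark.sql.hive.metastore.version and spark.sql.hive.metastore.jars are the settings that control this, that the metastore really is 1.2.1, and the app name and sample DataFrame are just placeholders.

```python
from pyspark.sql import SparkSession

# Sketch: point Spark at a Hive client matching the 1.2.1 metastore instead of
# the builtin 2.3.7 jars. "maven" asks Spark to download matching client jars;
# a classpath pointing at local Hive 1.2.1 jars could be used instead.
spark = (
    SparkSession.builder
    .appName("metastore-version-test")  # hypothetical app name
    .config("spark.sql.hive.metastore.version", "1.2.1")
    .config("spark.sql.hive.metastore.jars", "maven")
    .enableHiveSupport()
    .getOrCreate()
)

df = spark.range(10)  # hypothetical DataFrame, just to exercise the write path
df.write.mode("overwrite").saveAsTable("table")
```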
Any advice is much appreciated.
