How to save a table in Spark with a dot in the table name

I have a schema named xxx and a table named yyy. I want three databases called bronze, silver, and gold, all living under xxx, so the tables would be addressed as xxx.bronze.yyy, xxx.silver.yyy, and so on.
```python
def read_options(options, format):
    # Recursively apply each (key, value) pair as a chained .option() on the reader
    if len(options) > 1:
        return read_options(options[1:], format).option(options[0][0], options[0][1])
    else:
        return spark.read.format(format).option(options[0][0], options[0][1])

def write_table(path, table_name, format, mount_name, database, schema="default", saveMode="overwrite", options=None):
    if options:
        df = read_options(options, format).load(path)
    else:
        df = spark.read.format(format).load(path)
    df.write.mode(saveMode).saveAsTable(f"{schema}.{database}.{table_name}")
```
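For context, the recursive `read_options` helper simply folds each `(key, value)` pair into a chained `.option()` call on the reader. The same logic can be written iteratively; here is a sketch using a hypothetical stub reader (not Spark's real `DataFrameReader`) so it runs without a Spark session:

```python
class StubReader:
    """Hypothetical stand-in for spark.read.format(...): records chained options."""
    def __init__(self, format):
        self.format = format
        self.options = {}

    def option(self, key, value):
        # Like Spark's DataFrameReader.option, return the reader to allow chaining
        self.options[key] = value
        return self

def read_options_iterative(options, reader):
    # Iterative equivalent of the recursive helper: apply each pair in turn
    for key, value in options:
        reader = reader.option(key, value)
    return reader

reader = read_options_iterative([("header", "true"), ("sep", ";")], StubReader("csv"))
print(reader.options)  # {'header': 'true', 'sep': ';'}
```

The iterative form avoids both the recursion and the `getattr(..., "option")` indirection while producing the same chain of calls.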
Here's my code. Before I decided to write the tables as xxx.bronze.yyy, I was saving them as xxx.yyy, so the last line of my code was:

```python
df.write.mode(saveMode).saveAsTable(f"{schema}.{table_name}")
```

and it worked perfectly.
The error I am receiving, raised on the `saveAsTable` line, is:

```
AnalysisException: Couldn't find a catalog to handle the identifier xxx.bronze.yyy.
```
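The message makes more sense once you see how Spark resolves a dotted identifier: up to three parts, catalog.schema.table. With a two-part name, xxx is treated as a schema in the default session catalog (`spark_catalog`); with a three-part name, xxx is treated as a catalog, and if no catalog by that name is configured the resolution fails. A minimal sketch of that resolution rule in plain Python (the helper itself is hypothetical, only the catalog name `spark_catalog` follows Spark's default):

```python
def resolve_identifier(name, known_catalogs=("spark_catalog",)):
    """Sketch: split a dotted table name into (catalog, schema, table)."""
    parts = name.split(".")
    if len(parts) == 2:
        # Two-part name: schema.table, resolved in the default session catalog
        return ("spark_catalog", parts[0], parts[1])
    if len(parts) == 3:
        catalog, schema, table = parts
        if catalog not in known_catalogs:
            # Mirrors the AnalysisException: no catalog registered under this name
            raise ValueError(f"Couldn't find a catalog to handle the identifier {name}.")
        return (catalog, schema, table)
    raise ValueError(f"Unexpected identifier: {name}")

print(resolve_identifier("xxx.yyy"))    # ('spark_catalog', 'xxx', 'yyy')
# resolve_identifier("xxx.bronze.yyy")  # raises: no catalog named 'xxx'
```

So the two-part name worked because xxx was read as a schema, while the three-part name asks for a catalog called xxx that the session does not know about.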
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow