How to get Flink CREATE TABLE DDL from the Hive Metastore

I have a few Flink tables stored in the Hive Metastore. I want to read these tables back and regenerate their CREATE TABLE DDL.

Here is an example of what I want. Say I have a Flink table like:

CREATE TABLE if not exists cdc_log (log STRING) 
WITH (
        'connector' = 'kafka',  
        'topic-pattern' = 'xxx',  
        'properties.bootstrap.servers' = 'xxx',  
        'properties.group.id' = 'xxx',  
        'scan.startup.mode' = 'xxx',  
        'format' = 'raw');

Executing SHOW CREATE TABLE cdc_log in the Hive CLI returns the following DDL, which can't be executed in the Flink runtime:

CREATE TABLE `cdc_log`(
)
ROW FORMAT SERDE 
  'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe' 
STORED AS INPUTFORMAT 
  'org.apache.hadoop.mapred.TextInputFormat' 
OUTPUTFORMAT 
  'org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat'
LOCATION
  'hdfs://xxx'
TBLPROPERTIES (
  'flink.connector'='kafka', 
  'flink.format'='raw', 
  'flink.properties.bootstrap.servers'='xxx', 
  'flink.properties.group.id'='xxx', 
  'flink.scan.startup.mode'='earliest-offset', 
  'flink.schema.0.data-type'='VARCHAR(2147483647)', 
  'flink.schema.0.name'='log', 
  'flink.topic-pattern'='xxx', 
  'transient_lastDdlTime'='1645607171')

And DESCRIBE cdc_log in Flink returns the schema without the WITH clause information. I can't find a way to get my original DDL back. Is there a way to solve this problem? Thanks.



Solution 1:[1]

LOL... I think I can use CatalogBaseTable.getOptions() to solve it. The table you fetch from the catalog (for example via HiveCatalog.getTable()) exposes the properties of the original WITH clause through getOptions(), and the columns through its schema.
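A minimal sketch of the reassembly step: assuming you have already fetched the column types and the options map from CatalogBaseTable.getOptions() (the catalog strips the `flink.` prefix seen in TBLPROPERTIES), a hypothetical helper like `buildDdl` below could stitch them back into a runnable Flink CREATE TABLE statement. The column and option maps are hard-coded here for illustration only.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class FlinkDdlBuilder {

    // Rebuilds a Flink CREATE TABLE statement from column definitions and an
    // options map shaped like the one returned by CatalogBaseTable.getOptions().
    static String buildDdl(String table,
                           Map<String, String> columns,
                           Map<String, String> options) {
        String cols = columns.entrySet().stream()
                .map(e -> "  " + e.getKey() + " " + e.getValue())
                .collect(Collectors.joining(",\n"));
        String opts = options.entrySet().stream()
                .map(e -> "  '" + e.getKey() + "' = '" + e.getValue() + "'")
                .collect(Collectors.joining(",\n"));
        return "CREATE TABLE IF NOT EXISTS " + table + " (\n"
                + cols + "\n) WITH (\n" + opts + "\n);";
    }

    public static void main(String[] args) {
        // Stand-in values; in practice these come from the catalog table's
        // schema and from CatalogBaseTable.getOptions().
        Map<String, String> columns = new LinkedHashMap<>();
        columns.put("log", "STRING");

        Map<String, String> options = new LinkedHashMap<>();
        options.put("connector", "kafka");
        options.put("topic-pattern", "xxx");
        options.put("format", "raw");

        System.out.println(buildDdl("cdc_log", columns, options));
    }
}
```

With the sample maps above this prints a statement starting with `CREATE TABLE IF NOT EXISTS cdc_log` and containing `'connector' = 'kafka'`, i.e. the shape of the original DDL rather than Hive's SERDE-based rendering.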

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
Solution 1: slo