Add table description to Iceberg table from PySpark

I was able to add a table comment to an Iceberg table using Trino, with this Trino command:

comment on table iceberg.table_schema.table_name is 'My Comment'

It is also possible to read that comment from PySpark using:

spark.sql("describe extended iceberg.table_schema.table_name")
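As a sketch, the comment can be pulled out of that result with something like the helper below (the `Comment` row name and the `col_name`/`data_type` column layout are assumptions about Spark's DESCRIBE EXTENDED output, not taken from the question):

```python
def find_comment(rows):
    """Pick the table comment out of DESCRIBE EXTENDED rows.

    Each row is a (col_name, data_type) pair, mirroring the first two
    columns of Spark's describe output.
    """
    for col_name, data_type in rows:
        if col_name == "Comment":
            return data_type
    return None

# With a live SparkSession this would be fed like:
# rows = [(r.col_name, r.data_type) for r in
#         spark.sql("describe extended iceberg.table_schema.table_name").collect()]
# find_comment(rows)
```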

I couldn't find a way to add these comments to the table using Spark. Is there a way to do so? Thanks a lot.



Solution 1:[1]

So I found a way to do it like this:

ALTER TABLE catalog.schema.table_name SET TBLPROPERTIES('comment'='a comment')

But I would still like to do that through saveAsTable if possible.
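For completeness, a small sketch of driving that same statement from PySpark. The helper only builds the SQL string (so the catalog, schema, and table names are placeholders from the question); with a live SparkSession it would be passed to `spark.sql`:

```python
def comment_ddl(table, comment):
    """Build an ALTER TABLE statement that sets the Iceberg 'comment' property."""
    escaped = comment.replace("'", "''")  # escape single quotes for the SQL literal
    return f"ALTER TABLE {table} SET TBLPROPERTIES('comment'='{escaped}')"

# With a live SparkSession:
# spark.sql(comment_ddl("catalog.schema.table_name", "a comment"))
```

If the goal is to attach the comment at write time rather than after the fact, the Spark 3.x v2 writer API (`df.writeTo("catalog.schema.table_name").tableProperty("comment", "a comment").createOrReplace()`) may be an alternative to saveAsTable, though I have not verified this against an Iceberg catalog.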

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Itai Sevitt