pyspark insertInto in overwrite mode is appending and not overwriting partitions

I'm a data engineer working on Spark 2.3, and I'm running into a problem:

The insertInto call below is not overwriting the partition; it keeps appending, even though I set spark.conf's partitionOverwriteMode to 'dynamic':


spark = spark_utils.getSparkInstance()
spark.conf.set('spark.sql.sources.partitionOverwriteMode', 'dynamic')

df\
.write\
.mode('overwrite')\
.format('orc')\
.option("compression","snappy")\
.insertInto("{0}.{1}".format(hive_database, src_table))

Each time I run the job, rows are appended to the partition instead of overwriting it. Has anyone run into this problem? Thank you.
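For what it's worth, a likely culprit in Spark 2.3 specifically: PySpark's DataFrameWriter.insertInto there has the signature insertInto(tableName, overwrite=False), and that flag overrides whatever was set with .mode(), so .mode('overwrite') is silently turned back into append unless overwrite=True is passed explicitly. Note also that insertInto ignores .format() and .option() and always writes in the target table's existing format. A minimal sketch of the usual fix, reusing the question's spark_utils, hive_database, and src_table names (the hypothetical qualified_table helper is mine, added for illustration):

```python
# Sketch of the usual Spark 2.3 workaround (assumes the question's
# spark_utils / hive_database / src_table names and an existing Hive table).

def qualified_table(database, table):
    # Build the fully qualified Hive table name, e.g. "mydb.mytable".
    return "{0}.{1}".format(database, table)

# spark = spark_utils.getSparkInstance()
# This must be set before the write for per-partition overwrite:
# spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")

# In PySpark 2.3, insertInto(tableName, overwrite=False) overrides .mode(),
# so pass overwrite=True explicitly. .format("orc")/.option("compression")
# are dropped because insertInto uses the target table's existing format.
# df.write.insertInto(qualified_table(hive_database, src_table), overwrite=True)
```

From Spark 2.4 onward the default changed to overwrite=None, which defers to .mode(), so the original code behaves as expected on newer versions.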



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
