How to delete some data from an External Table in Databricks

I am trying to delete some data from Azure SQL from Databricks using JDBC, but it generates an error each time. The query itself is very simple: delete from table1 where date > '2022-05-01'.

I searched many documents online but did not find an appropriate solution. My code is below.

jdbcUsername = "userName"
jdbcPassword = "password"  # both retrieved from Azure Key Vault

jdbcHostname = "host server name"
jdbcPort = "1433"
jdbcDatabase = "db_test"
jdbcUrl = "jdbc:sqlserver://{0}:{1};database={2}".format(jdbcHostname, jdbcPort, jdbcDatabase)
connectionProperties = {
  "user" : jdbcUsername,
  "password" : jdbcPassword,
  "driver" : "com.microsoft.sqlserver.jdbc.SQLServerDriver"
} 


pushdown_delete_query = f"(delete from table1 where date>'2022-05-01') table_alias"
print(pushdown_delete_query)

spark.read.jdbc(url=jdbcUrl, table=pushdown_delete_query, properties=connectionProperties)

The query returns the error:

com.microsoft.sqlserver.jdbc.SQLServerException: A nested INSERT, UPDATE, DELETE, or MERGE statement must have an OUTPUT clause
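The error happens because spark.read.jdbc wraps its table argument in a SELECT subquery (select * from (…) table_alias), so SQL Server sees a nested DELETE and rejects it. A common workaround on Databricks is to bypass the DataFrame reader and execute the DELETE directly over a plain JDBC connection obtained through Spark's JVM gateway. The sketch below assumes the jdbcUrl and credentials defined above; the helper names (build_delete_sql, delete_rows) are illustrative, not a Spark API.

```python
def build_delete_sql(table, date_col, cutoff):
    # Build the DELETE statement to run server-side.
    # Names here are examples; adapt to your schema.
    return f"delete from {table} where {date_col} > '{cutoff}'"


def delete_rows(spark, jdbc_url, user, password, sql):
    # Access java.sql.DriverManager via Spark's py4j JVM gateway and
    # run the DML statement directly, outside the DataFrame reader.
    driver_manager = spark._sc._gateway.jvm.java.sql.DriverManager
    con = driver_manager.getConnection(jdbc_url, user, password)
    try:
        stmt = con.createStatement()
        try:
            return stmt.executeUpdate(sql)  # number of rows affected
        finally:
            stmt.close()
    finally:
        con.close()


# Example usage (assumes jdbcUrl, jdbcUsername, jdbcPassword from above):
# deleted = delete_rows(spark, jdbcUrl, jdbcUsername, jdbcPassword,
#                       build_delete_sql("table1", "date", "2022-05-01"))
```

Note this is a sketch under the assumption that the cluster can reach the Azure SQL server and that the SQL Server JDBC driver is on the classpath (it ships with Databricks runtimes).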



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
