PySpark insert failed using spark.read method

from pyspark.sql import SparkSession

def QueryDB(sqlQuery):
  jdbcUrl = mssparkutils.credentials.getSecret("param1","DBJDBCConntring","param3")
  spark = SparkSession.builder.appName("show results").getOrCreate()

  jdbcdf = (spark.read.format("jdbc")
                .option("url", jdbcUrl)
                .option("query", sqlQuery)
                .load()
           )

  return jdbcdf

df = QueryDB("INSERT INTO schema.table1 (column1, column2) output inserted.column1 values('one', 'two')")
df.show()

The notebook runs without any error, but no rows are inserted. Any suggestions or sample code for inserting into the table?



Solution 1:[1]

spark.read.format("jdbc") is for reading over JDBC; passing it an INSERT statement will not write any rows. If you want to write data over JDBC, you'd want something like this:

jdbcDF.write \
    .format("jdbc") \
    .option("url", "jdbc:postgresql:dbserver") \
    .option("dbtable", "schema.tablename") \
    .option("user", "username") \
    .option("password", "password") \
    .mode("append") \
    .save()

Note the .mode("append"): without it, the default save mode raises an error when the target table already exists, whereas append inserts the DataFrame's rows into the existing table.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 pltc