"Spark 2.4.3 S3 MultiObjectDeleteException" in sparklyr
I am really struggling with an error that I get whenever I try the following. I have tried all the suggestions from the older answers as well, but in vain.
spark_write_csv(final_data_3, path = "mypath", header = TRUE, mode = "overwrite", infer_schema = FALSE)
This is the error I get:
Spark 2.4.0 S3 MultiObjectDeleteException
I was reading through old answers to a similar problem, and they suggested adding the configuration below to the settings to avoid this error.
I added this, and I also tried "true" in place of "false" in the script, but I still get the same error.
hconf %>% sparklyr::invoke("set", "fs.s3a.multiobjectdelete.enable", "false")
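For context, a minimal sketch of how a Hadoop configuration option like this is typically applied from sparklyr, assuming `sc` is an active Spark connection and `hconf` is the object obtained as below (the `master` value and the alternative connect-time approach via the `spark.hadoop.` prefix are illustrative assumptions, not part of my original script):

```r
library(sparklyr)

# Assumed connection; adjust master/config for your environment.
sc <- spark_connect(master = "local")

# Option A: set the property on the live Hadoop configuration
# (this is what `hconf` in the snippet above refers to).
hconf <- sc %>% spark_context() %>% invoke("hadoopConfiguration")
hconf %>% invoke("set", "fs.s3a.multiobjectdelete.enable", "false")

# Option B: set it before connecting, using the spark.hadoop.* prefix
# so Spark forwards it to the Hadoop configuration at startup.
conf <- spark_config()
conf$spark.hadoop.fs.s3a.multiobjectdelete.enable <- "false"
# sc <- spark_connect(master = "local", config = conf)
```

Option B may matter here: a property set on an already-running context is not always picked up by the S3A filesystem client, so setting it at connect time is a common workaround.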
Any suggestions to address this error?
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
