While writing data to a Scylla table using PySpark, sometimes it writes correctly and sometimes it does not write properly
I am using a three-node Scylla cluster. I read a table into a DataFrame, make some changes to that DataFrame, and write the new DataFrame back to the same table. The job reports that the write completed, but the table data is not actually updated. I am using PySpark for this.
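For reference, a minimal sketch of the read-modify-write round trip described above, assuming the DataStax Spark Cassandra connector (which Scylla also supports) is on the classpath. The contact points, keyspace name, table name, and the column being changed are placeholders, not details from the original post.

```python
# Sketch of the described workflow: read a Scylla table into a DataFrame,
# change it, and write the result back to the same table.
# Assumptions: spark-cassandra-connector available; placeholder names below.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("scylla-read-modify-write")
    # Hypothetical contact points for the three-node Scylla cluster.
    .config("spark.cassandra.connection.host", "node1,node2,node3")
    .getOrCreate()
)

# Read the existing table into a DataFrame.
df = (
    spark.read
    .format("org.apache.spark.sql.cassandra")
    .options(keyspace="my_keyspace", table="my_table")
    .load()
)

# Apply some changes (placeholder transformation on a placeholder column).
updated_df = df.withColumn("status", F.lit("processed"))

# Write the modified DataFrame back to the same table.
(
    updated_df.write
    .format("org.apache.spark.sql.cassandra")
    .options(keyspace="my_keyspace", table="my_table")
    .mode("append")
    .save()
)
```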
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0 per its attribution requirements.
