Casting from double to decimal rounds columns in Scala Spark
I have a dataframe that looks like this when I print it using df.show():
+---+-----------+
| id|    val_dbl|
+---+-----------+
|  1|1.378332E-7|
|  2|2.234551E-7|
|  3|    4.03E-7|
+---+-----------+
with this schema (from df.printSchema()):
|-- id: integer
|-- val_dbl: double
Then I cast val_dbl to decimal:
df.withColumn("val_dec", col("val_dbl").cast(DecimalType(18,7)))
But then I get this when I print:
+---+--------+
| id| val_dec|
+---+--------+
|  1| 1.38E-7|
|  2| 2.23E-7|
|  3| 4.03E-7|
+---+--------+
|-- id: integer
|-- val_dec: decimal(18,7)
But it should look like:
+---+-----------+
| id|    val_dec|
+---+-----------+
|  1|1.378332E-7|
|  2|2.234551E-7|
|  3|4.030000E-7|
+---+-----------+
|-- id: integer
|-- val_dec: decimal(18,7)
Why isn't the full value printed? Is the cast actually rounding the underlying values, or is this just a display convenience? And how can I print the full decimal values? I can't imagine it is actually rounding.
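As background for the rounding question: a decimal(p, s) type keeps at most s digits after the decimal point (the scale), so digits beyond the scale are rounded away in the stored value, not merely hidden by show(). The sketch below uses Python's built-in decimal module purely as an analogy for fixed-scale rounding in general, not Spark's actual code path; a scale of 9 is chosen here only because it reproduces the digits displayed above.

```python
from decimal import Decimal, ROUND_HALF_UP

# Fixed-point decimal(p, s) semantics: at most s digits after the
# decimal point survive; the rest are rounded away at cast time.
# Scale 9 (quantizing to 1E-9) reproduces the displayed digits above.
for s in ["1.378332E-7", "2.234551E-7", "4.03E-7"]:
    v = Decimal(s)
    rounded = v.quantize(Decimal("1E-9"), rounding=ROUND_HALF_UP)
    print(f"{s} -> {rounded}")
# 1.378332E-7 -> 1.38E-7
# 2.234551E-7 -> 2.23E-7
# 4.03E-7     -> 4.03E-7
```

In other words, once a value is squeezed into a fixed scale, the trailing digits are gone from the stored number itself; the same loss occurs whatever front end later formats the value.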
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow