How to get a date from separate year, month and day columns in Spark (Scala)

I have a DataFrame containing data like:

+----+-----+---+-----+
|Year|Month|Day|...  |
+----+-----+---+-----+
|2012|    2| 20|     |
|2011|    7|  6|     |
|2015|    3| 15|     |
+----+-----+---+-----+

and I would like to add a column containing the date.



Solution 1:[1]

Less complex than Shaido's answer; in Scala (after import org.apache.spark.sql.functions._):

df.withColumn("date", to_date(concat_ws("-", col("Year"), col("Month"), col("Day")))).show()

Works on Spark 2.4.
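As a sanity check on this approach, plain Scala with java.time shows why the concatenated string parses even with unpadded month and day values (the y-M-d formatter pattern here is an assumption used to mimic Spark's to_date outside of Spark):

```scala
import java.time.LocalDate
import java.time.format.DateTimeFormatter

object ConcatDateSketch extends App {
  // Mimic concat_ws("-", Year, Month, Day) followed by to_date:
  // join the parts with "-" and parse with an unpadded y-M-d pattern.
  val rows = Seq((2012, 2, 20), (2011, 7, 6), (2015, 3, 15))
  val fmt  = DateTimeFormatter.ofPattern("y-M-d")

  val dates = rows.map { case (y, m, d) =>
    val s = s"$y-$m-$d"     // e.g. "2012-2-20"
    LocalDate.parse(s, fmt) // single-digit month/day still parse
  }

  dates.foreach(println) // prints 2012-02-20, 2011-07-06, 2015-03-15
}
```

This is also why no zero-padding of Month or Day is needed before concatenating.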

Solution 2:[2]

For Spark 3+, you can use the make_date function (here via expr, which needs import org.apache.spark.sql.functions.expr):

df.withColumn("date", expr("make_date(Year, Month, Day)"))
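Per row, make_date(Year, Month, Day) just assembles a date from the three integer components; in plain Scala the equivalent computation is java.time.LocalDate.of (a sketch of the semantics, not Spark itself — note that make_date yields null for invalid inputs, whereas LocalDate.of throws):

```scala
import java.time.LocalDate

object MakeDateSketch extends App {
  // Per-row equivalent of make_date(Year, Month, Day):
  // build a LocalDate directly from the integer components.
  val rows  = Seq((2012, 2, 20), (2011, 7, 6), (2015, 3, 15))
  val dates = rows.map { case (y, m, d) => LocalDate.of(y, m, d) }

  dates.foreach(println) // prints 2012-02-20, 2011-07-06, 2015-03-15
}
```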

Solution 3:[3]

You can use the concat_ws function to build the date as a string and then cast that string column to date.

import org.apache.spark.sql.functions._

// Source data
val df = Seq((2012, 2, 20), (2011, 7, 6), (2015, 3, 15)).toDF("Year", "Month", "Day")

// Use concat_ws to build a "Year-Month-Day" string column, then cast it to date
val df1 = df
  .withColumn("Date", concat_ws("-", $"Year", $"Month", $"Day"))
  .withColumn("Date", $"Date".cast("date"))

df1.show() // or display(df1) on Databricks

The output looks like this:

+----+-----+---+----------+
|Year|Month|Day|      Date|
+----+-----+---+----------+
|2012|    2| 20|2012-02-20|
|2011|    7|  6|2011-07-06|
|2015|    3| 15|2015-03-15|
+----+-----+---+----------+

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Mithril
Solution 2: blackbishop
Solution 3: Nikunj Kakadiya