Spark: How to add values to a HashMap from an RDD?

I have the DataFrame below:

val df = phDF.groupBy("name").agg(collect_list("message").as("Messages"))

I got the following output:

+-----------+--------------------+
|name       |Messages            |
+-----------+--------------------+
|     Test1 |['A','B','C']       |
|     Test2 |['A','B','C','D']   |
|     Test3 |['A','B']           |
+-----------+--------------------+
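
In case it helps, here is roughly how a DataFrame like this can be built; the sample data and local SparkSession below are just made up to match the output shown above:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.collect_list

// Hypothetical sample data, only to reproduce the shape of the output above
val spark = SparkSession.builder().master("local[*]").appName("HashMapFromDF").getOrCreate()
import spark.implicits._

val phDF = Seq(
  ("Test1", "A"), ("Test1", "B"), ("Test1", "C"),
  ("Test2", "A"), ("Test2", "B"), ("Test2", "C"), ("Test2", "D"),
  ("Test3", "A"), ("Test3", "B")
).toDF("name", "message")

// Same aggregation as above
val df = phDF.groupBy("name").agg(collect_list("message").as("Messages"))
df.show(false)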

Now I want to put the name above (as the key) and Messages (as the value) into a HashMap.

I have used the approach below to convert it into an RDD, but I'm not getting anywhere:

var m = scala.collection.mutable.Map[String, String]()
val rdd = df.rdd.map(_.mkString("##"))
val rdd1 = rdd.map(s => s.split("##"))
val rdd2 = rdd1.map(ele => m.put(ele(0), ele(1)))
print(m)   // Output:- HashMap()

As shown above, when I try to print the HashMap, it comes out empty.

Can anyone help me figure out how to store these values in a HashMap like the one below?

Map("Test1" -> "['A','B','C']" ,"Test2" -> "['A','B','C','D']","Test3" -> "['A','B']")



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
