Pass JSON file into JAR or read from Spark session

I have a Spark UDF written in Scala. I'd like to use my function together with some additional files.

import scala.io.Source
import org.json4s.jackson.JsonMethods.parse
import org.json4s.DefaultFormats

object consts {
    // json4s needs implicit Formats in scope for extract
    implicit val formats = DefaultFormats

    // Parse map.json from the resources directory into a nested map
    val my_map = parse(Source.fromFile("src/main/resources/map.json").mkString)
      .extract[Map[String, Map[String, List[String]]]]
}

Now I want to use the my_map object inside a UDF, so I basically do this:

import package.consts.my_map

object myUDFs {
    // ... use my_map here ...
}
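
A minimal sketch of what such a UDF could look like (the package name com.example and the lookup logic are placeholders; only the shape matters):

    import com.example.consts.my_map  // hypothetical package name
    import org.apache.spark.sql.expressions.UserDefinedFunction
    import org.apache.spark.sql.functions.udf

    object myUDFs {
      // Return the list stored under (key, subKey) in my_map,
      // or an empty list when either level is missing
      val lookupUdf: UserDefinedFunction = udf { (key: String, subKey: String) =>
        my_map.get(key).flatMap(_.get(subKey)).getOrElse(List.empty[String])
      }
    }

It could then be applied with something like df.withColumn("values", myUDFs.lookupUdf(col("key"), col("subKey"))).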

I've already tested my function locally, and it works well. Now I want to understand how to package the JAR so that the .json file stays inside it.

Thank you.



Solution 1:[1]

If you manage your project with Maven, you can place your .json file(s) under src/main/resources, as that is the default place where Maven looks for your project's resources; anything there is copied into the root of the JAR when the project is packaged.

You can also define a custom path for your resources, as described here: https://maven.apache.org/plugins/maven-resources-plugin/examples/resource-directory.html
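
For instance, a minimal pom.xml fragment pointing Maven at a custom resource directory might look like this (the directory name my-resources is purely illustrative):

    <build>
      <resources>
        <resource>
          <!-- my-resources is a hypothetical custom directory;
               src/main/resources is the default and needs no configuration -->
          <directory>my-resources</directory>
        </resource>
      </resources>
    </build>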

Solution 2:[2]

UPD: I managed to do this by building a fat JAR and reading my resource file this way:

    parse(
      Source
        .fromInputStream(
          getClass.getClassLoader.getResourceAsStream("map.json")
        )
        .mkString
    ).extract[Map[String, Map[String, List[String]]]]
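
Putting both answers together, a fuller sketch of the consts object reading from the classpath could look like this (the Option null guard and the explicit close are defensive additions, not part of the original answer):

    import scala.io.Source
    import org.json4s.jackson.JsonMethods.parse
    import org.json4s.DefaultFormats

    object consts {
      implicit val formats = DefaultFormats

      // getResourceAsStream resolves map.json on the classpath, so the same code
      // works locally (src/main/resources) and inside the packaged fat JAR
      val my_map: Map[String, Map[String, List[String]]] = {
        val stream = Option(getClass.getClassLoader.getResourceAsStream("map.json"))
          .getOrElse(sys.error("map.json not found on the classpath"))
        val source = Source.fromInputStream(stream)
        try parse(source.mkString).extract[Map[String, Map[String, List[String]]]]
        finally source.close()
      }
    }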

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

[1] Solution 1: Gabio
[2] Solution 2: Huvi