Spark: Find country by geographical coordinates
I have a Spark DataFrame created in Scala that contains geographical coordinates. I need to add a column with the country based on these coordinates. I found some Python tools, but as far as I know I can't use them from Scala code. I'm also unsure about efficiency if a UDF has to process every row one by one (it's about 50,000 rows). What is the fastest way to do this?
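One common approach is to broadcast the country boundary polygons to every executor and look each coordinate up with a UDF, so the lookup data is shipped once rather than per row. The sketch below assumes the polygons have already been parsed on the driver (e.g. from a Natural Earth GeoJSON file); `loadCountryPolygons` is a hypothetical placeholder, and the simple ray-casting test ignores polygon holes and multi-polygons, so a real geometry library (e.g. JTS) would be more robust.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

object ReverseGeocode {
  // Ray-casting point-in-polygon test; polygon is a sequence of (lon, lat) vertices.
  def contains(polygon: Seq[(Double, Double)], lon: Double, lat: Double): Boolean = {
    var inside = false
    var j = polygon.length - 1
    for (i <- polygon.indices) {
      val (xi, yi) = polygon(i)
      val (xj, yj) = polygon(j)
      if ((yi > lat) != (yj > lat) &&
          lon < (xj - xi) * (lat - yi) / (yj - yi) + xi) {
        inside = !inside
      }
      j = i
    }
    inside
  }

  // Hypothetical loader: Map(countryName -> polygon), built on the driver.
  def loadCountryPolygons(): Map[String, Seq[(Double, Double)]] = ???

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("reverse-geocode").getOrCreate()
    import spark.implicits._

    // Broadcast once so every executor reuses the same lookup table.
    val bc = spark.sparkContext.broadcast(loadCountryPolygons())

    val findCountry = udf { (lon: Double, lat: Double) =>
      bc.value
        .collectFirst { case (name, poly) if contains(poly, lon, lat) => name }
        .getOrElse("unknown")
    }

    val df = Seq((21.0122, 52.2297), (2.3522, 48.8566)).toDF("lon", "lat")
    df.withColumn("country", findCountry($"lon", $"lat")).show()
  }
}
```

At ~50,000 rows the per-row UDF cost is usually negligible compared to setup; the main cost driver is the number of country polygons scanned per point, which a spatial index could reduce if needed.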
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow