Spark SQL: reading a date with a time zone

I am getting a date and time in EDT, formatted like

"May 15 2022 23:29:08.607 EDT"

I want to convert that datetime into a unix timestamp, so I am trying

spark.sql("""select unix_timestamp('May 15 2022 23:29:08.607 EDT', 'MMM dd yyyy HH:mm:ss') as new_date from table""")

What do I need to put in the time format so that Spark knows the input date is in the EDT time zone? I tried 'MMM dd yyyy HH:mm:ss zzz', but it did not help.
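
For reference, the sample string corresponds to epoch second 1652671748 (EDT is UTC-4). A quick stdlib-only check outside Spark, using a fixed offset rather than a named zone:

```python
from datetime import datetime, timezone, timedelta

# Parse the sample string after stripping the zone abbreviation,
# then attach the EDT offset (UTC-4) explicitly.
s = "May 15 2022 23:29:08.607 EDT"
naive = datetime.strptime(s.replace(" EDT", ""), "%b %d %Y %H:%M:%S.%f")
aware = naive.replace(tzinfo=timezone(timedelta(hours=-4)))  # EDT = UTC-4
print(int(aware.timestamp()))  # 1652671748
```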



Solution 1:[1]

You can use SET TIME ZONE to change the session time zone, which controls how timestamps without an explicit zone are interpreted.

To specify the time zone as part of the query, you can use the to_utc_timestamp function.
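
Putting both ideas together, a sketch in Spark SQL, assuming Spark 3.x and a hypothetical column name `event_time`. In Spark 3's default parser, zone names like "EDT" generally fail with the 'zzz' pattern, so either fall back to the legacy parser or strip the zone suffix and shift explicitly:

```sql
-- Option A: let the legacy (Spark 2.x-style) parser handle the 'zzz' zone name
SET spark.sql.legacy.timeParserPolicy = LEGACY;
SELECT unix_timestamp(event_time, 'MMM dd yyyy HH:mm:ss.SSS zzz') AS new_date
FROM table;

-- Option B: drop the 4-character " EDT" suffix, parse as a local timestamp,
-- and shift to UTC with to_utc_timestamp before taking the unix timestamp
SELECT unix_timestamp(
         to_utc_timestamp(
           to_timestamp(substring(event_time, 1, length(event_time) - 4),
                        'MMM dd yyyy HH:mm:ss.SSS'),
           'America/New_York')) AS new_date
FROM table;
```

Option B uses the region name 'America/New_York' rather than the abbreviation, so it also handles dates that fall in EST.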

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
