How to take MAX from column values in Spark SQL

I have a use case where I need to take the row-wise maximum across several columns of a table in Spark SQL.

Below is a sample table -

(sample table screenshot not reproduced here; it has columns a, b, and c)

I want to take the max of the values in columns a, b, and c without using a UNION clause.

Below is the SQL query I executed:

SELECT (
  SELECT MAX(myval)
  FROM (VALUES (a), (b), (c)) AS temp(myval)
) AS MaxOfColumns
FROM table

But this throws an error: "cannot evaluate expression outer() in inline table definition; line 3 pos 16"

Could you please help me with this?



Solution 1:[1]

Use array_max on an array built from the columns:

with t(id, a, b, c) as (
  select stack(2, 100,1,2,3, 200,5,6,4)
)
select *, array_max(array(a, b, c)) as MaxOfColumns
from t
id   a  b  c  MaxOfColumns
100  1  2  3  3
200  5  6  4  6
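The per-row logic that array_max(array(a, b, c)) performs can be sketched in plain Python (a minimal illustration only; the sample data mirrors the table above, not a Spark API call):

```python
# Sample rows mirroring table t above: each dict is one row.
rows = [
    {"id": 100, "a": 1, "b": 2, "c": 3},
    {"id": 200, "a": 5, "b": 6, "c": 4},
]

# array_max(array(a, b, c)) takes the largest of the three column
# values within each row, independently of the other rows.
max_of_columns = [max(r["a"], r["b"], r["c"]) for r in rows]
print(max_of_columns)  # [3, 6]
```

Spark SQL's built-in greatest(a, b, c) produces the same result without constructing an intermediate array; array_max is convenient when the values are already in an array column.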

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
