List out Hadoop YARN jobs that are using the highest resources

I want to know how to list the jobs that are using the most memory and CPU. Is there a command to list jobs by memory usage? For example, I want to know how many vCores and how much memory a particular job is using.



Solution 1:[1]

You should be able to gather this data partially from the YARN UI, but I think you'd be better off installing Prometheus node/process exporters, or similar agents, directly on each machine; these can gather information about Linux process usage themselves.
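For a quick check without installing extra agents, the built-in `yarn top` command gives a top-like view of running applications with their allocated memory and vcores, and the ResourceManager exposes the same data through its Cluster Applications REST API. Below is a minimal sketch that queries that API and sorts running applications by allocated memory. The ResourceManager address (the default port 8088 is assumed here) and an unsecured, non-Kerberos cluster are assumptions; adjust both for your environment.

```python
# Minimal sketch: list running YARN applications sorted by allocated memory,
# using the ResourceManager's Cluster Applications REST API.
# Assumes the RM is reachable at http://localhost:8088 without authentication.
import requests

RM_URL = "http://localhost:8088"  # assumed ResourceManager address

# One record per application; running apps report allocatedMB and allocatedVCores.
resp = requests.get(f"{RM_URL}/ws/v1/cluster/apps", params={"states": "RUNNING"})
resp.raise_for_status()

# The API returns {"apps": null} when nothing is running, so guard against that.
apps = (resp.json().get("apps") or {}).get("app", [])

# Sort by allocated memory, highest first.
apps.sort(key=lambda a: a.get("allocatedMB", 0), reverse=True)

for app in apps:
    print(f'{app["id"]}  {app["name"][:40]:40}  '
          f'{app.get("allocatedMB", 0):>8} MB  '
          f'{app.get("allocatedVCores", 0):>4} vcores')
```

Note that allocatedMB/allocatedVCores reflect what YARN has reserved for the containers, not the actual OS-level usage inside them, which is exactly why per-host exporters are the better option when you need true process-level metrics.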

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1: OneCricketeer