Spark history-server stderr and stdout logs location when working on S3

I deployed a Spark history server that is supposed to serve multiple environments: all Spark clusters write to one bucket, and the history server reads from that bucket.
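For context, the setup described above is typically wired up with configuration along these lines (a minimal sketch; the bucket name and path are placeholders, not from the original post):

```properties
# On each Spark cluster (writer side): enable event logging to the shared bucket
spark.eventLog.enabled           true
spark.eventLog.dir               s3a://my-spark-history-bucket/event-logs

# On the history server (reader side): read event logs from the same location
spark.history.fs.logDirectory    s3a://my-spark-history-bucket/event-logs
```

Note that these properties cover only the application *event logs*; the executor stdout/stderr files discussed below are handled separately by each worker.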

I have everything set up and working, but when I try to access the stdout/stderr of a certain task, the UI links to the private IP of the worker the task ran on (e.g. http://10.192.21.80:8081/logPage/?appId=app-20220510103043-0001&executorId=1&logType=stderr).

I want to access those logs from the UI, but of course there is no access to those internal IPs (private subnets, private IPs). Isn't there a way to also upload the stderr/stdout logs to the bucket and then access them from the history-server UI?

I couldn't find anything about this in the documentation.




Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
