Structuring application logs to be human readable and parsable by Filebeat in a Kubernetes setup
I am trying to properly configure my application logs in a Kubernetes environment with Filebeat and Kibana.
I have a PHP Symfony application where a lot of commands are executed on the CLI. On my local Docker setup I have configured my logging so that all logs are written as JSON-formatted strings to a rotating log file, and at the same time these log entries are output to the console in a human-readable format (for example, stack traces are indented and colored). The directory with the written log files is mounted into my Filebeat container, where the entries are parsed as JSON by Filebeat and sent to Elasticsearch to be displayed in Kibana. In this setup everything works as expected.
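For reference, the local setup described above corresponds roughly to a Monolog configuration like the following (the handler names, path, and level are just assumptions to illustrate the idea, not my exact config):

```yaml
# config/packages/monolog.yaml (sketch of the local Docker setup)
monolog:
    handlers:
        # JSON-formatted rotating log file, mounted into the Filebeat container
        json_file:
            type: rotating_file
            path: '%kernel.logs_dir%/%kernel.environment%.log'
            max_files: 10
            level: info
            formatter: monolog.formatter.json
        # human-readable, colored output on the CLI
        console:
            type: console
            process_psr_3_messages: false
```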
In Kubernetes, as I understand it, all CLI output should be sent to stdout/stderr so that Kubernetes can pick it up and write it to a centralised file on the node. There it is parsed by Filebeat again and sent to Elasticsearch. For me this means I only have one channel for my log messages: either I format them as JSON strings so that Filebeat can parse them easily but they are hard to read for a human, or I format them in a way that is human readable but cannot be parsed well.
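To make the single-channel case concrete, this is roughly what I mean; the handler name, paths, and Elasticsearch host below are assumptions for illustration only:

```yaml
# monolog.yaml: only one channel left, JSON to stderr
monolog:
    handlers:
        stderr_json:
            type: stream
            path: 'php://stderr'
            level: info
            formatter: monolog.formatter.json
```

```yaml
# filebeat.yml: read the container log files on the node and decode the JSON message
filebeat.inputs:
  - type: container
    paths:
      - /var/log/containers/*.log
    processors:
      - decode_json_fields:
          fields: ['message']
          target: ''
          overwrite_keys: true

output.elasticsearch:
  hosts: ['elasticsearch:9200']
```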
One could argue that a user should not connect to a Kubernetes container to manually execute commands, and thus no human-readable format is needed. However, in practice this happens on our test/staging environments in debugging situations.