Fluentd: Copy logs locally as well as push them to Loki

I am using Fluentd for log aggregation. My use case: I want to collect logs and store them in a single file as a backup, and also push them to Loki so I can view them in Grafana.

Following is the config file:

<source>
  @type forward
  port 10091
  bind 0.0.0.0
  format none
</source>

<filter *.**>
  @type record_transformer
  enable_ruby true
  remove_keys source, container_id, container_name, ts, tsNs
  <record>
    service ${tag_parts[1]}
#   message ${record["msg"] ? record["msg"] : record["message"] ? record["message"] : record["MESSAGE"]}
  </record>
</filter>

<match SERVER1.*>
  @type copy
  <store>
    @type loki
    url "http://LOKI_IP:PORT"
    flush_interval 1s
    flush_at_shutdown true
    buffer_chunk_limit 1m
    extra_labels {"agent":"SERVER1"}
    <label>
      filename
    </label>
  </store>
  <store>
    @type file
    path /fluentd/Log/SERVER1.%Y-%m-%d.%H:%M:%S.log
    <buffer time>
      timekey 1h
      timekey_use_utc true
      timekey_wait 2s
      flush_interval 1h
    </buffer>
  </store>
</match>

<match SERVER2.*>
  @type copy
  <store>
    @type loki
    url "http://LOKI_IP:PORT"
    flush_interval 1s
    flush_at_shutdown true
    buffer_chunk_limit 1m
    extra_labels {"agent":"SERVER2"}
    <label>
      filename
    </label>
  </store>
  <store>
    @type file
    path /fluentd/Log/SERVER2.%Y-%m-%d.%H:%M:%S.log
    <buffer time>
      timekey 1h
      timekey_use_utc true
      timekey_wait 2s
      flush_interval 1h
    </buffer>
  </store>
</match>

Here SERVER1 and SERVER2 are sending logs to Fluentd. I tried collecting the logs from both servers in a single file with the following config:

<source>
  @type forward
  port 10091
  bind 0.0.0.0
  format none
</source>

<filter *.**>
  @type record_transformer
  enable_ruby true
  remove_keys source, container_id, container_name, ts, tsNs
  <record>
    service ${tag_parts[1]}
#   message ${record["msg"] ? record["msg"] : record["message"] ? record["message"] : record["MESSAGE"]}
  </record>
</filter>

<match *.**>
  @type copy
  <store>
    @type file
    path /fluentd/Log/access.%Y-%m-%d.%H:%M:%S.log
    <buffer time>
      timekey 10s
      timekey_use_utc true
      timekey_wait 2s
      flush_interval 10s
    </buffer>
  </store>
</match>

<match SERVER1.*>
  @type copy
  <store>
    @type loki
    url "http://LOKI_IP:PORT"
    flush_interval 1s
    flush_at_shutdown true
    buffer_chunk_limit 1m
    extra_labels {"agent":"SERVER1"}
    <label>
      filename
    </label>
  </store>
</match>

<match SERVER2.*>
  @type copy
  <store>
    @type loki
    url "http://LOKI_IP:PORT"
    flush_interval 1s
    flush_at_shutdown true
    buffer_chunk_limit 1m
    extra_labels {"agent":"SERVER2"}
    <label>
      filename
    </label>
  </store>
</match>

But with this, I can no longer see the latest logs in Loki/Grafana. The logs are only saved locally and are not sent to Loki.

So it's either SAVE IT LOCALLY or PUSH IT TO LOKI. I want both: a single JSON file containing the logs from both servers (instead of separate files per server), while still pushing the logs to Loki. Can anyone help me out, if it's possible at all?
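One detail that may explain the behavior: Fluentd delivers each event to the first <match> block whose pattern fits its tag, so the catch-all <match *.**> consumes every event and the later <match SERVER1.*> and <match SERVER2.*> blocks are never reached. A minimal sketch of one possible way to get both outputs, assuming the built-in copy and relabel output plugins (the Loki settings are taken from the question above; the @LOKI label name is arbitrary):

<match *.**>
  @type copy
  # First store: the shared backup file for both servers
  <store>
    @type file
    path /fluentd/Log/access.%Y-%m-%d.%H:%M:%S.log
    <buffer time>
      timekey 10s
      timekey_use_utc true
      timekey_wait 2s
      flush_interval 10s
    </buffer>
  </store>
  # Second store: re-emit the same event under the @LOKI label
  <store>
    @type relabel
    @label @LOKI
  </store>
</match>

<label @LOKI>
  <match SERVER1.*>
    @type loki
    url "http://LOKI_IP:PORT"
    flush_interval 1s
    flush_at_shutdown true
    extra_labels {"agent":"SERVER1"}
    <label>
      filename
    </label>
  </match>
  <match SERVER2.*>
    @type loki
    url "http://LOKI_IP:PORT"
    flush_interval 1s
    flush_at_shutdown true
    extra_labels {"agent":"SERVER2"}
    <label>
      filename
    </label>
  </match>
</label>

With this routing, every event is written to the shared file and simultaneously re-emitted under the @LOKI label, where the per-server matches attach their own extra_labels before pushing to Loki.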



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
