How to read a text file in a Logstash pipeline
I have Logstash, Elasticsearch and Kibana up and running. I am passing an info.txt
file as input to my Logstash pipeline. Logstash is reading the file, but the log events are not printed to stdout and no index is created in Elasticsearch.
Below is my log file:
info.txt
[92m[MyApp][39m [33mInfo[39m 16/3/2022, 4:54:25 pm [92mHello[39m - {}
[92m[MyApp][39m [33mInfo[39m 16/3/2022, 4:54:27 pm [92mHello[39m - {}
[92m[MyApp][39m [33mInfo[39m 16/3/2022, 5:04:31 pm [92mHello[39m - {}
Below is my logstash pipeline:
logstash.conf
input {
  file {
    path => "D:/nest/es-logging-example/log/info/*.txt"
    start_position => "beginning"
    sincedb_path => "NULL"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "myapplogs"
  }
  stdout {}
}
Below is the Logstash output when running logstash -f logstash.conf:
[2022-03-16T16:41:11,277][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with `ecs_compatibility => v8`, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
[2022-03-16T16:41:11,277][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2022-03-16T16:41:11,310][INFO ][logstash.filters.csv ][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2022-03-16T16:41:11,395][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2022-03-16T16:41:11,505][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["D:/logstash-8.1.0/logstash.conf"], :thread=>"#<Thread:0x224314a3 run>"}
[2022-03-16T16:41:13,051][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>1.53}
[2022-03-16T16:41:13,153][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-03-16T16:41:13,244][INFO ][filewatch.observingtail ][main] [c87aefd48e4743b9f32dad848c60392fc5da55d4cdae1e2d620707d1802f9cdf] START, creating Discoverer, Watch with file and sincedb collections
[2022-03-16T16:41:13,285][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
What am I doing wrong?
Solution 1:
You need to set sincedb_path => "/dev/null" (or "NUL" on Windows, which applies here given the D:/ paths), not "NULL", in the file input if you want to reread the file. With "NULL", Logstash writes its read position to an ordinary file named NULL, so once the file has been read, subsequent runs treat it as already processed and emit no events. Pointing sincedb_path at the null device disables that persistence.
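For reference, a minimal sketch of the corrected pipeline under that assumption, with the path, hosts, and index name carried over from the question's config:

input {
  file {
    path => "D:/nest/es-logging-example/log/info/*.txt"
    start_position => "beginning"
    # "NUL" is the Windows equivalent of /dev/null: it disables sincedb
    # persistence, so the file is re-read from the beginning on every run.
    sincedb_path => "NUL"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "myapplogs"
  }
  stdout {}
}

Note that with sincedb disabled, every restart of Logstash re-ingests the whole file; that is convenient while testing, but it will index duplicate documents on each run.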
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
Solution | Source
---|---
Solution 1 | Vakhtang