How to add limit and from for the elasticsearch input plugin in a Logstash conf
I have a huge Elasticsearch index of nearly 10 million documents, and I want to copy about 5 million records at once using Logstash.
My conf file:
input {
  elasticsearch {
    hosts => ["project-prod-api.elastic.aws.org:443/"]
    user => "jarvis_client"
    password => "7CaVWWWUsjq0wiPmNZT3H"
    index => "lion-jarvis-prd-new-4"
    ca_file => "C:\Testing\KPNRootCertificate.crt"
    ssl => true
    query => '{"query": {"bool": {"must": [],"filter": [ {"match_all": {} }]}},"size":6000000,"from":0}'
    scroll => "5m"
    docinfo => true
  }
}
output {
  csv {
    fields => ["CARRIER_VENDOR_ID","CONTACT_NAME","CONTACT_PHONE","CUSTOMER","DSIP_ID","DSIP_PIN","HUISNUMMER","ISIP_ID","ISIP_PIN","ISRA","ORDERID","ORDERTYPE","ORDER_COMPLETED","ORDER_ENTRY","ORDER_EXECUTION","POSTCODE","ROOM","SERVICE_GROUP","SERVICE_ID","TECHNOLOGY_TYPE","TOEVOEGING","tags"]
    path => "C:\Testing\sample.csv"
  }
}
Is there any other approach to copy a limited number of documents from an Elasticsearch index?
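One thing worth noting: the elasticsearch input plugin paginates with the scroll API, so it walks through every document matching the query; the "size" and "from" keys inside the query body do not act as a total cap. The plugin's own size option only sets the per-scroll page size. One hedged way to bound the export is to narrow the query itself, for example with a range filter. A minimal sketch, assuming ORDER_ENTRY is a date field (that field name and the date values are assumptions for illustration):

input {
  elasticsearch {
    hosts => ["project-prod-api.elastic.aws.org:443/"]
    index => "lion-jarvis-prd-new-4"
    # per-scroll page size, NOT a limit on total documents returned
    size => 5000
    scroll => "5m"
    # bound the result set in the query itself; ORDER_ENTRY and the
    # date bounds are hypothetical, substitute a real field/range
    query => '{"query": {"range": {"ORDER_ENTRY": {"gte": "2023-01-01", "lt": "2023-07-01"}}}}'
    docinfo => true
  }
}

If no suitable field exists to filter on, another option is exporting in slices (e.g. the plugin's slices option for parallel scrolls) and stopping the pipeline once enough rows have been written, though that is less precise than a query-side filter.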
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
