How do I send data and metrics from Confluent to Elasticsearch?

The requirement is to send metrics and data from Confluent Cloud to Elastic Cloud. Should I use Logstash, or is there another way to implement this? I have read about the Elasticsearch sink connector, but can someone give me an overview of what needs to be done?



Solution 1:[1]

I used the Metricbeat HTTP module to collect metrics from the Confluent API, since the API returns JSON. The HTTP module's json metricset can send a POST request to the Confluent endpoint and ingest the response.

Configure the HTTP module as shown below, and make sure the body is stringified JSON.

    - module: http
      period: 10s
      metricsets:
        - json
      hosts: ["${HOST_NAME}"]
      headers:
        Content-Type: "application/json"
      path: "${PATH_NAME}"
      # namespace under which the JSON response is stored in the event
      namespace: "confluent"
      # request body must be stringified JSON
      body: "{ }"
      # Confluent resource identifiers for the metrics request
      resource.kafka.id: "${KAFKA_ID}"
      resource.schema_registry.id: "${REG_ID}"
      method: "POST"
      response.enabled: true
      request.enabled: true
      # HTTP basic auth credentials for the Confluent API
      username: "${USER_NAME}"
      password: "${PASSWORD}"
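
The module configuration above only collects the metrics; Metricbeat also needs an Elasticsearch output so the events land in Elastic Cloud. Below is a minimal sketch of that output section, assuming an Elastic Cloud deployment; the ${ELASTIC_CLOUD_ID} and ${ELASTIC_CLOUD_AUTH} placeholders are not part of the original answer, so substitute your own values.

    # metricbeat.yml – ship the collected Confluent metrics to Elastic Cloud
    cloud.id: "${ELASTIC_CLOUD_ID}"      # Cloud ID shown in the Elastic Cloud console
    cloud.auth: "${ELASTIC_CLOUD_AUTH}"  # "user:password" for the deployment

    # Alternatively, for any reachable Elasticsearch cluster:
    # output.elasticsearch:
    #   hosts: ["https://my-es-host:9200"]
    #   username: "${ES_USERNAME}"
    #   password: "${ES_PASSWORD}"

With an output configured, the events should show up in Elasticsearch under the namespace set above (confluent).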

Solution 2:[2]

Confluent Cloud provides managed connectors, including an Elasticsearch sink connector for Elastic Cloud.

You can see an example of using it in this blog; it's essentially a point-and-click GUI in which you specify which Kafka topic to read data from and which Elasticsearch instance to stream it to.
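
The same setup can also be captured as a connector configuration and submitted through the Confluent CLI or Connect API instead of the GUI, which helps with repeatable deployments (the exact command depends on your CLI version). A rough sketch of such a config with placeholder values is shown below; the property names roughly follow the managed Elasticsearch Service Sink connector documentation and may differ between connector versions.

    {
      "connector.class": "ElasticsearchSink",
      "name": "ElasticsearchSinkConnector_0",
      "kafka.auth.mode": "KAFKA_API_KEY",
      "kafka.api.key": "<kafka-api-key>",
      "kafka.api.secret": "<kafka-api-secret>",
      "topics": "my-topic",
      "input.data.format": "JSON",
      "connection.url": "https://my-deployment.es.us-east-1.aws.found.io:9243",
      "connection.username": "<elastic-username>",
      "connection.password": "<elastic-password>",
      "key.ignore": "true",
      "schema.ignore": "true",
      "tasks.max": "1"
    }

By default the sink connector writes each topic to an Elasticsearch index of the same name, so the streamed records should appear under an index matching the topic.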


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: sidharth vijayakumar
Solution 2: Robin Moffatt