No data from the Kafka consumer in Python - consumer keeps listening but nothing comes out

I'm looking for a way to get data from my API (running on localhost) into my Docker setup using Kafka.

My producer (below) works like a charm. I know because when I print res.text, I get output.

import json
import requests
from kafka import KafkaProducer
import time

# get data
res = requests.get('http://127.0.0.1:5000/twitter')
#print(res.text)

# use kafka

producer = KafkaProducer(bootstrap_servers=['localhost:9092'])#, api_version='2.0.0')
producer.send('test', json.dumps(res.text).encode('utf-8'))
time.sleep(1)
#producer.flush()

However, my consumer doesn't work. Here is what I have tried so far.

It currently gets stuck at the for loop.

import kafka
import json
import requests
from kafka import KafkaConsumer

# use kafka
consumer = KafkaConsumer('test', bootstrap_servers=['localhost:9092'], api_version='2.0.0', group_id="test_id", value_deserializer = json.loads)
print('before for ')
consumer.subscribe('test')
for msg in consumer:
    print('IN for')
    #print(type(consumer))
    print(json.loads(msg.value.decode()))
#print(consumer)

I'm missing something somewhere, but I can't figure what.

When I manually stop it, I get the following error from Docker:

<class 'kafka.consumer.group.KafkaConsumer'>
^CTraceback (most recent call last):
  File "consumer.py", line 11, in <module>
    for m in consumer:
  File "/usr/lib/python3.7/site-packages/kafka/consumer/group.py", line 1193, in __next__
    return self.next_v2()
  File "/usr/lib/python3.7/site-packages/kafka/consumer/group.py", line 1201, in next_v2
    return next(self._iterator)
  File "/usr/lib/python3.7/site-packages/kafka/consumer/group.py", line 1116, in _message_generator_v2
    record_map = self.poll(timeout_ms=timeout_ms, update_offsets=False)
  File "/usr/lib/python3.7/site-packages/kafka/consumer/group.py", line 655, in poll
    records = self._poll_once(remaining, max_records, update_offsets=update_offsets)
  File "/usr/lib/python3.7/site-packages/kafka/consumer/group.py", line 680, in _poll_once
    self._update_fetch_positions(self._subscription.missing_fetch_positions())
  File "/usr/lib/python3.7/site-packages/kafka/consumer/group.py", line 1112, in _update_fetch_positions
    self._fetcher.update_fetch_positions(partitions)
  File "/usr/lib/python3.7/site-packages/kafka/consumer/fetcher.py", line 186, in update_fetch_positions
    self._reset_offset(tp)
  File "/usr/lib/python3.7/site-packages/kafka/consumer/fetcher.py", line 237, in _reset_offset
    offsets = self._retrieve_offsets({partition: timestamp})
  File "/usr/lib/python3.7/site-packages/kafka/consumer/fetcher.py", line 302, in _retrieve_offsets
    time.sleep(self.config['retry_backoff_ms'] / 1000.0)
KeyboardInterrupt
version: "3.7"
services:

  spark-master:
    image: bde2020/spark-master:3.0.1-hadoop3.2
    ports:
      - "8080:8080"
      - "7077:7077"
    volumes:
       - ./work:/home/jovyan/work
    environment:
       - "SPARK_LOCAL_IP=spark-master"

  spark-worker:
    image: bde2020/spark-worker:3.0.1-hadoop3.2

    depends_on:
      - spark-master
    environment:
      - SPARK_MASTER=spark://spark-master:7077
      - SPARK_WORKER_CORES=2
      - SPARK_WORKER_MEMORY=3G
      - SPARK_DRIVER_MEMORY=2G
      - SPARK_EXECUTOR_MEMORY=2G
    volumes:
       - ./work:/home/jovyan/work

  pyspark-notebook:
    image: jupyter/pyspark-notebook
    container_name: pyspark_notebook
    ports:
      - "8888:8888"
    volumes:
      - ./work:/home/jovyan/work
      - ./work/model:/tmp/model_prediction
    environment:
      - PYSPARK_PYTHON=/usr/bin/python3
      - PYSPARK_DRIVER_PYTHON=ipython3

  zookeeper:
    image: wurstmeister/zookeeper:3.4.6
    expose:
    - "2181"

  kafka:
    image: wurstmeister/kafka:2.11-2.0.0
    depends_on:
    - zookeeper
    ports:
    - "9092:9092"
    expose:
    - "9093"
    environment:
      KAFKA_ADVERTISED_LISTENERS: INSIDE://kafka:9093,OUTSIDE://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
      KAFKA_LISTENERS: INSIDE://0.0.0.0:9093,OUTSIDE://0.0.0.0:9092
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE

  mongo:
    image: mongo
    restart: always
    environment:
      MONGO_INITDB_ROOT_USERNAME: root
      MONGO_INITDB_ROOT_PASSWORD: example

  mongo-express:
    image: mongo-express
    restart: always
    ports:
      - 8081:8081
    environment:
      ME_CONFIG_MONGODB_ADMINUSERNAME: root
      ME_CONFIG_MONGODB_ADMINPASSWORD: example


Could you please help me?



Solution 1:[1]

I found what was wrong: the Docker image wasn't working. I changed it and now it works.

I made my own Dockerfile.

Solution 2:[2]

Using the same docker-compose file as in the question.

From host

Create a topic

$ docker-compose up -d
$ docker-compose exec kafka /opt/kafka/bin/kafka-topics.sh --create --topic test --zookeeper zookeeper:2181 --replication-factor 1 --partitions 1      
Created topic "test".
$ docker-compose exec kafka /opt/kafka/bin/kafka-topics.sh --list --zookeeper zookeeper:2181                            
test
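
If you prefer to create the topic from Python instead of the shell script, kafka-python also ships an admin client. A minimal sketch, assuming the broker is reachable from the host on localhost:9092 as in the compose file:

from kafka.admin import KafkaAdminClient, NewTopic

# connect to the OUTSIDE listener published on the host
admin = KafkaAdminClient(bootstrap_servers='localhost:9092')

# same settings as the kafka-topics.sh call: 1 partition, replication factor 1
admin.create_topics([NewTopic(name='test', num_partitions=1, replication_factor=1)])

# rough equivalent of --list
print(admin.list_topics())

admin.close()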

Verify API is running

$ curl -H 'Content-Type: application/json' localhost:5000/twitter
{"tweet":"foobar"}

Install kafka-python and run the producer (with the producer.flush() call uncommented)

$ pip install requests kafka-python
$ python producer.py
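
For reference, this is roughly the producer from the question with the flush() call uncommented, so the buffered message is actually sent before the script exits (topic name and endpoint are taken from the question):

import json
import requests
from kafka import KafkaProducer

# fetch the payload from the local API
res = requests.get('http://127.0.0.1:5000/twitter')

# publish to the OUTSIDE listener exposed on the host
producer = KafkaProducer(bootstrap_servers=['localhost:9092'])
producer.send('test', json.dumps(res.text).encode('utf-8'))

# flush so the message leaves the client before the process exits
producer.flush()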

Verify data landed in topic

$ docker-compose exec kafka /opt/kafka/bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
"{\"tweet\":\"foobar\"}\n"

From inside the container

Using pyspark notebook @ http://localhost:8888

Open a terminal tab

$ conda install kafka-python
(base) jovyan@3eaf696e1135:~$ python work/consumer.py
before for
IN for
{"tweet":"foobar"}

New consumer code

import kafka
import json
import requests
from kafka import KafkaConsumer

# use kafka
consumer = KafkaConsumer('test',
    bootstrap_servers=['kafka:9093'],  # needs to be the kafka INSIDE:// listener address
    api_version='2.0.0',
    group_id="test_id",
    auto_offset_reset='earliest',  # you're missing this
    value_deserializer=json.loads)
print('before for ')
for msg in consumer:
    print('IN for')
    #print(type(consumer))
    print(msg.value)
#print(consumer)
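
If you would rather not block on the iterator forever, the same consumer can also be driven with poll(), which returns after a timeout whether or not records arrived. A minimal self-contained sketch; the 1-second timeout is an assumption:

import json
from kafka import KafkaConsumer

consumer = KafkaConsumer('test',
    bootstrap_servers=['kafka:9093'],
    group_id='test_id',
    auto_offset_reset='earliest',
    value_deserializer=json.loads)

while True:
    # poll returns a dict of {TopicPartition: [records]}, empty if nothing arrived
    records = consumer.poll(timeout_ms=1000)
    for tp, messages in records.items():
        for msg in messages:
            print(msg.value)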

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: textSolver34761
Solution 2