Requests to Django views that start Celery tasks time out

I'm deploying a Django app with Docker. Here is my docker-compose.yml:

version: '3.1'

services:
  b2_nginx:
    build: ./nginx
    container_name: b2_nginx
    ports:
      - 1904:80
    volumes:
      - ./app/cv_baza/static:/static:ro
    restart: always

  b2_app:
    build: ./app
    container_name: b2_app
    volumes:
      - ./app/cv_baza:/app
    restart: always

  b2_db:
    container_name: b2_db
    image: mysql
    command: --default-authentication-plugin=mysql_native_password
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: -
      MYSQL_DATABASE: cvbaza2
    volumes:
      - ./db:/var/lib/mysql
      - ./init:/docker-entrypoint-initdb.d
  rabbitmq:
    container_name: b2_rabbit
    hostname: rabbitmq
    image: rabbitmq:latest
    ports:
      - "5672:5672"
    restart: on-failure
  celery_worker:
    build: ./app
    command: sh -c "celery -A cv_baza worker -l info"
    container_name: celery_worker
    volumes:
      - ./app/cv_baza:/app
    depends_on:
      - b2_app
      - b2_db
      - rabbitmq
    hostname: celery_worker
    restart: on-failure

  celery_beat:
    build: ./app
    command: sh -c "celery -A cv_baza beat -l info"
    container_name: celery_beat
    volumes:
      - ./app/cv_baza:/app
    depends_on:
      - b2_app
      - b2_db
      - rabbitmq
    hostname: celery_beat
    image: cvbaza_v2_b2_app
    restart: on-failure
  memcached:
    container_name: b2_memcached
    ports: 
      - "11211:11211"
    image: memcached:latest

networks:
  default: 

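For context, the Celery wiring this compose file implies would look roughly like the following (a sketch with assumed defaults, not the project's actual files). The broker URL has to use the compose service name rabbitmq as the hostname, since localhost inside the b2_app container does not reach the RabbitMQ container, and guest/guest is RabbitMQ's default account.

# cv_baza/celery.py -- standard Celery/Django bootstrap (assumed layout)
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "cv_baza.settings")

app = Celery("cv_baza")
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks()

# cv_baza/settings.py -- the broker must be addressed by service name
CELERY_BROKER_URL = "amqp://guest:guest@rabbitmq:5672//"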
In this configuration, hitting any route that is supposed to start a task just hangs the request until it eventually times out. Here is an example of such a route:

from django.core.files import File
from django.core.files.storage import default_storage
from rest_framework import views
from rest_framework.parsers import FormParser, MultiPartParser
from rest_framework.response import Response
from .tasks import parse_csv  # assuming the task lives in tasks.py

class ParseCSV(views.APIView):
    parser_classes = [MultiPartParser, FormParser]

    def post(self, request, format=None):
        path = default_storage.save("./internal/documents/csv/resumes.csv", File(request.data["csv"]))
        parse_csv.delay(path)
        return Response("Task has started")
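For what it's worth, .delay(path) is shorthand for apply_async(args=[path]), and by default the publish step quietly retries the broker connection, which matches the silent blocking described here. A variant that fails fast instead looks like this (a sketch; retry and retry_policy are standard apply_async options):

# Hypothetical fail-fast dispatch: give up quickly if the broker is down,
# so the request errors out instead of hanging.
parse_csv.apply_async(
    args=[path],
    retry=True,
    retry_policy={
        "max_retries": 2,      # stop after two reconnect attempts
        "interval_start": 0,   # first retry immediately
        "interval_step": 0.5,  # back off 0.5s per attempt
        "interval_max": 1.0,   # cap the wait between attempts
    },
)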

The task at hand:

import csv
from celery import shared_task
from .models import Resumes  # assuming the model lives in models.py

@shared_task
def parse_csv(file_path):
    with open(file_path) as resume_file:
        file_read = csv.reader(resume_file, delimiter=",")
        for row in file_read:
            new_resume = Resumes(first_name=row[0], last_name=row[1], email=row[2],
                                 tag=row[3], university=row[4], course=row[5], year=row[6], cv_link=row[7])
            new_resume.save()
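As an aside, saving row by row issues one INSERT per row; for a large CSV, a bulk_create variant is noticeably faster (a sketch against the same assumed Resumes model):

import csv

from celery import shared_task

from .models import Resumes  # assumed location of the model


@shared_task
def parse_csv_bulk(file_path):
    # Sketch: collect rows in memory, then insert them in batches.
    with open(file_path, newline="") as resume_file:
        rows = csv.reader(resume_file, delimiter=",")
        resumes = [
            Resumes(first_name=r[0], last_name=r[1], email=r[2], tag=r[3],
                    university=r[4], course=r[5], year=r[6], cv_link=r[7])
            for r in rows
        ]
    Resumes.objects.bulk_create(resumes, batch_size=500)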

None of the Docker containers produces an error. Nothing crashes; the request just times out and fails silently. Does anyone have a clue where the issue might lie?
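In case it is useful, broker reachability can be checked from inside the b2_app container without going through a view at all (a sketch using kombu, which ships with Celery; the URL and credentials are assumptions):

# Run inside the b2_app container, e.g. via `python manage.py shell`.
from kombu import Connection

# connect_timeout turns an unreachable broker into a prompt exception
# instead of an indefinite hang.
with Connection("amqp://guest:guest@rabbitmq:5672//", connect_timeout=5) as conn:
    conn.connect()  # raises if RabbitMQ cannot be reached
    print("broker reachable")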



Solution 1:[1]

Are you checking the result of the Celery task?

result = parse_csv.delay(path)
task_id = result.id

Then somewhere else (perhaps another view):

from celery.result import AsyncResult

task = AsyncResult(task_id)
if task.ready():
    status_message = task.get()
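To make that concrete, a status endpoint built on AsyncResult might look like the following (a sketch; the class name and URL wiring are illustrative, not from the original project):

from celery.result import AsyncResult
from rest_framework import views
from rest_framework.response import Response


class TaskStatus(views.APIView):
    # Hypothetical endpoint: GET /tasks/<task_id>/ returns the task state.

    def get(self, request, task_id, format=None):
        task = AsyncResult(task_id)
        payload = {"task_id": task_id, "state": task.state}
        if task.ready():
            # .get() would re-raise a task exception; propagate=False
            # returns the exception object instead.
            payload["result"] = task.get(propagate=False)
        return Response(payload)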

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

[1] Solution 1 by Greg Cowell