My task.py file (the code uses @shared_task, so the import should be shared_task, not Task):

    from celery import shared_task
    from random import randint

    @shared_task
    def create_nb():
        a = randint(1, 100)
        b = randint(1, 100)
        return {"…
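The snippet is cut off at the return statement. A minimal runnable completion, assuming the task simply returns both numbers; the key names are hypothetical:

    from celery import shared_task
    from random import randint

    @shared_task
    def create_nb():
        a = randint(1, 100)
        b = randint(1, 100)
        # Key names "a" and "b" are assumed; the original return value is truncated.
        return {"a": a, "b": b}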
Killing all Celery processes involves grepping the output of 'ps' and running 'kill' on each PID. Grepping the 'ps' output also turns up the grep process's own info.
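The classic shell workaround is to break the self-match with a character class (grep '[c]elery' never matches its own command line). The same filtering can be done in Python; a sketch assuming a POSIX ps and workers whose command lines contain "celery":

    import os
    import signal
    import subprocess

    # List every process, keep the ones whose command line mentions "celery",
    # and skip our own PID so the script doesn't kill itself.
    out = subprocess.check_output(["ps", "-eo", "pid,args"], text=True)
    for line in out.splitlines()[1:]:
        pid_str, _, cmd = line.strip().partition(" ")
        pid = int(pid_str)
        if "celery" in cmd and pid != os.getpid():
            os.kill(pid, signal.SIGTERM)  # graceful; escalate to SIGKILL only if needed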
Working on getting Celery set up with MongoDB as the result_backend. Following the configuration guidelines set out in the official docs, my celeryconfig.py is set up accordingly.
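The config itself is cut off. A minimal celeryconfig.py for a MongoDB result backend, following the official docs; the host, database, and collection names are placeholders:

    # celeryconfig.py
    broker_url = "amqp://localhost"  # assumed broker; the question is about the backend
    result_backend = "mongodb://localhost:27017/"
    mongodb_backend_settings = {
        "database": "celery_results",       # placeholder database name
        "taskmeta_collection": "taskmeta",  # placeholder collection name
    }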
I have a Django app that uses Celery to run tasks. Sometimes I have a "hard shutdown" and a bunch of models aren't cleaned up. I created a task called clean_up.
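One common pattern is to let Celery beat run the clean-up on a schedule, so leftovers from a hard shutdown get swept up automatically. A sketch; the dotted task path and the interval are assumptions:

    from celery import Celery
    from celery.schedules import crontab

    app = Celery("myproject")  # placeholder app name
    app.conf.beat_schedule = {
        "sweep-orphaned-models": {
            "task": "myapp.tasks.clean_up",      # assumed dotted path to the task
            "schedule": crontab(minute="*/15"),  # every 15 minutes, arbitrary choice
        },
    }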
Here is the workflow I'm trying to achieve with Celery (see the sketch after this list):

* download a big file
* split it into chunks (to files)
* process each chunk independently
* notifications
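Celery's canvas primitives fit this shape: a chord runs the chunk tasks in parallel and fires a callback once every chunk has finished. A sketch; the broker URL, task names, and signatures are all placeholders:

    from celery import Celery, chord

    app = Celery("pipeline", broker="amqp://localhost")  # placeholder broker

    @app.task
    def process_chunk(path):
        ...  # process one chunk file independently

    @app.task
    def notify(results):
        ...  # notification step, runs once after every chunk has finished

    def run(chunk_paths):
        # The header group runs in parallel; the body fires when all of it is done.
        chord(process_chunk.s(p) for p in chunk_paths)(notify.s())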
I have FastAPI code that is executed using uvicorn. Now I want to add a queue system, and I think Celery and Flower could be great tools for me, since my API…
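A minimal sketch of wiring Celery into a FastAPI app; the Redis broker, module names, and the task are all assumptions, and Flower would simply be pointed at the same Celery app:

    from celery import Celery
    from fastapi import FastAPI

    celery_app = Celery("worker", broker="redis://localhost:6379/0")  # placeholder
    api = FastAPI()

    @celery_app.task
    def long_job(x):
        return x * 2

    @api.post("/jobs")
    def enqueue(x: int):
        result = long_job.delay(x)
        # Return the task id so the client can poll for the result later.
        return {"task_id": result.id}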
I'm trying to run a simple periodic task every 10 seconds using Flask and Celery, with the following code in my controllers.py:

    @celery.task()
    def print_hello(wo…
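The snippet above is cut off mid-signature. A runnable sketch of the 10-second schedule with Celery beat; the parameter name word, the args, and the dotted module path are assumptions:

    @celery.task()
    def print_hello(word):  # parameter name assumed; the original is cut at "wo"
        print(f"hello {word}")

    celery.conf.beat_schedule = {
        "print-hello-every-10s": {
            "task": "controllers.print_hello",  # assumed dotted path
            "schedule": 10.0,                   # seconds
            "args": ("world",),
        },
    }

Note that beat_schedule only fires if a beat process is running alongside the worker (celery -A app beat), a frequent gotcha in Flask setups.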
Directory structure: …

Here is my cw_manage_integration/psa_integration/api_service/sync_config/__init__.py:

    from celery import Celery
    from kombu import Queue
    from …
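The imports are cut off above. Since Celery plus kombu.Queue usually means explicitly declared queues, here is a sketch of that setup; the broker, queue, and exchange names are placeholders:

    from celery import Celery
    from kombu import Exchange, Queue

    app = Celery("sync_config", broker="amqp://localhost")  # placeholder broker
    app.conf.task_queues = (
        Queue("default", Exchange("default"), routing_key="default"),
        Queue("sync", Exchange("sync"), routing_key="sync.#"),
    )
    app.conf.task_default_queue = "default"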
I am using feedparser (feedparser==6.0.2) to parse some RSS resources in Python 3.10. When I use feedparser to get the response on CentOS 7.x, the feedparser…
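When behavior differs across platforms like this, feedparser's bozo flag is worth checking, as in this sketch (the feed URL is a placeholder):

    import feedparser

    d = feedparser.parse("https://example.com/rss.xml")  # placeholder URL
    # bozo is set when feedparser hit a problem (e.g. encoding or SSL),
    # which helps narrow down platform-specific differences.
    if d.bozo:
        print("parse problem:", d.bozo_exception)
    for entry in d.entries:
        print(entry.get("title"))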
Whenever I run celery -A reminders worker -l INFO --detach, I get the following error:

    zsh: command not found: celery

My assumption is that the bug lies in my PATH…
When running the following simple Celery task, I always get 'client unexpectedly closed TCP connection' warnings in the RabbitMQ log output.

    from celery import …
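The code is cut off at the import. A minimal app of the kind that still triggers those log lines (the broker URL and task are placeholders); the warning is usually traced to clients dropping the TCP socket without the full AMQP close handshake:

    from celery import Celery

    app = Celery("demo", broker="amqp://guest:guest@localhost:5672//")

    @app.task
    def add(x, y):
        return x + y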
I deployed the latest Airflow on a CentOS 7.5 VM, updated sql_alchemy_conn and result_backend to Postgres databases on a PostgreSQL instance, and designated m…
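For reference, the relevant airflow.cfg entries look roughly like this; the hosts, credentials, and broker are placeholders, and note that from Airflow 2.3 on sql_alchemy_conn lives under [database] rather than [core]:

    [core]
    executor = CeleryExecutor
    sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@dbhost:5432/airflow

    [celery]
    broker_url = redis://localhost:6379/0
    result_backend = db+postgresql://airflow:airflow@dbhost:5432/airflow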
I am using Celery, RabbitMQ, FastAPI, and Docker to develop an application. This is my docker-compose.yml:

    services:
      rabbitmq:
        container_name: rabbitmq
        …
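The compose file is cut off after the first service. A minimal docker-compose.yml sketch for the stack described; image tags, ports, commands, and module paths are assumptions:

    services:
      rabbitmq:
        container_name: rabbitmq
        image: rabbitmq:3-management
        ports:
          - "5672:5672"    # AMQP
          - "15672:15672"  # management UI
      api:
        build: .
        command: uvicorn main:app --host 0.0.0.0 --port 8000  # assumed entrypoint
        ports:
          - "8000:8000"
        depends_on:
          - rabbitmq
      worker:
        build: .
        command: celery -A main.celery_app worker -l info     # assumed app path
        depends_on:
          - rabbitmq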
I want to pull data from a crypto exchange API. For that I would run my code in GKE. The API is limited to 20 requests per second. But if I were to run my program…
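If the fetches are Celery tasks, the built-in rate_limit option is one way to stay under the cap, with the big caveat that it is enforced per worker, not across all GKE pods; a cluster-wide cap needs a shared limiter (e.g. a token bucket in Redis). The per-worker version:

    from celery import Celery

    app = Celery("crypto", broker="redis://localhost:6379/0")  # placeholder

    @app.task(rate_limit="20/s")  # enforced per worker, NOT per cluster
    def fetch_ticker(symbol):
        ...  # call the exchange API here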
With Django 1.8 I used django-celery to run asynchronous tasks, and I was able to debug them in my IDE (either PyCharm or Eclipse+PyDev) just by launching "python c…
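django-celery itself is long unmaintained, but the same in-IDE debugging works on current Celery by running tasks eagerly, so they execute inside the process the debugger launched. A sketch, assuming the app loads Django settings with the CELERY namespace:

    # settings.py -- for local debugging only
    CELERY_TASK_ALWAYS_EAGER = True        # run tasks inline and synchronously
    CELERY_TASK_EAGER_PROPAGATES = True    # re-raise task exceptions in the caller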
time="2017-10-27T07:39:20Z" level=error msg="Can't add file /var/app/current/app/content_classifier/forest.pickle to tar: io: read/write on closed pipe" time="
If I run Celery manually, from within my Django app's virtual environment, it works:

    (hackerspace) 90158@hackerspace:~/hackerspace/src$ celery -A hackerspace_o…
In a Django 2.0 app I have a model, called Document, that uploads and saves an image to the file system. That part works. I am performing some facial recognition…
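A common shape for this is a post_save signal that hands the recognition work to a Celery task by primary key, so the upload request returns quickly. A sketch; the app layout and the task body are assumptions:

    from celery import shared_task
    from django.db.models.signals import post_save
    from django.dispatch import receiver

    from .models import Document

    @shared_task
    def recognize_faces(document_pk):
        doc = Document.objects.get(pk=document_pk)
        ...  # run the facial-recognition step on doc's image file

    @receiver(post_save, sender=Document)
    def queue_recognition(sender, instance, created, **kwargs):
        if created:
            # Pass the pk, not the model instance, so the payload is serializable.
            recognize_faces.delay(instance.pk)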
I want to debug a Celery (celery==5.1.2) task in PyCharm (2021.3.1, Professional Edition). So I configured the PyCharm debugger like this…
I am trying to debug Celery in PyCharm, but when I start Celery there, the task does not trigger the breakpoint. This is what the PyCharm config looks like…
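One approach that reliably hits breakpoints with Celery 5 is to start the worker from a plain Python script under PyCharm's debugger, using the solo pool so tasks run in the debugged process rather than a forked child (the import path is an assumption):

    # debug_worker.py -- run this file with PyCharm's debugger
    from myproject.celery import app  # assumed location of the Celery app

    if __name__ == "__main__":
        # --pool=solo keeps task execution in this process,
        # which is what lets PyCharm breakpoints trigger.
        app.worker_main(argv=["worker", "--loglevel=INFO", "--pool=solo"])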