Celery task is pending in the browser but succeeds in the Django Python shell
I'm using Django, Celery, and RabbitMQ for simple tasks on Ubuntu, but Celery gives no response.
I can't figure out why the task stays pending when triggered from the browser, while it completes when I run it from the shell via python3 manage.py shell.
Here is my tasks.py file:
from celery import shared_task

@shared_task
def create_container(container_data):
    print(container_data, "create")
    return "created"

@shared_task
def destroy_container(container_data):
    print(container_data, "destroy")
    return "destroyed"
Here is my views.py file:
def post(self, request):
    if str(request.data["process"]) == "create":
        postdata = {
            "image_name": request.data["image_name"],
            "image_tag": request.data["image_tag"],
            "owner": request.user.id
        }
        # I tried to print the postdata variable before the task, and it works
        create_container.delay(postdata)

    elif str(request.data["process"]) == "destroy":
        postdata = {
            "cont_id": request.data["cont_id"]
        }
        # I tried to print the postdata variable before the task, and it works
        destroy_container.delay(postdata)
    # I tried to print anything here, but it was never reached or executed
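For context, a post method like this would normally sit inside a Django REST Framework class-based view and return a response; here is a minimal sketch of the surrounding class (the class name ContainerView, the imports, and the Response returns are my assumptions, not from the original post):

# views.py -- sketch only; class name and returns are assumed
from rest_framework.views import APIView
from rest_framework.response import Response

from dockerapp.tasks import create_container, destroy_container

class ContainerView(APIView):
    def post(self, request):
        process = str(request.data["process"])
        if process == "create":
            postdata = {
                "image_name": request.data["image_name"],
                "image_tag": request.data["image_tag"],
                "owner": request.user.id,
            }
            task = create_container.delay(postdata)
        elif process == "destroy":
            postdata = {"cont_id": request.data["cont_id"]}
            task = destroy_container.delay(postdata)
        else:
            return Response({"error": "unknown process"}, status=400)
        # Return the task id so the client can poll for the result later
        return Response({"task_id": task.id}, status=202)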
Here is the code I tried in the shell:
>>> from dockerapp.tasks import create_container
>>> create_container.delay("fake data")
<AsyncResult: c37c47f3-6965-4f2e-afcd-01de60f82565>
Also, I can watch the Celery logs in another terminal by running celery -A dockerproj worker -l info.
When I used the shell, the worker printed these lines:
Received task: dockerapp.tasks.create_container[c37c47f3-6965-4f2e-afcd-01de60f82565]
fake data #print
create #print
Task dockerapp.tasks.create_container[c37c47f3-6965-4f2e-afcd-01de60f82565] succeeded in 0.003456833990640007s
but nothing shows up when I trigger the task from the browser with a POST request.
I saw many suggested solutions that add Celery configuration lines to settings.py, and I tried all of these:
CELERY_BROKER_URL = 'amqp://127.0.0.1'
CELERY_TIMEZONE = 'UTC'
CELERY_TRACK_STARTED = True
CELERY_TASK_TRACK_STARTED = True
CELERY_CACHE_BACKEND = 'amqp'
CELERY_TASK_TIME_LIMIT = 30 * 60
CELERY_IGNORE_RESULT = False
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TASK_SERIALIZER = 'json'
CELERY_ACCEPT_CONTENT = ['json']
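For reference, these CELERY_-prefixed settings only take effect if the Celery app is wired to read Django settings under the CELERY namespace; this is the standard dockerproj/celery.py setup from the Celery–Django integration docs (the exact module layout is my assumption):

# dockerproj/celery.py -- standard Celery/Django wiring (layout assumed)
import os
from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "dockerproj.settings")

app = Celery("dockerproj")
# Read every setting prefixed with CELERY_ from Django's settings.py
app.config_from_object("django.conf:settings", namespace="CELERY")
# Discover tasks.py modules in installed apps (e.g. dockerapp.tasks)
app.autodiscover_tasks()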
I even tried different worker pools:
celery -A dockerproj worker -l info -P threads
celery -A dockerproj worker -l info --pool=solo (reportedly this fixes the issue on Windows, but I tried it anyway)
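One detail worth noting while debugging: with RabbitMQ as the broker and no result backend configured, Celery has nowhere to store task outcomes, so AsyncResult reports PENDING even for tasks that actually ran. A minimal shell check, using the same names as above:

>>> from dockerapp.tasks import create_container
>>> res = create_container.delay("fake data")
>>> res.state  # stays 'PENDING' if no result backend is configured, even after the task runs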
Source: Stack Overflow, licensed under CC BY-SA 3.0.