Celery raises ValueError: not enough values to unpack
I am trying to run a simple example with Celery and I get the exception below. RabbitMQ is started in Docker (I also tried starting it locally). Celery runs on a local Windows host.
from celery import Celery

# broker is the RabbitMQ instance running in Docker
app = Celery('tasks', broker='amqp://192.168.99.100:32774')

@app.task()
def hello():
    print('hello')

if __name__ == '__main__':
    hello.delay()  # send the task to the worker
Excerpt of my error text:
[2017-08-18 00:01:08,632: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)',)
Traceback (most recent call last):
File "c:\users\user\celenv\lib\site-packages\billiard\pool.py", line 358, in workloop
result = (True, prepare_result(fun(*args, **kwargs)))
File "c:\users\user\celenv\lib\site-packages\celery\app\trace.py", line 525, in _fast_trace_task
tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
Solution 1:[1]
Celery 4.0+ does not officially support Windows yet, but it still works on Windows for development/test purposes.
Use the eventlet pool instead, as below:
pip install eventlet
celery -A <module> worker -l info -P eventlet
It works for me on Windows 10 + celery 4.1 + python 3.
===== update 2018-11 =====
Eventlet has an issue with subprocess.CalledProcessError:
https://github.com/celery/celery/issues/4063
https://github.com/eventlet/eventlet/issues/357
https://github.com/eventlet/eventlet/issues/413
So try gevent instead.
pip install gevent
celery -A <module> worker -l info -P gevent
This works for me on Windows 10 + celery 4.2 + python 3.6
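As a concrete illustration, assuming the question's code is saved as tasks.py (the module name comes from the question, not from this answer), starting the worker and sending a test call could look like this:
# in one shell, start the worker with the gevent (or eventlet) pool:
#   celery -A tasks worker -l info -P gevent
# in another shell / Python session, send the task:
from tasks import hello

result = hello.delay()   # returns an AsyncResult immediately
print(result.id)         # the worker process prints 'hello'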
Solution 2:[2]
I got this error on a Windows 7 32-bit system, and this is what I did to make it work.
Add this
`os.environ.setdefault('FORKED_BY_MULTIPROCESSING', '1')`
before defining the Celery instance in the myproj/settings.py file of your Django project.
It should look like:
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproj.settings')
os.environ.setdefault('FORKED_BY_MULTIPROCESSING', '1')
app = Celery('tasks', broker='redis://127.0.0.1:6379/0')
I am using Redis as the message broker, hence broker='redis://127.0.0.1:6379/0'.
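The same workaround also applies to a plain (non-Django) script like the one in the question; a minimal sketch, assuming RabbitMQ is still reachable at the broker address from the question:
import os
from celery import Celery

# must be set before the Celery app/worker machinery starts,
# same idea as the Django settings example above
os.environ.setdefault('FORKED_BY_MULTIPROCESSING', '1')

app = Celery('tasks', broker='amqp://192.168.99.100:32774')

@app.task()
def hello():
    print('hello')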
Solution 3:[3]
For Celery 4.1 on Windows.
Set an environment variable FORKED_BY_MULTIPROCESSING=1. Then you can simply run celery -A <celery module> worker.
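For example, in a Windows cmd shell the variable can be set for the current session before starting the worker (assuming the question's tasks module):
set FORKED_BY_MULTIPROCESSING=1
celery -A tasks worker -l info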
Solution 4:[4]
It worked for me:
celery -A my_project_name worker --pool=solo -l info
basically the worker becomes single-threaded, which is supported on Windows
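If you prefer not to pass the flag on every invocation, the pool can also be selected in the app configuration; a minimal sketch, assuming Celery 4's lowercase worker_pool setting and the broker URL from the question:
from celery import Celery

app = Celery('tasks', broker='amqp://192.168.99.100:32774')
# equivalent to passing --pool=solo on the command line:
# the worker executes tasks in its main thread, one at a time
app.conf.worker_pool = 'solo'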
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | pushrbx |
| Solution 2 | Lawrence Gandhar |
| Solution 3 | Bob Jordan |
| Solution 4 | Sourabh SInha |
