Celery: ValueError: not enough values to unpack (expected 3, got 0)

Created on 2 Aug 2017 · 11 Comments · Source: celery/celery

Checklist

  • [x] I have included the output of celery -A proj report in the issue.
    (if you are not able to do this, then at least specify the Celery
    version affected).
  • [x] I have verified that the issue exists against the master branch of Celery.

System info

software -> celery:4.1.0 (latentcall) kombu:4.1.0 py:3.6.2
            billiard:3.5.0.3 py-amqp:2.2.1
platform -> system:Windows arch:32bit, WindowsPE imp:CPython
loader -> celery.loaders.app.AppLoader
settings -> transport:pyamqp results:redis

broker_url: 'amqp://guest:*@...:5672//'
result_backend: 'redis://:*@...*/0'

I also tried SQLite as the backend, and removing the backend entirely, but neither helped.
Steps to reproduce

I followed the tutorial on the official website to run tasks.py on my Windows 10
machine, but I get an error.
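
For reference, the tutorial module boils down to something like the sketch below. This is a minimal sketch, not the exact file from the report: the module is named mytask to match the commands further down, and the localhost URLs are placeholders for the masked broker/backend above.

from celery import Celery

# mytask.py -- minimal task module in the spirit of the First Steps tutorial.
# Broker and backend URLs are placeholders for the masked ones in the report.
app = Celery('mytask',
             broker='amqp://guest@localhost//',
             backend='redis://localhost:6379/0')

@app.task
def add(x, y):
    return x + y

The test call is then made from a Python shell: from mytask import add, followed by add.delay(2, 2).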

Expected behavior

The worker executes the task and reports success.

Actual behavior

I renamed the file from tasks.py to mytask.py, so I run:

celery -A mytask worker -l info

It connects OK.

But when I run the test example add.delay(2, 2), I get the error below:

[2017-08-02 19:59:04,777: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)',)
Traceback (most recent call last):
  File "d:\python\python36-32\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "d:\python\python36-32\lib\site-packages\celery\app\trace.py", line 525, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)
[2017-08-02 20:04:30,870: INFO/MainProcess] Received task: mytask.hello[ec84d3ba-98ac-44bc-be5e-09190c2712e0]
[2017-08-02 20:04:30,873: ERROR/MainProcess] Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)',)
Traceback (most recent call last):
  File "d:\python\python36-32\lib\site-packages\billiard\pool.py", line 358, in workloop
    result = (True, prepare_result(fun(*args, **kwargs)))
  File "d:\python\python36-32\lib\site-packages\celery\app\trace.py", line 525, in _fast_trace_task
    tasks, accept, hostname = _loc
ValueError: not enough values to unpack (expected 3, got 0)

My Solution

Uninstall Celery 4.1.0 and install 3.1.24 instead:

pip uninstall celery
pip install celery==3.1.24

Then it works fine for me! Everything is OK. I think this information is useful to you.

The report for this version:

software -> celery:3.1.24 (Cipater) kombu:3.0.37 py:3.6.2
billiard:3.3.0.23 py-amqp:1.4.9
platform -> system:Windows (Win10) arch:32bit, WindowsPE imp:CPython
loader -> celery.loaders.app.AppLoader
settings -> transport:pyamqp results:redis://:@**/0

Why 3.1.24?

It was just a guess; I simply looked for a version lower than 4.

Labels: Prefork Workers Pool · Bug Report · Windows · Duplicate

All 11 comments

Not sure if related, but Windows is no longer supported in Celery 4.

Also seeing this on Windows in the following environment (it works fine on my Mac):

  • Celery 4.1.0
  • Azure App Service 64bit
  • Python 3.6.1 (via Python extension)
  • Azure Redis for broker & backend

Windows support is provided on a best-effort basis. We do not have an active maintainer who is interested in providing support for Windows.
The unit tests pass, so there must be a more complex issue here.
I don't have access to a Windows machine. If any of you can debug this and issue a PR, that would be lovely.

Actually this is a duplicate of #4081. There's a fix for it in #4078 that is pending test coverage.
If any of you wants to help resolve this issue, we need an integration test that proves the fix works as expected.

I found a workaround:

celery -A your_app_name worker --pool=solo -l info

While the solo pool works, it is a single-threaded execution pool, which means there is no concurrency at all.

Another working solution is to use eventlet (pip install eventlet, then celery -A your_app_name worker --pool=eventlet). This way it is possible to have parallel-running tasks on Windows.
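
As a side note, the same pool choice can also be made in the app's configuration instead of on the command line. A minimal sketch, assuming Celery 4's worker_pool setting (the app name and broker URL are placeholders):

from celery import Celery

app = Celery('mytask', broker='amqp://guest@localhost//')

# Equivalent to passing --pool=solo on the command line;
# 'eventlet' also works here once eventlet is installed.
app.conf.worker_pool = 'solo'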

Confirming the solution by @np-8 on Windows Server 2012 R2 with Python 3.5. We also had to bump billiard to the latest patch version to fix a pickle error, though.
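
For reference, that bump is a one-liner (unpinned here, since "latest patch" depends on when you run it):

pip install --upgrade billiard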

@np-8's solution worked for me too.
Specifications:

  • Windows 10
  • Celery 4.1
  • RabbitMQ 3.7.4
  • Python 3.1
  • Django 2.0

It doesn't have parallelism, but it at least works for testing, as I'm developing on Windows (the app will run on a Linux platform in production).

Eventually we ditched this solution and are not running Celery on Windows anymore. Sometimes, when saving files in Django (models.FileField), something in eventlet would error (sorry, I don't have the exact error at hand), but it seemed to be in the os.py monkeypatching.

Another thing is that starting up Celery takes ages (10-15 minutes), and during this startup time no tasks seem to be picked up.

pip install eventlet
celery -A your_app_name worker -l info -P eventlet

This works on Windows 10 + Celery 4.1 + Python 3.

#4078 may help

On the worker side:

set FORKED_BY_MULTIPROCESSING=1

(note: no spaces around the =, or cmd.exe treats them as part of the variable name), then

celery -A myworker worker --loglevel=info

done!
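
If you would rather not run set in every new shell, a minimal sketch (my own suggestion, not from this thread) is to set the variable at the top of the task module, before Celery creates the worker pool; child processes inherit the environment, so the flag applies to them too:

import os

# Set before Celery/billiard start the prefork pool. The module name
# 'mytask' and the broker URL are placeholders matching the examples above.
os.environ.setdefault('FORKED_BY_MULTIPROCESSING', '1')

from celery import Celery

app = Celery('mytask', broker='amqp://guest@localhost//')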
