Celery workers randomly hang on IPC (all jobs get queued but are no longer executed)

Created on 7 Aug 2017  ·  109 comments  ·  Source: celery/celery

Fails the same way on multiple celery versions:
celery version 3.1.7, billiard version 3.3.0.23
celery version 3.1.25, billiard version 3.3.0.23

celery worker -l debug --autoscale=50,4 -Ofair -Q tower_scheduler,tower_broadcast_all,tower,localhost -n celery@localhost

Steps to reproduce

Occurs sporadically when we run our 8-hour nightly integration tests on Ubuntu 14.04 and 16.04. Does not occur on RHEL 7 or CentOS 7. We are working on a smaller set of reproduction steps.

Expected behavior

  • Workers keep processing work.

Actual behavior

The celery master process and one particular worker block on the same file descriptor, performing a read(). The worker has finished a job and is ready for more work. The parent is "stuck", blocking on the read() syscall it shares with the worker.

Restarting celery "fixes" the problem. More surgically, sending a SIGUSR1 to the master process "fixes" it by breaking it out of the read() syscall. The child then returns from its read and appears to process a/the pending message. The master process does NOT try to read() from the PIPE again.
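The SIGUSR1 trick works because a signal delivered while a process is blocked in read() interrupts the syscall (EINTR). A minimal sketch of that mechanism, using SIGALRM in place of SIGUSR1 (in Python 3, PEP 475 transparently retries the read after EINTR unless the handler raises):

```python
import os
import signal

def interrupt(signum, frame):
    # Raising here stops Python from transparently retrying the read (PEP 475)
    raise TimeoutError("read() interrupted by signal")

signal.signal(signal.SIGALRM, interrupt)

r, w = os.pipe()          # nothing is ever written to w
signal.alarm(1)           # deliver SIGALRM in 1 second
try:
    os.read(r, 4)         # blocks: the pipe is empty
    interrupted = False
except TimeoutError:
    interrupted = True
finally:
    signal.alarm(0)

print(interrupted)
```

This only shows the syscall-interruption behavior, not celery's handler wiring.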

https://github.com/celery/celery/blob/3.1/celery/concurrency/asynpool.py#L220 is where we hang. This is a call to __read__ obtained from https://github.com/celery/billiard/blob/3.3/Modules/_billiard/multiprocessing.c#L202
This is blocking and not asynchronous. I really don't understand how this code is not susceptible to a deadlock / infinite-blocking scenario. The read is blocking and is called in such a way that it can block forever if the child dies.
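For contrast, a descriptor opened non-blocking fails fast instead of hanging: read() on an empty pipe returns EAGAIN immediately. A minimal sketch of the syscall behavior (not celery's code):

```python
import errno
import os

# Create a pipe with both ends non-blocking (Linux: pipe2 with O_NONBLOCK)
r, w = os.pipe2(os.O_NONBLOCK)

try:
    os.read(r, 4)              # empty pipe: would block, so it raises instead
    raised = False
except BlockingIOError as e:
    raised = (e.errno == errno.EAGAIN)

print(raised)
```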

We are now trying to reproduce the problem with CELERY_RDBSIG="1" set so that we can jump into a remote debugging session when the deadlock occurs.

Any advice would be helpful.

From the OS's perspective, I can't reason about how this could happen.

  • No write() is observed after the SIGUSR1.
  • 2 processes blocked on the same PIPE in a read(). This means there is no data, or not enough data, in the pipe.
  • One process "gives up" (because we sent the SIGUSR1).
  • Then the other process's read() proceeds with data ... without any write() being observed.
Labels: Billiard, Bug Report, Critical, Needs Test Coverage ✘, Worker Hangs


All 109 comments

I have the same problem here; the workers appear to be running but are not processing messages.

[screenshot: screen shot 2017-08-08 at 6 55 12 pm]

I'm running Python 3.6 and using Celery 4.1.0.

@codeadict, can you strace the two processes (worker and parent) to see if both are stuck on the same file descriptor?

You can verify it is the same fd with sudo lsof | grep 53r where 53r is the number of the file descriptor the processes are blocked on.

@codeadict as @chrismeyersfsu mentioned, you can strace the celery master process and see whether it is stuck reading on a pipe:

$ sudo strace -p <pid-of-celery-master-process> -s 100
Process X attached
read(53

If you have python-dbg installed with Python symbols, you can also use gdb to see where the master is stuck:

$ gdb python -p <pid-of-master-process>
...
(gdb) py-list
 216            # header
 217
 218            while Hr < 4:
 219                try:
 220                    n = __read__(
>221                        fd, bufv[Hr:] if readcanbuf else bufv, 4 - Hr,

https://github.com/celery/celery/blob/v3.1.17/celery/concurrency/asynpool.py#L219

Same problem here.

celery == 3.1.23, billiard == 3.3.0.23
Workers started with options: --time-limit=7200 --autoscale=64,2 --maxtasksperchild=1
python 2.7.3

After a few hours, celery "freezes".
I see the master and a child waiting in an infinite read() on pipes (different pipes, by the way).

So, this is the master process:

root@vm1:~# strace -p 41840
Process 41840 attached - interrupt to quit
read(50, ^C <unfinished ...>
Process 41840 detached

root@vm1:~# sudo lsof | grep 50r
python     41840                 root   50r     FIFO                0,8        0t0     548796 pipe
python     46237                 root   50r      CHR                1,3        0t0       7604 /dev/null
python     46304                 root   50r     FIFO                0,8        0t0     548796 pipe


root@vm1:~# ls -lh /proc/41840/fd/50
lr-x------ 1 root root 64 Oct 20 10:02 /proc/41840/fd/50 -> pipe:[548796]

root@vm1:~# (find /proc -type l | xargs ls -l | fgrep 'pipe:[548796]') 2>/dev/null
lr-x------ 1 root    root    64 Oct 20 10:02 /proc/41840/fd/50 -> pipe:[548796]
lr-x------ 1 root    root    64 Oct 20 10:02 /proc/41840/fd/51 -> pipe:[548796]
lr-x------ 1 root    root    64 Oct 20 10:07 /proc/41840/task/41840/fd/50 -> pipe:[548796]
lr-x------ 1 root    root    64 Oct 20 10:07 /proc/41840/task/41840/fd/51 -> pipe:[548796]
l-wx------ 1 root    root    64 Oct 20 10:02 /proc/46237/fd/51 -> pipe:[548796]
l-wx------ 1 root    root    64 Oct 20 10:07 /proc/46237/task/46237/fd/51 -> pipe:[548796]
lr-x------ 1 root    root    64 Oct 20 10:02 /proc/46304/fd/50 -> pipe:[548796]
lr-x------ 1 root    root    64 Oct 20 10:02 /proc/46304/fd/51 -> pipe:[548796]
lr-x------ 1 root    root    64 Oct 20 10:07 /proc/46304/task/46304/fd/50 -> pipe:[548796]
lr-x------ 1 root    root    64 Oct 20 10:07 /proc/46304/task/46304/fd/51 -> pipe:[548796]

And one of the children:

root@vm1:~# strace -p 46304
Process 46304 attached - interrupt to quit
read(16, ^C <unfinished ...>
Process 46304 detached

root@vm1:~# ls -lh /proc/46304/fd/16
lr-x------ 1 root root 64 Oct 20 10:02 /proc/46304/fd/16 -> pipe:[482281]


root@vm1:~# (find /proc -type l | xargs ls -l | fgrep 'pipe:[482281]') 2>/dev/null
lr-x------ 1 root    root    64 Oct 20 10:02 /proc/41840/fd/16 -> pipe:[482281]
l-wx------ 1 root    root    64 Oct 20 10:02 /proc/41840/fd/17 -> pipe:[482281]
lr-x------ 1 root    root    64 Oct 20 10:07 /proc/41840/task/41840/fd/16 -> pipe:[482281]
l-wx------ 1 root    root    64 Oct 20 10:07 /proc/41840/task/41840/fd/17 -> pipe:[482281]
lr-x------ 1 root    root    64 Oct 20 10:02 /proc/46237/fd/16 -> pipe:[482281]
l-wx------ 1 root    root    64 Oct 20 10:02 /proc/46237/fd/17 -> pipe:[482281]
lr-x------ 1 root    root    64 Oct 20 10:07 /proc/46237/task/46237/fd/16 -> pipe:[482281]
l-wx------ 1 root    root    64 Oct 20 10:07 /proc/46237/task/46237/fd/17 -> pipe:[482281]
lr-x------ 1 root    root    64 Oct 20 10:02 /proc/46304/fd/16 -> pipe:[482281]
lr-x------ 1 root    root    64 Oct 20 10:07 /proc/46304/task/46304/fd/16 -> pipe:[482281]
root@vm1:~# 

And this frame is currently "frozen" in the master process:

Thread 0x7efc1c101700
('Frame: ', 63347776)
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/opt/waf/python/lib/python2.7/site-packages/celery/__main__.py", line 54, in <module>
    main()
  File "/opt/waf/python/lib/python2.7/site-packages/celery/__main__.py", line 30, in main
    main()
  File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/celery.py", line 81, in main
    cmd.execute_from_commandline(argv)
  File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/celery.py", line 793, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/base.py", line 311, in execute_from_commandline
    return self.handle_argv(self.prog_name, argv[1:])
  File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/celery.py", line 785, in handle_argv
    return self.execute(command, argv)
  File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/celery.py", line 717, in execute
    ).run_from_argv(self.prog_name, argv[1:], command=argv[0])
  File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/worker.py", line 179, in run_from_argv
    return self(*args, **options)
  File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/base.py", line 274, in __call__
    ret = self.run(*args, **kwargs)
  File "/opt/waf/python/lib/python2.7/site-packages/celery/bin/worker.py", line 212, in run
    state_db=self.node_format(state_db, hostname), **kwargs
  File "/opt/waf/python/lib/python2.7/site-packages/celery/worker/__init__.py", line 206, in start
    self.blueprint.start(self)
  File "/opt/waf/python/lib/python2.7/site-packages/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/opt/waf/python/lib/python2.7/site-packages/celery/bootsteps.py", line 374, in start
    return self.obj.start()
  File "/opt/waf/python/lib/python2.7/site-packages/celery/worker/consumer.py", line 279, in start
    blueprint.start(self)
  File "/opt/waf/python/lib/python2.7/site-packages/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/opt/waf/python/lib/python2.7/site-packages/celery/worker/consumer.py", line 838, in start
    c.loop(*c.loop_args())
  File "/opt/waf/python/lib/python2.7/site-packages/celery/worker/loops.py", line 76, in asynloop
    next(loop)
  File "/opt/waf/python/lib/python2.7/site-packages/kombu/async/hub.py", line 340, in create_loop
    cb(*cbargs)
  File "/opt/waf/python/lib/python2.7/site-packages/celery/concurrency/asynpool.py", line 279, in on_result_readable
    next(it)
  File "/opt/waf/python/lib/python2.7/site-packages/celery/concurrency/asynpool.py", line 221, in _recv_message
    fd, bufv[Hr:] if readcanbuf else bufv, 4 - Hr,
()

@harlov @codeadict @chrismeyersfsu Did you find any solution?

No. We are upgrading to celery 4. We will run the same test suite that triggers this bug and see whether we can reproduce it on celery 4.

@korycins we removed --autoscale and --maxtasksperchild — all good so far.

My guess is that workers being respawned in flight with autoscale & maxtasksperchild can cause this situation.

@chrismeyersfsu I have the same problem with celery 4.1, so I guess switching to v4.1 won't help you.
Previously I used only the --autoscale option. Workers hung from time to time (around once a week), but when I added maxtasksperchild=20, workers started hanging once an hour. I thought the problem might be in maxtasksperchild, so I removed it.
I created a custom autoscaler to scale up/down dynamically based on free resources, and the same problem occurs, so I suspect the issue is related to respawning workers (as @harlov said).

Can someone verify the issue against the latest master? I'm not sure whether it is already fixed by the latest commits.

@auvipy, which commits do you suspect might resolve this? I can currently reproduce this issue on celery 4.1.0, billiard 3.5.0.3 from PyPI.

Not sure; could you try the master branch? If it still exists after the 4.2 release, it will be worked on and fixed.

@auvipy, we could try, but it may be hard to say for sure: _personally_, it seems we only hit this hang once every few _weeks_, so it is quite hard to narrow down.

@ryanpetrello Try adding max_tasks_per_child and setting it to 1. http://docs.celeryproject.org/en/latest/userguide/workers.html#max-tasks-per-child-setting
This should increase the chance of the problem occurring.

Same problem; I can't reproduce it at the moment. I thought it was a configuration issue, since restarting simply unblocks the worker, but it seems to happen from time to time.

Just chiming in to say we also ran into this problem. celery 4.1.0, kombu 4.1.0, billiard 3.5.0.3.
Command line: /usr/bin/python /usr/local/bin/celery worker --app myapp.celery -B --loglevel INFO --autoscale=5,1 --max-tasks-per-child=50 --max-memory-per-child=1000000 -O fair

Regarding the file descriptor hang, we see the same thing as @harlov above: the master is deadlocked with one of its workers.

Sending SIGUSR1 to the celery master "unsticks" the process and dumps the stack trace below. The suspicious part is that in a function called "on_result_readable" it gets stuck in a read. This feels like a classic race condition: poll() returns readable, but read() blocks. This can happen, which is why it is always recommended to set FDs to non-blocking mode to make sure nothing goes wrong.

Relevant post on LKML: https://lkml.org/lkml/2011/6/18/105

I'm not sure how easy it is to make the descriptors non-blocking, but we have an environment with many celery masters and there are always a few stuck, so if it is a simple change we might be able to test it quickly.
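Making an already-open descriptor non-blocking is a one-line fcntl call; whether it is safe to do this for every fd celery's hub watches is a separate question. A minimal sketch of the fcntl dance, not celery's code:

```python
import fcntl
import os

r, w = os.pipe()                       # pipe fds start out blocking

# Read the current status flags and OR in O_NONBLOCK
flags = fcntl.fcntl(r, fcntl.F_GETFL)
fcntl.fcntl(r, fcntl.F_SETFL, flags | os.O_NONBLOCK)

# The flag is now set: a read on the empty pipe raises instead of hanging
nonblocking = bool(fcntl.fcntl(r, fcntl.F_GETFL) & os.O_NONBLOCK)
print(nonblocking)
```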

MainThread                                                                                                                                                                                                                            [636/660]
=================================================                                                                                                                                                    
  File "/usr/local/bin/celery", line 11, in <module>                                                                                                                                                 
    sys.exit(main())                                                                                                                                                                                 
  File "/usr/local/lib/python2.7/dist-packages/celery/__main__.py", line 14, in main                                                                                                             
    _main()                                                                                                                                                                                          
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 326, in main                                                                                                              
    cmd.execute_from_commandline(argv)                                                                                                                                                               
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 488, in execute_from_commandline                                                                                                            
    super(CeleryCommand, self).execute_from_commandline(argv)))                                                                                                                                      
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 281, in execute_from_commandline                                                                                        
    return self.handle_argv(self.prog_name, argv[1:])                                                                                                              
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 480, in handle_argv                                                                                                       
    return self.execute(command, argv)                                                                                                                                                           
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/celery.py", line 412, in execute                                                                                                           
    ).run_from_argv(self.prog_name, argv[1:], command=argv[0])                                                                                                                                   
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/worker.py", line 221, in run_from_argv                                                                                                                       
    return self(*args, **options)                                                                                                                                                                    
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/base.py", line 244, in __call__                                                                                                            
    ret = self.run(*args, **kwargs)                                                                                                                                                                  
  File "/usr/local/lib/python2.7/dist-packages/celery/bin/worker.py", line 256, in run                                                                                                          
    worker.start()                                                                                                                                                                                   
  File "/usr/local/lib/python2.7/dist-packages/celery/worker/worker.py", line 203, in start                                                                                                     
    self.blueprint.start(self)                                                                                                                                      
  File "/usr/local/lib/python2.7/dist-packages/celery/bootsteps.py", line 119, in start                                                                                                                                
    step.start(parent)                                                                                                                                                                               
  File "/usr/local/lib/python2.7/dist-packages/celery/bootsteps.py", line 370, in start                                                                                                              
    return self.obj.start()                                                                                                                                                                      
  File "/usr/local/lib/python2.7/dist-packages/celery/worker/consumer/consumer.py", line 320, in start                                                                                               
    blueprint.start(self)                                                                                                                                                                            
  File "/usr/local/lib/python2.7/dist-packages/celery/bootsteps.py", line 119, in start                                                                                                             
    step.start(parent)                                                                                                                                                                               
  File "/usr/local/lib/python2.7/dist-packages/celery/worker/consumer/consumer.py", line 596, in start                                                                                               
    c.loop(*c.loop_args())                                                                                                                                                                           
  File "/usr/local/lib/python2.7/dist-packages/celery/worker/loops.py", line 88, in asynloop                                                                                                         
    next(loop)                                                                                                                                                                                       
  File "/usr/local/lib/python2.7/dist-packages/kombu/async/hub.py", line 354, in create_loop                                                                        
    cb(*cbargs)                                                                                                                                                                                      
  File "/usr/local/lib/python2.7/dist-packages/celery/concurrency/asynpool.py", line 292, in on_result_readable                                                                                      
    next(it)
  File "/usr/local/lib/python2.7/dist-packages/celery/concurrency/asynpool.py", line 235, in _recv_message
    fd, bufv[Hr:] if readcanbuf else bufv, 4 - Hr,
  File "/usr/local/lib/python2.7/dist-packages/celery/apps/worker.py", line 349, in cry_handler
    safe_say(cry())
  File "/usr/local/lib/python2.7/dist-packages/celery/utils/debug.py", line 193, in cry
    traceback.print_stack(frame, file=out)
=================================================
LOCAL VARIABLES
=================================================
{'P': <functools.partial object at 0x7f29da0f1c58>,
 'frame': <frame object at 0x37137b0>,
 'out': <vine.five.WhateverIO object at 0x7f29da14e150>,
 'sep': u'=================================================',
 'sepchr': u'=',
 'seplen': 49,
 'thread': <_MainThread(MainThread, started 139818014390016)>,
 'threading': <module 'threading' from '/usr/lib/python2.7/threading.pyc'>,
 'tid': 139818014390016,
 'tmap': {139818014390016: <_MainThread(MainThread, started 139818014390016)>}}

Just an update. We spent some time testing things and trying to reproduce it.

  • Setting max_tasks_per_child to 1 makes it worse
  • Removing -O fair makes no difference
  • Removing autoscale does not help either
  • So far we have only reproduced it on machines with kernel 4.9 (Stretch), not on machines with 3.16 (Jessie)

I verified that the file descriptors celery normally uses are all correctly marked non-blocking, and they are. So the question is: how does the code block if all the descriptors are non-blocking?

The lead I'm following now is that the epoll() object gets confused about which descriptors it is supposed to be watching. If you strace celery, you get lots of results like:

14659 14:32:56.665937 epoll_ctl(4, EPOLL_CTL_ADD, 43, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=43, u64=14879199376994992171}}) = -1 EEXIST (File exists)
14659 14:32:56.667084 epoll_ctl(4, EPOLL_CTL_ADD, 16, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=16, u64=14879199376994992144}}) = -1 EEXIST (File exists)
14659 14:32:56.667278 epoll_ctl(4, EPOLL_CTL_ADD, 26, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=26, u64=14879199376994992154}}) = -1 EEXIST (File exists)
14659 14:32:56.667440 epoll_ctl(4, EPOLL_CTL_ADD, 23, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=23, u64=14879199376994992151}}) = -1 EEXIST (File exists)
14659 14:32:56.667607 epoll_ctl(4, EPOLL_CTL_DEL, 37, 0x7ffd31c17f60) = -1 ENOENT (No such file or directory)
14659 14:32:56.667804 epoll_ctl(4, EPOLL_CTL_DEL, 38, 0x7ffd31c17f60) = -1 ENOENT (No such file or directory)

It is entirely plausible that at some point a file descriptor once used for one thing gets reused for something else, the hub gets confused, calls the wrong method, and boom!

OK, I found the problem, and I have strace output that captures it. It is very subtle and has to do with the following completely non-obvious "feature" of the epoll() interface. From the man page:

Q6
Will closing a file descriptor cause it to be removed from all epoll sets automatically?
A6
Yes, but be aware of the following point. A file descriptor is a reference to an open file description (see open(2)). Whenever a file descriptor is duplicated via dup(2), dup2(2), fcntl(2) F_DUPFD, or fork(2), a new file descriptor referring to the same open file description is created. An open file description continues to exist until all file descriptors referring to it have been closed. A file descriptor is removed from an epoll set only after all the file descriptors referring to the underlying open file description have been closed (or before, if the file descriptor is explicitly removed using epoll_ctl(2) EPOLL_CTL_DEL). This means that even after a file descriptor that is part of an epoll set has been closed, events may be reported for that file descriptor if other file descriptors referring to the same underlying file description remain open.

What is happening is that celery is using a file descriptor for communication, the file descriptor leaks into a worker, the master closes the file descriptor and then opens another one for the sentinel _which is not marked non-blocking_. The child process dies, which causes the file descriptor to be flagged readable; the master tries to read it and blocks.

Yes, epoll() returns readability for a file descriptor that is no longer open in this process.
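The man-page behavior is easy to demonstrate: register a pipe fd with epoll, dup() it (standing in for the fd leaking into a child), close the registered fd, and epoll still reports events under the closed fd's number. A minimal sketch (Linux only):

```python
import os
import select

r, w = os.pipe()
ep = select.epoll()
ep.register(r, select.EPOLLIN)

r_dup = os.dup(r)      # same open file description, new fd (the "leaked" copy)
os.close(r)            # close the registered fd; the description stays open via r_dup

os.write(w, b"ping")
events = ep.poll(timeout=1)

# epoll reports readability under the old, already-closed fd number r
print(events)
```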

Here I'm pasting the strace output captured for just fd 49 as evidence (apologies for the multi-line output; strace was monitoring multiple processes):

7417  15:04:00.001511 pipe( <unfinished ...>
7417  15:04:00.001646 <... pipe resumed> [49, 50]) = 0
7417  15:04:00.001727 fcntl(49, F_GETFL <unfinished ...>
7417  15:04:00.002091 <... fcntl resumed> ) = 0 (flags O_RDONLY)
7417  15:04:00.002165 fcntl(49, F_SETFL, O_RDONLY|O_NONBLOCK <unfinished ...>
7417  15:04:00.002222 <... fcntl resumed> ) = 0
7417  15:04:00.003052 fcntl(49, F_GETFL <unfinished ...>
7417  15:04:00.003195 <... fcntl resumed> ) = 0x800 (flags O_RDONLY|O_NONBLOCK)
...
7417  15:04:00.237131 fcntl(49, F_GETFL) = 0x800 (flags O_RDONLY|O_NONBLOCK)
7417  15:04:00.237363 epoll_ctl(4, EPOLL_CTL_ADD, 49, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=49, u64=12923050543437316145}} <unfinished ...>
7417  15:04:00.237478 <... epoll_ctl resumed> ) = 0
...
7417  15:04:00.274482 read(49,  <unfinished ...>
7417  15:04:00.274558 <... read resumed> "\0\0\0\25", 4) = 4
7417  15:04:00.274670 read(49,  <unfinished ...>
7417  15:04:00.274734 <... read resumed> "(I15\n(I9395\ntp0\ntp1\n.", 21) = 21
7417  15:04:00.274851 epoll_ctl(4, EPOLL_CTL_ADD, 49, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=49, u64=12923050543437316145}} <unfinished ...>
7417  15:04:00.274970 <... epoll_ctl resumed> ) = -1 EEXIST (File exists)
...
... from here fd 49 is used normally for a while,  have not included all the output ...
...
7417  15:04:52.871646 epoll_wait(4, [{EPOLLIN, {u32=49, u64=12923050543437316145}}, {EPOLLIN, {u32=33, u64=12923050543437316129}}, {EPOLLOUT, {u32=48, u64=12923050543437316144}}, {EPOLLOUT, {u32=8, u64=12923050543437316104}}], 1023, 155) = 4
7417  15:04:52.871888 read(49, "\0\0\0\31", 4) = 4
7417  15:04:52.872079 read(49, "(I4\n(I9400\nI155\ntp0\ntp1\n.", 25) = 25
... here it uses a normal poll() for once?
7417  15:04:52.891818 poll([{fd=49, events=POLLERR}], 1, 10 <unfinished ...>
7417  15:04:52.902239 <... poll resumed> ) = 0 (Timeout)
7417  15:04:52.913780 close(49 <unfinished ...>
7417  15:04:52.914020 <... close resumed> ) = 0
... here we create a new pipe for a child process
7417  15:04:53.148195 pipe([49, 50])    = 0
7417  15:04:53.148996 clone(child_stack=NULL, flags=CLONE_CHILD_CLEARTID|CLONE_CHILD_SETTID|SIGCHLD, child_tidptr=0x7f2dee7f79d0) = 9482
... the write end got closed, now we dup it. This is the sentinel fd so it is not marked non-blocking
7417  15:04:53.155409 dup(49)           = 50
7417  15:04:53.156030 epoll_ctl(4, EPOLL_CTL_ADD, 50, {EPOLLIN|EPOLLERR|EPOLLHUP, {u32=50, u64=12923050543437316146}}) = 0
... and here it breaks
7417  15:04:53.776974 epoll_wait(4,  <unfinished ...>
7417  15:04:53.785786 <... epoll_wait resumed> [{EPOLLHUP, {u32=49, u64=12923050543437316145}}], 1023, 1878) = 1
7417  15:04:53.785876 --- SIGCHLD {si_signo=SIGCHLD, si_code=CLD_EXITED, si_pid=9481, si_uid=905, si_status=155, si_utime=4, si_stime=3} ---
7417  15:04:53.786073 read(49,  <unfinished ...>
... discontinuity ...
7417  15:09:24.338109 <... read resumed> 0x23f4f10, 4) = ? ERESTARTSYS (To be restarted if SA_RESTART is set)
7417  15:09:24.345507 epoll_ctl(4, EPOLL_CTL_DEL, 49, 0x7ffeecbf7660) = -1 ENOENT (No such file or directory)

Notice how epoll() admits that fd 49 is no longer in its fdset, yet it returned the number anyway.

The way to fix this is to make sure that all file descriptors being monitored by epoll() are closed in the worker. Currently, workers close the unwanted file descriptors that correspond to themselves, but don't touch the inherited file descriptors that relate to _other_ workers. The fixes are to mark the sentinel fd non-blocking as well, or to simply use poll() instead of epoll(). I'm testing the latter locally.

OK, it turns out that using poll() does not solve the problem, since the real problem is that a file descriptor gets closed while it is still registered in the hub. Once you realize that, it's easy to figure out exactly where it goes wrong, and the following hack fixes the problem for us. Without it we could reproduce the problem within an hour or two; with it we haven't been able to trigger it yet. The hack is not the right fix, but it really needs someone who knows the code better to write a proper solution.

--- /usr/local/lib/python2.7/dist-packages/celery/concurrency/asynpool.py~  2018-03-16 01:10:04.000000000 +0000
+++ /usr/local/lib/python2.7/dist-packages/celery/concurrency/asynpool.py   2018-03-16 12:08:58.058686003 +0000
@@ -735,6 +735,7 @@
             except KeyError:
                 pass
         self.on_inqueue_close = on_inqueue_close
+        self.hub_remove = hub_remove

         def schedule_writes(ready_fds, total_write_count=[0]):
             # Schedule write operation to ready file descriptor.
@@ -1241,6 +1242,7 @@
             if queue:
                 for sock in (queue._reader, queue._writer):
                     if not sock.closed:
+                        self.hub_remove(sock)
                         try:
                             sock.close()
                         except (IOError, OSError):

It looks like the write-queue fd was correctly removed from the hub (a few lines further up), but the read-queue fd was not.

No updates here? :(

@auvipy any thoughts on @kleptog's analysis above at https://github.com/celery/celery/issues/4185#issuecomment-373755206?

sorry, I don't have enough time at the moment to dig into the problem

@kleptog thanks for all the hard work

It looks like on_inqueue_close _itself_ calls hub_remove(fd), specifically at L744, similar to your example:

https://github.com/celery/celery/blob/b46bea25539cc26a76f0a491b95f4899f5b32c34/celery/concurrency/asynpool.py#L735-L745

... which is called in destroy_queues() at L1247, but only for queues[0].

https://github.com/celery/celery/blob/b46bea25539cc26a76f0a491b95f4899f5b32c34/celery/concurrency/asynpool.py#L1245-L1249

So is this statement really accurate?

The hack isn't the proper fix, but it really needs somebody who knows the code better to write a good solution.

I don't know enough about celery's internals to know why the code currently does this _only_ for queues[0]. It might be worth a pull request with your diff, at least.

IIRC, it was actually the other queue that was being left out, so maybe it's just a matter of doing an inqueue_close on the other queue as well. I won't have a chance to test it for a while, though.

@auvipy Sorry to be a pest, and I can sympathise with the often thankless task of doing free work on an open source project 😞: I work on several in my spare time, and on another for my day job.

Do you know whether you or any other celery maintainer has had a chance to take a look at this one? We have a notable number of celery users in this thread who have run into this debilitating issue since it was reported last summer, and even a few who have contributed analysis and possibly identified the area of code that is the culprit. Is there any sort of ETA on when this bug might be looked at?

@ryanpetrello @kleptog

queues in this context is an inq, outq, synq tuple returned by AsynPool.create_process_queues(). So queues[0] refers specifically to inq, and it makes sense that a function called on_inqueue_close is called only for it.

So right now, the only thing ever passed to hub_remove() is inq._writer: not its _reader, and not the outq or synq writers or readers, and ONLY when it is a key in fileno_to_inq whose value is proc.

I don't have much insight here. But there are also fileno_to_outq and fileno_to_synq dicts. Should we check their contents before calling hub_remove() on those other handles? Or should we be unconditionally hub_remove-ing everything? Maybe we should remove the hub_remove() call from on_inqueue_close() and just call it in destroy_queues() as in the diff above?

@kleptog
Thanks for the amazing work. We've had this issue for a while, and it had the unfortunate effect on our system that prefetched tasks were dropped. I found the same root cause, master and workers hung on an FD, but couldn't find the locking condition. Applying your fix worked, and our stress test now effortlessly processes ~50k tasks without hanging.

@AlexHill said:

Should we check their contents before calling hub_remove() on those other handles? Or should we be unconditionally hub_remove-ing everything? Maybe we should remove the hub_remove() call from on_inqueue_close() and just call it in destroy_queues() as in the diff above?

Quite simply, you must remove a file descriptor from the hub before closing it, because closing it will not remove it from the hub for you. So your last suggestion is quite good, since it is the most "obviously correct", and I'm a big fan of that.
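The ordering rule generalises beyond celery's hub. A small sketch with the stdlib selectors module, where a socketpair stands in for the worker queues:

```python
import selectors
import socket

sel = selectors.DefaultSelector()
a, b = socket.socketpair()
sel.register(a, selectors.EVENT_READ)

# Correct teardown order: unregister first, then close. Closing first
# leaves a stale entry in the selector's map (and, with the epoll
# backend, a kernel-side registration that may never be dropped if a
# child process still holds a duplicate of the fd).
sel.unregister(a)
a.close()
b.close()
print(len(sel.get_map()))  # 0: no stale registrations left behind
```

Reversing the two teardown steps is precisely the bug described in this thread: the selector would still believe the dead fd is being monitored.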

Using strace I see the following:

epoll_ctl(5, EPOLL_CTL_DEL, 16, 0x7ffe54f25410) = -1 ENOENT (No such file or directory)
epoll_ctl(5, EPOLL_CTL_DEL, 21, 0x7ffe54f25410) = -1 ENOENT (No such file or directory)
fcntl(26, F_GETFL) = 0x2 (flags O_RDWR)
fcntl(26, F_SETFL, O_RDWR|O_NONBLOCK) = 0
recvfrom(26, 0x7f42ca4409e4, 7, 0, NULL, NULL) = -1 EAGAIN (Resource temporarily unavailable)

Is this related to this issue?

Thanks

maybe the following link helps diagnose the problem: http://memos.ml/

I've had the same problem since upgrading to Ubuntu 16

I'm facing the same issue on 4.2.1.

try configuring celery with the socket timeout option as follows:

celery.config_from_object({'broker_transport_options': {'socket_timeout': 30}})

I had the same problem. After applying the patch above, though, the issue is resolved.

I'm also having this problem. Is there a pull request with the fix, or are people using a fork until it is resolved?

Our problem was resolved after adding TCP socket keepalive and timeouts (using redis)

BROKER_TRANSPORT_OPTIONS = {'socket_timeout': 60,
                            'socket_connect_timeout': 120,
                            'socket_keepalive': True,
                            'socket_keepalive_options': {socket.TCP_KEEPIDLE: 60,
                                 socket.TCP_KEEPINTVL: 60,
                                 socket.TCP_KEEPCNT: 3}}

@linar-jether: what is this socket in socket.TCP_KEEPIDLE ?

@nishith107 import socket

Changing redis broker settings is almost certainly unrelated to the problem in this thread, which is about getting stuck on a blocking read on the master-child communication channel.

Right. It's possible that tweaking broker settings introduces enough jitter to _sidestep_ the underlying problem, but the master-child IPC hang is a distinct bug.

@auvipy this was originally reported a year ago today. Others and I have left detailed information and analysis of where the code appears to hang, including strace output narrowing it down to a few lines of code. Somebody has even posted a link to a patch (https://github.com/celery/celery/issues/4185#issuecomment-373755206) that people appear to be using and deploying (in production?) as a workaround. Many of the people in this comment thread still seem to hit this regularly across a wide range of celery versions and broker configurations.

I appreciate that this is an open source project and everybody involved is a volunteer, so I don't mean to nag endlessly. Do you know whether anybody has made an effort to investigate or diagnose this critical issue?

It would be easier for us to fix issues if one of us were facing them in production. Since some of you have used a workaround to get rid of this problem, why not send a PR with appropriate test cases? celery also has issues that are 5-6 years old :(

Believe me, it's really hard to deal with every issue that comes in. Please take on the responsibility of submitting a fix: since some of you are facing this in production, it would be easier for you to verify the proposed changes in a real environment.

@linar-jether Thanks, this solution worked for me :)

Could you take a look at this PR https://github.com/celery/celery/pull/4997 ? If it fixes your issue, we could merge it.

I'll try this branch. It will take me a while to know whether it's sorted, since I haven't found a way to force the issue to occur.

Any news on testing this bugfix?

I haven't had the issue for over 2 weeks now. Not 100% sure yet, but I believe the suggested PR has fixed it for me.

We've just deployed the 4.2 branch with this patch applied, as a custom wheel, on 3 client machines... it's now in production. This bug had been crippling workers once a week for months. It's deployed. We'll see whether it avoids the issue.

so I'm merging the PR https://github.com/celery/celery/pull/4997 . Please keep us posted on your production issues here, and if anybody has suggestions for unit/integration tests, go ahead with a PR.

@ryanpetrello wondering, did you deploy this patch and test it?

@AvnerCohen due to this and a few other issues, we ended up just solving our problem without using celery (we were able to accomplish what we needed with plain old kombu). It sounds like others have had luck with this patch, though!

So far no more problems since we deployed it to production. I believe it fixed it.

great, thanks!

@auvipy when should we expect a formal release that includes this patch?

my target is before Christmas

Thanks @kleptog for some excellent debugging!

I've tried all the suggested fixes, up to running Celery and friends from master, but as soon as a worker hits a ConnectionResetError it refuses to process any more tasks. Restarting RabbitMQ makes the stuck workers process tasks again. Naturally this is not a good solution, but it points me in a direction for further debugging.

We're using this patch, but still run into this problem from time to time: the worker successfully finishes its job while the parent process is stuck waiting for results.

I'm using celery==4.2.1, python==3.6.7, Ubuntu 16.04, and Redis as the backend.
I used supervisor to start celery with the command:
celery -A project worker -l info --autoscale=10,2

It still gets stuck sometimes.

@mikolaje what works for me is setting tcp-keepalive in redis.conf to resolve the deadlock.

Has anyone found a good way to monitor for stuck workers?

We're still seeing this behaviour with code from master. I can't tell whether it has improved or not, but the initial feeling is that, at least for our use case, we still get stuck workers.

The question is how to monitor the individual workers. Right now all we can do is monitor the queue level (messages present and not picked up), but I'd like to catch this much earlier, to see when a worker is no longer working.
What I've noticed is that celery's "status" command returns OK regardless, and flower likewise shows that everything is fine.
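On the monitoring question: since `celery status` only exercises the consumer side (which stays responsive here while the pool is deadlocked), an end-to-end canary task is one of the few signals that actually catches this. A hedged sketch of the shape such a watchdog could take; the Celery wiring mentioned in the docstring (a no-op `ping_task` of your own) is hypothetical, and the function itself is deliberately broker-agnostic:

```python
def canary_ok(submit, wait_result):
    """Submit a trivial job and report whether it completed in time.

    `submit` enqueues the canary and returns a handle; `wait_result`
    blocks on the handle with its own timeout. With Celery these would
    be something like ``lambda: ping_task.apply_async(queue=q)`` and
    ``lambda h: h.get(timeout=30)``, where ``ping_task`` is a no-op
    task you define yourself. A worker hit by this bug accepts
    (prefetches) the task but never runs it, so the wait times out and
    the worker is reported as stuck.
    """
    try:
        handle = submit()
        wait_result(handle)
    except Exception:
        return False
    return True

# Dummy wiring, just to show the call shape:
print(canary_ok(lambda: "handle", lambda h: None))  # → True
```

Running one canary per queue on a schedule, and alerting on False, would flag a wedged pool long before messages visibly pile up in the broker.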

I've tried all the suggested fixes, up to running Celery and friends from master, but as soon as a worker hits a ConnectionResetError it refuses to process any more tasks. Restarting RabbitMQ makes the stuck workers process tasks again. Naturally this is not a good solution, but it points me in a direction for further debugging.

@hedleyroos could you also be hitting this issue #4895? We're running Celery v4.1.x to avoid that problem.

@kingb yes, I have that issue. I wanted to avoid downgrading to 4.1 and switched to Redis as the broker, but still hit the same problem. Fortunately Redis has a tcp-keepalive option, which I've set to 60 seconds. This means Redis will terminate what it considers stale connections, which in turn makes the worker break out of the deadlock and start consuming tasks again.

I get the freeze with celery==4.2.1, billiard==3.5.0.5, kombu==4.2.2.post1, amqp==2.4.0, Python 3.6.8, RabbitMQ, MongoDB v4

@tonal I'm using the same versions as you, and it sometimes hangs for me too.

it doesn't happen all the time, but it happens every few hours / every few days, and I have to purge the messages and restart the service.

4.3? 4.4rc2?

No, celery==4.2.1, billiard==3.5.0.5, kombu==4.2.2.post1, amqp==2.4.0,

@auvipy still seeing it die after ~1 hour with 4.3. If it's any help: it stops succeeding or failing tasks and just keeps receiving them, then simply fails silently, while the celery process keeps running.

/code # celery status
celery@celery-worker: OK

1 node online.

Can anyone assure me this is really fixed in 4.3? I have to make an upgrade decision based on this.

Hi all,

This was not fixed in 4.3. I'm using that version in a project, and my workers still get stuck (after printing "missed heartbeat" messages).

Has anybody worked around this with 4.3? I noticed the code is significantly different from the earlier fix/patch, so I assume there was an attempt to address the issue, but it doesn't appear fixed.

Thanks!

I can confirm I have the same problem on Celery 4.3 (Python 3.7.3).

could you try celery==4.4.0rc3?

@auvipy tried that too, but it didn't help. Just FYI, I'm using gevent

Any chance this issue could be reopened? It seems people are still running into it on the most recent celery versions.

I'm facing the same issue. Python 2.7, Celery 4.1.0. I use the SQS broker. Celery processes tasks for a while and then hangs abruptly.

I'm facing the same issue. Python 2.7, Celery 4.1.0. I use the SQS broker. Celery processes tasks for a while and then hangs abruptly.

you should try the newer versions.

We haven't been able to reproduce this issue, so somebody will have to debug it for us and, if possible, provide a fix for it.

We see this daily in production. We've never managed to recreate it in a development or debug-enabled environment.
Any suggestions on what kind of information we could provide that might help?

I suggest using something like python-hunter and/or python-manhole.
The problem is most likely in the master process.
Alternatively, use strace to see which socket/pipe is being read from or written to, and try to deduce things from there.
I could come by your offices next week if you like. Let's discuss this over the phone.
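In the same spirit as manhole, the stdlib faulthandler module can be wired up so that a hung master dumps every thread's Python stack on a signal, with no debugger attached. A sketch; SIGUSR2 is an arbitrary choice here, since (as noted at the top of this issue) celery already attaches its own handler to SIGUSR1:

```python
import faulthandler
import signal

# After this, `kill -USR2 <pid>` makes the process write the Python
# stack of every thread to stderr, even while the main thread is
# blocked in a read() like the one described in this thread.
faulthandler.register(signal.SIGUSR2, all_threads=True)
```

This is cheap enough to leave enabled in production, which matters for a bug that only reproduces there.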

Thanks, I'll try that. Just as a general example, this is what the stuck processes look like from ps's point of view:

username 14182     1  1 03:12 ?        00:02:28 [celeryd:  workername@hostname:MainProcess] -active- (moshe)
username 15487 14182  0 03:15 ?        00:00:12 [celeryd:  workername@hostname:ForkPoolWorker-5]
username 15766 14182  0 03:18 ?        00:00:12 [celeryd:  workername@hostname:ForkPoolWorker-14]
username 16666 14182  0 03:52 ?        00:00:10 [celeryd:  workername@hostname:ForkPoolWorker-43]
username 20253 14182 65 06:45 ?        00:00:24 [celeryd:  workername@hostname:ForkPoolWorker-110]

Concurrency is set to 4, with "max-tasks-per-child" set so that each worker restarts every X messages.
As you can clearly see, while 20253 is still active (and the worker counter keeps increasing), 15487 died early and is stuck with:

$ strace -p 15487
strace: Process 15487 attached
futex(0x2b36980, FUTEX_WAIT_PRIVATE, 0, NULL

The master is still active, and messages are still delivered to the worker that is still alive.

Just leaving this here in case it helps somebody else. It's not clear whether I had exactly the same problem, but my workers were also hanging after processing several hundred/thousand tasks. I was launching Celery as a service using:

celery multi start w1 w2 -A app.celery --time-limit=21600 --concurrency=4

After unsuccessfully troubleshooting workers getting stuck with that, I changed the command to a plain:

celery worker -A app.celery --time-limit=21600 --concurrency=4

And this fixed it for me! When I ran Celery this way manually, I could watch tasks come in and run; tasks occasionally got stuck (and the worker then restarted correctly). I believe the previous invocation was not restarting workers properly after a task failed. The other change I made was to set a lower time limit on the tasks that run hundreds/thousands of times, so that when a worker inevitably got stuck, the main worker process could kill it and restart a fresh one fairly quickly, before they all ended up stuck.

I don't know enough about celery's internals to debug this issue effectively, but if somebody can give some guidance or places to look, I can investigate on our side when the problem occurs.

I can't recreate this issue on demand, but it happens over time at one of our sites. From what I've been able to trace, the problem occurs when there is a network drop between the worker and the RabbitMQ server while the workers are under load. The failure essentially blocks the main process from accepting new jobs.

If the system is not under load, everything keeps working correctly. It has to be a combination of load on the worker and a network drop at the same time. This has been confirmed two ways. First, a secondary staging system with no load at the time sees the mingle failures but keeps working correctly afterwards. Second, we run several workers per box, and only the workers that are running jobs fail: if load is light and not every main process is processing jobs, only some of the main processes will fail.

Our launch command:

celery worker -n XXX-%I.${CELERY_HOST_NAME} \
    -A XXX.app --pidfile=/run/celery/XXX-%I.pid \
    --autoscale=10,3 \
    --max-tasks-per-child 1\
            -Q YYYl \
    --loglevel=INFO --logfile=/var/log/XXX/XXX-%I${CELERY_PROCESS_INDEX_SEPERATOR}.log --soft-time-limit=7200 --time-limit

Log output:

{"event": "mingle: sync with 54 nodes", "logger": "celery.worker.consumer.mingle", "level": "info", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "mingle: processing reply from celery@XXXX", "logger": "celery.worker.consumer.mingle", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
...
{"event": "mingle: processing reply from celery@XXX", "logger": "celery.worker.consumer.mingle", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "mingle: sync complete", "logger": "celery.worker.consumer.mingle", "level": "info", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "^-- substep ok", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Consumer: Starting Gossip", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "using channel_id: 2", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Channel open", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:31:24", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "^-- substep ok", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:25", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Consumer: Starting Tasks", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:25", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "^-- substep ok", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:26", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Consumer: Starting Control", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:26", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "using channel_id: 3", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:31:26", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Channel open", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:31:26", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "^-- substep ok", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:27", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Consumer: Starting event loop", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:27", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Worker: Hub.register Autoscaler...", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:27", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "| Worker: Hub.register Pool...", "logger": "celery.bootsteps", "level": "debug", "timestamp": "2019-11-17 22:31:27", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "basic.qos: prefetch_count->3", "logger": "kombu.common", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Received task: XXX[ec8092ac-9b3c-4081-b11e-2d93aeac3bf1]  ", "logger": "celery.worker.strategy", "level": "info", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "TaskPool: Apply <function _fast_trace_task at 0x7fb3d5399bf8> (args:('XXX', 'ec8092ac-9b3c-4081-b11e-2d93aeac3bf1', {'lang': 'py', 'task': 'XXX', 'id': 'ec8092ac-9b3c-4081-b11e-2d93aeac3bf1', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 'ec8092ac-9b3c-4081-b11e-2d93aeac3bf1', 'parent_id': None, 'argsrepr': '()', 'kwargsrepr': \"xxx}\", 'origin': 'gen49433@XXX', 'reply_to':... kwargs:{})", "logger": "celery.pool", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Received task: XXX[4e3acf3a-c0ec-4be3-8add-3e76d047ec09]  ", "logger": "celery.worker.strategy", "level": "info", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "TaskPool: Apply <function _fast_trace_task at 0x7fb3d5399bf8> (args:('XXX', '4e3acf3a-c0ec-4be3-8add-3e76d047ec09', {'lang': 'py', 'task': 'XXX', 'id': '4e3acf3a-c0ec-4be3-8add-3e76d047ec09', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': '4e3acf3a-c0ec-4be3-8add-3e76d047ec09', 'parent_id': None, 'argsrepr': '()', 'kwargsrepr': \"{'xxx}\", 'origin': 'gen49438@XXX', 'reply_to':... kwargs:{})", "logger": "celery.pool", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Received task: XXX[d5a6ada9-b9f5-435d-84cb-4dd6d7e21020]  ", "logger": "celery.worker.strategy", "level": "info", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "TaskPool: Apply <function _fast_trace_task at 0x7fb3d5399bf8> (args:('XXX', 'd5a6ada9-b9f5-435d-84cb-4dd6d7e21020', {'lang': 'py', 'task': 'XXX', 'id': 'd5a6ada9-b9f5-435d-84cb-4dd6d7e21020', 'shadow': None, 'eta': None, 'expires': None, 'group': None, 'retries': 0, 'timelimit': [None, None], 'root_id': 'd5a6ada9-b9f5-435d-84cb-4dd6d7e21020', 'parent_id': None, 'argsrepr': '()', 'kwargsrepr': \"{xxx}\", 'origin': 'gen49438@XXX', 'reply_to':... kwargs:{})", "logger": "celery.pool", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "celery@XXX left", "logger": "celery.worker.consumer.gossip", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task accepted: XXX[4e3acf3a-c0ec-4be3-8add-3e76d047ec09] pid:111546", "logger": "celery.worker.request", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task accepted: XXX[ec8092ac-9b3c-4081-b11e-2d93aeac3bf1] pid:111544", "logger": "celery.worker.request", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task accepted: XXX[d5a6ada9-b9f5-435d-84cb-4dd6d7e21020] pid:111538", "logger": "celery.worker.request", "level": "debug", "timestamp": "2019-11-17 22:31:28", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "missed heartbeat from celery@XXX", "logger": "celery.worker.consumer.gossip", "level": "info", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
...
{"event": "missed heartbeat from celery@XXXX", "logger": "celery.worker.consumer.gossip", "level": "info", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task handler raised error: WorkerLostError('Worker exited prematurely: exitcode 155.')", "exc_info": ["<class 'billiard.exceptions.WorkerLostError'>", "WorkerLostError('Worker exited prematurely: exitcode 155.')", "<billiard.einfo.Traceback object at 0x7fb3cec22d68>"], "logger": "celery.worker.request", "level": "error", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task handler raised error: WorkerLostError('Worker exited prematurely: exitcode 155.')", "exc_info": ["<class 'billiard.exceptions.WorkerLostError'>", "WorkerLostError('Worker exited prematurely: exitcode 155.')", "<billiard.einfo.Traceback object at 0x7fb3cec1c5f8>"], "logger": "celery.worker.request", "level": "error", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "Task handler raised error: WorkerLostError('Worker exited prematurely: exitcode 155.')", "exc_info": ["<class 'billiard.exceptions.WorkerLostError'>", "WorkerLostError('Worker exited prematurely: exitcode 155.')", "<billiard.einfo.Traceback object at 0x7fb3cec1c860>"], "logger": "celery.worker.request", "level": "error", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "heartbeat_tick : for connection ebf9e86b69a347aaae81a92712b65eb9", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}
{"event": "heartbeat_tick : Prev sent/recv: None/None, now - 31/490, monotonic - 4090172.991839465, last_heartbeat_sent - 4090172.991832662, heartbeat int. - 60 for connection ebf9e86b69a347aaae81a92712b65eb9", "logger": "amqp", "level": "debug", "timestamp": "2019-11-17 22:34:17", "code_version": "v1.0.0-168-ga02e6b6-dirty"}

py-bt stack trace of the master process:

 <built-in method read of module object at remote 0x7fb3e61f33b8>
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py", line 64, in __read__
    chunk = read(fd, size)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py", line 241, in _recv_message
    fd, bufv[Hr:] if readcanbuf else bufv, 4 - Hr,
  <built-in method next of module object at remote 0x7fb3e627ac28>
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py", line 298, in on_result_readable
    next(it)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/hub.py", line 362, in create_loop
    cb(*cbargs)
  <built-in method next of module object at remote 0x7fb3e627ac28>
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py", line 91, in asynloop
    next(loop)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 596, in start
    c.loop(*c.loop_args())
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 318, in start
    blueprint.start(self)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 369, in start
    return self.obj.start()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/worker.py", line 205, in start
    self.blueprint.start(self)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py", line 258, in run
    worker.start()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py", line 252, in __call__
    ret = self.run(*args, **kwargs)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py", line 223, in run_from_argv
    return self(*args, **options)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 420, in execute
    ).run_from_argv(self.prog_name, argv[1:], command=argv[0])
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 488, in handle_argv
    return self.execute(command, argv)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py", line 298, in execute_from_commandline
    return self.handle_argv(self.prog_name, argv[1:])
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 496, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 322, in main
    cmd.execute_from_commandline(argv)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/__main__.py", line 16, in main
    _main()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery", line 8, in <module>
    sys.exit(main())

bt stack trace of the master process:

#0  0x00007fb3e5c2e6e0 in __read_nocancel () from /lib64/libpthread.so.0
#1  0x0000000000542a85 in _Py_read (fd=fd@entry=56, buf=0x7fb3cecbad70, count=count@entry=4) at Python/fileutils.c:1407
#2  0x000000000054b1a9 in os_read_impl (module=0x0, length=4, fd=56) at ./Modules/posixmodule.c:8043
#3  os_read (module=module@entry=<module at remote 0x7fb3e61f33b8>, args=args@entry=0x7fb3ceb7f3b8, nargs=<optimized out>)
    at ./Modules/clinic/posixmodule.c.h:3625
#4  0x000000000043bddd in _PyMethodDef_RawFastCallKeywords (kwnames=0x0, nargs=0, args=0x7fb3ceb7f3b8,
    self=<module at remote 0x7fb3e61f33b8>, method=0x8c7ec0 <posix_methods+2848>) at Objects/call.c:651
#5  _PyCFunction_FastCallKeywords (func=<built-in method read of module object at remote 0x7fb3e61f33b8>,
    args=args@entry=0x7fb3ceb7f3b8, nargs=nargs@entry=2, kwnames=kwnames@entry=0x0) at Objects/call.c:730
#6  0x00000000004280bd in call_function (kwnames=0x0, oparg=2, pp_stack=<synthetic pointer>) at Python/ceval.c:4568
#7  _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#8  0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3ceb7f218, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py, line 64, in __read__ (fd=56, buf=<_io.BytesIO at remote 0x7fb3ceca0f10>, size=4, read=<built-in method read of module object at remote 0x7fb3e61f33b8>)) at Python/ceval.c:547
#9  _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf537420>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=3, kwnames=0x0, kwargs=kwargs@entry=0x347a2b8, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3cf3074f8, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='__read__',
    qualname='__read__') at Python/ceval.c:3930
#10 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#11 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#12 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#13 0x00000000005c043d in gen_send_ex (closing=0, exc=0, arg=0x0, gen=0x7fb3cebfb138) at Objects/genobject.c:221
#14 gen_iternext (gen=0x7fb3cebfb138) at Objects/genobject.c:542
#15 0x00000000004e703b in builtin_next (self=self@entry=<module at remote 0x7fb3e627ac28>, args=args@entry=0x3481f10,
    nargs=<optimized out>) at Python/bltinmodule.c:1426
#16 0x000000000043bddd in _PyMethodDef_RawFastCallKeywords (kwnames=0x0, nargs=0, args=0x3481f10,
    self=<module at remote 0x7fb3e627ac28>, method=0x8ba020 <builtin_methods+992>) at Objects/call.c:651
#17 _PyCFunction_FastCallKeywords (func=<built-in method next of module object at remote 0x7fb3e627ac28>, args=args@entry=0x3481f10,
    nargs=nargs@entry=1, kwnames=kwnames@entry=0x0) at Objects/call.c:730
#18 0x00000000004280bd in call_function (kwnames=0x0, oparg=1, pp_stack=<synthetic pointer>) at Python/ceval.c:4568
#19 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#20 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x3481d68, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py, line 298, in on_result_readable (fileno=56, it=<generator at remote 0x7fb3cebfb138>))
    at Python/ceval.c:547
#21 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3cf2ed300>,
    globals=globals@entry={'__name__': 'celery.concurrency.asynpool', '__doc__': "Version of multiprocessing.Pool using Async I/O.\n\n.. note::\n\n    This module will be moved soon, so don't use it directly.\n\nThis is a non-blocking version of :class:`multiprocessing.Pool`.\n\nThis code deals with three major challenges:\n\n#. Starting up child processes and keeping them running.\n#. Sending jobs to the processes and receiving results back.\n#. Safely shutting down this system.\n", '__package__': 'celery.concurrency', '__loader__': <SourceFileLoader(name='celery.concurrency.asynpool', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py') at remote 0x7fb3cf2db828>, '__spec__': <ModuleSpec(name='celery.concurrency.asynpool', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py', loader_state=None, submodule_search_locations=N...(truncated),
    locals=locals@entry=0x0, args=args@entry=0x7fb3cec173a8, argcount=argcount@entry=1, kwnames=kwnames@entry=0x0,
    kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0, kwstep=kwstep@entry=2, defs=0x0, defcount=0, kwdefs=0x0,
    closure=(<cell at remote 0x7fb3ceb5f9a8>, <cell at remote 0x7fb3ceb5f9d8>, <cell at remote 0x7fb3ceb5fa08>, <cell at remote 0x7fb3ceb5fa38>, <cell at remote 0x7fb3ceb5fb58>), name='on_result_readable',
    qualname='ResultHandler._make_process_result.<locals>.on_result_readable') at Python/ceval.c:3930
#22 0x000000000043b027 in _PyFunction_FastCallDict (func=<function at remote 0x7fb3cec260d0>, args=args@entry=0x7fb3cec173a8,
    nargs=1, kwargs=kwargs@entry=0x0) at Objects/call.c:376
#23 0x000000000043c713 in PyObject_Call (callable=callable@entry=<function at remote 0x7fb3cec260d0>, args=args@entry=(56,),
    kwargs=kwargs@entry=0x0) at Objects/call.c:226
#24 0x0000000000422479 in do_call_core (kwdict=0x0, callargs=(56,), func=<function at remote 0x7fb3cec260d0>) at Python/ceval.c:4645
#25 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#26 0x00000000005c043d in gen_send_ex (closing=0, exc=0, arg=0x0, gen=0x7fb3cf275de0) at Objects/genobject.c:221
#27 gen_iternext (gen=0x7fb3cf275de0) at Objects/genobject.c:542
#28 0x00000000004e703b in builtin_next (self=self@entry=<module at remote 0x7fb3e627ac28>, args=args@entry=0x347abd0,
    nargs=<optimized out>) at Python/bltinmodule.c:1426
#29 0x000000000043bddd in _PyMethodDef_RawFastCallKeywords (kwnames=0x0, nargs=0, args=0x347abd0,
    self=<module at remote 0x7fb3e627ac28>, method=0x8ba020 <builtin_methods+992>) at Objects/call.c:651
#30 _PyCFunction_FastCallKeywords (func=<built-in method next of module object at remote 0x7fb3e627ac28>, args=args@entry=0x347abd0,
    nargs=nargs@entry=1, kwnames=kwnames@entry=0x0) at Objects/call.c:730
#31 0x00000000004280bd in call_function (kwnames=0x0, oparg=1, pp_stack=<synthetic pointer>) at Python/ceval.c:4568
#32 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#33 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x347a9d8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py, line 91, in asynloop (obj=<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.l...(truncated)) at Python/ceval.c:547
#34 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3cf2c6d20>,
    globals=globals@entry={'__name__': 'celery.worker.loops', '__doc__': 'The consumers highly-optimized inner loop.', '__package__': 'celery.worker', '__loader__': <SourceFileLoader(name='celery.worker.loops', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py') at remote 0x7fb3cf2cd3c8>, '__spec__': <ModuleSpec(name='celery.worker.loops', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/__pycache__/loops.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf2cd400>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py', '__cached_...(truncated),
    locals=locals@entry=0x0, args=args@entry=0x7fb3cebfba38, argcount=argcount@entry=9, kwnames=kwnames@entry=0x0,
    kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0, kwstep=kwstep@entry=2, defs=0x7fb3cf2cd5d8, defcount=1, kwdefs=0x0,
    closure=0x0, name='asynloop', qualname='asynloop') at Python/ceval.c:3930
#35 0x000000000043b027 in _PyFunction_FastCallDict (func=<function at remote 0x7fb3cf2b1c80>, args=args@entry=0x7fb3cebfba38,
    nargs=9, kwargs=kwargs@entry=0x0) at Objects/call.c:376
#36 0x000000000043c713 in PyObject_Call (callable=callable@entry=<function at remote 0x7fb3cf2b1c80>,
    args=args@entry=(<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker...(truncated),
    kwargs=kwargs@entry=0x0) at Objects/call.c:226
#37 0x0000000000422479 in do_call_core (kwdict=0x0,
    callargs=(<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker...(truncated),
    func=<function at remote 0x7fb3cf2b1c80>) at Python/ceval.c:4645
#38 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#39 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
    at Objects/call.c:283
#40 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#41 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#42 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#43 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
    at Objects/call.c:283
#44 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#45 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#46 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#47 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#48 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#49 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#50 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#51 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
    at Objects/call.c:283
#52 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#53 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#54 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#55 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
    at Objects/call.c:283
#56 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#57 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#58 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#59 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#60 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#61 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#62 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#63 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x2c6b3a8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py, line 258, in run (self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at ...(truncated)) at Python/ceval.c:547
#64 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3d89706f0>,
    globals=globals@entry={'__name__': 'celery.bin.worker', '__doc__': "Program used to start a Celery worker instance.\n\nThe :program:`celery worker` command (previously known as ``celeryd``)\n\n.. program:: celery worker\n\n.. seealso::\n\n    See :ref:`preload-options`.\n\n.. cmdoption:: -c, --concurrency\n\n    Number of child processes processing the queue.  The default\n    is the number of CPUs available on your system.\n\n.. cmdoption:: -P, --pool\n\n    Pool implementation:\n\n    prefork (default), eventlet, gevent or solo.\n\n.. cmdoption:: -n, --hostname\n\n    Set custom hostname (e.g., 'w1@%%h').  Expands: %%h (hostname),\n    %%n (name) and %%d, (domain).\n\n.. cmdoption:: -B, --beat\n\n    Also run the `celery beat` periodic task scheduler.  Please note that\n    there must only be one instance of this service.\n\n    .. note::\n\n        ``-B`` is meant to be used for development purposes. For production\n        environment, you need to start :program:`celery beat` separately.\n\n.. cmdoption:: -Q, --queues\n\n    L...(truncated),
    locals=locals@entry=0x0, args=args@entry=0x7ffdbea26e60, argcount=argcount@entry=1, kwnames=kwnames@entry=0x2c6b830,
    kwargs=kwargs@entry=0x2c6b838, kwcount=78, kwcount@entry=39, kwstep=kwstep@entry=2, defs=0x7fb3d89cf8d0, defcount=9, kwdefs=0x0,
    closure=0x0, name='run', qualname='worker.run') at Python/ceval.c:3930
#65 0x000000000043b027 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3d8979620>,
    args=args@entry=0x7ffdbea26e60, nargs=nargs@entry=1,
    kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:376
#66 0x000000000043e749 in _PyObject_FastCallDict (
    kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, nargs=1, args=0x7ffdbea26e60, callable=<function at remote 0x7fb3d8979620>) at Objects/call.c:98
#67 _PyObject_Call_Prepend (callable=<function at remote 0x7fb3d8979620>, obj=<optimized out>, args=(),
    kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None})
    at Objects/call.c:904
#68 0x000000000043c638 in PyObject_Call (callable=callable@entry=<method at remote 0x7fb3e6238d08>, args=args@entry=(),
    kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:245
#69 0x0000000000422479 in do_call_core (
    kwdict={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, callargs=(), func=<method at remote 0x7fb3e6238d08>) at Python/ceval.c:4645
#70 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#71 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3d5414a48, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py, line 252, in __call__ (self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread....(truncated)) at Python/ceval.c:547
#72 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3e6172300>,
    globals=globals@entry={'__name__': 'celery.bin.base', '__doc__': 'Base command-line interface.', '__package__': 'celery.bin', '__loader__': <SourceFileLoader(name='celery.bin.base', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py') at remote 0x7fb3e6154748>, '__spec__': <ModuleSpec(name='celery.bin.base', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/__pycache__/base.cpython-37.pyc', _initializing=False) at remote 0x7fb3e6154780>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py', '__cached__': '/app/aep/.local/share/virtualenvs/a02e6b...(truncated),
    locals=locals@entry=0x0, args=args@entry=0x7ffdbea271d0, argcount=argcount@entry=1, kwnames=kwnames@entry=0x2c69670,
    kwargs=kwargs@entry=0x2c69678, kwcount=78, kwcount@entry=39, kwstep=kwstep@entry=2, defs=0x0, defcount=0, kwdefs=0x0,
    closure=0x0, name='__call__', qualname='Command.__call__') at Python/ceval.c:3930
#73 0x000000000043b027 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3d89a1a60>,
    args=args@entry=0x7ffdbea271d0, nargs=nargs@entry=1,
    kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:376
#74 0x000000000043e749 in _PyObject_FastCallDict (
    kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, nargs=1, args=0x7ffdbea271d0, callable=<function at remote 0x7fb3d89a1a60>) at Objects/call.c:98
#75 _PyObject_Call_Prepend (callable=callable@entry=<function at remote 0x7fb3d89a1a60>,
    obj=obj@entry=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated),
    args=args@entry=(),
    kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:904
#76 0x0000000000494665 in slot_tp_call (
    self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated), args=(),
    kwds={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None})
    at Objects/typeobject.c:6379

#77 0x000000000043c638 in PyObject_Call (
    callable=callable@entry=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated),
    args=args@entry=(),
    kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:245

#78 0x0000000000422479 in do_call_core (

    kwdict={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, callargs=(),
    func=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated)) at Python/ceval.c:4645
#79 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#80 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3d7a13808, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py, line 223, in run_from_argv (prog_name='celery', argv=['-n', 'ae-worker-3.%h', '-A', 'ae_worker.app', '--pidfile=/run/celery/ae-worker-3.pid', '--autoscale=10,3', '--max-tasks-per-child', '1', '-Q', 'bgl', '--loglevel=INFO', '--logfile=/var/log/ae-worker/ae-worker-3%I.log', '--soft-time-limit=7200', '--time-limit=7300', '-Ofair', '--max-tasks-per-child', '1', '-E'], command='worker', options={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child':...(truncated)) at Python/ceval.c:547
#81 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8970300>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=3, kwnames=0x7fb3d8a12f78, kwargs=kwargs@entry=0x7fb3d54143f0, kwcount=1, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3d8973660, defcount=defcount@entry=2, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
    name='run_from_argv', qualname='worker.run_from_argv') at Python/ceval.c:3930
#82 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#83 0x0000000000428c25 in call_function (kwnames=('command',), oparg=<optimized out>, pp_stack=<synthetic pointer>)
    at Python/ceval.c:4616
#84 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3139

#85 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3d5414248, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 420, in execute (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<...(truncated)) at Python/ceval.c:547
#86 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d89f7540>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=3, kwnames=0x0, kwargs=kwargs@entry=0x2957570, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3d89a54f8, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='execute',
    qualname='CeleryCommand.execute') at Python/ceval.c:3930
#87 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#88 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#89 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#90 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x29573b8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 488, in handle_argv (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_...(truncated)) at Python/ceval.c:547
#91 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8a045d0>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=3, kwnames=0x0, kwargs=kwargs@entry=0x7fb3d897c1f0, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x0, defcount=defcount@entry=0, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='handle_argv',
    qualname='CeleryCommand.handle_argv') at Python/ceval.c:3930
#92 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#93 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#94 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#95 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3d897c048, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py, line 298, in execute_from_commandline (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _f...(truncated)) at Python/ceval.c:547
#96 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3e6172420>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x7fb3e60f0f18, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3e6174bc0, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
    name='execute_from_commandline', qualname='Command.execute_from_commandline') at Python/ceval.c:3930
#97 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#98 0x0000000000428b80 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#99 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3093
#100 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3e60f0d68, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 496, in execute_from_commandline (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, ...(truncated)) at Python/ceval.c:547
#101 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8a06780>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x274a150, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3d89a55a0, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0,
    closure=closure@entry=(<cell at remote 0x7fb3d895e5e8>,), name='execute_from_commandline',
    qualname='CeleryCommand.execute_from_commandline') at Python/ceval.c:3930
#102 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#103 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#104 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#105 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#106 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#107 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#108 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#109 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=0, globals=<optimized out>)
    at Objects/call.c:283
#110 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#111 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#112 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#113 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x26c8708, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery, line 8, in <module> ()) at Python/ceval.c:547
#114 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3de9fb300>,
    globals=globals@entry=<unknown at remote 0x7fb3de9e4648>,
    locals=locals@entry='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery',
    args=args@entry=0x0, argcount=argcount@entry=0, kwnames=kwnames@entry=0x0, kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0,
    kwstep=kwstep@entry=2, defs=defs@entry=0x0, defcount=defcount@entry=0, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
    name=name@entry=0x0, qualname=qualname@entry=0x0) at Python/ceval.c:3930
#115 0x00000000004ebbc0 in PyEval_EvalCodeEx (closure=0x0, kwdefs=0x0, defcount=0, defs=0x0, kwcount=0, kws=0x0, argcount=0,
    args=0x0,
    locals=locals@entry='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery',
    globals=globals@entry=<unknown at remote 0x7fb3de9e4648>, _co=_co@entry=<code at remote 0x7fb3de9fb300>) at Python/ceval.c:3959
#116 PyEval_EvalCode (co=co@entry=<code at remote 0x7fb3de9fb300>,
    globals=globals@entry={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
    locals=locals@entry={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>}) at Python/ceval.c:524
#117 0x00000000005265f8 in run_mod (arena=0x7fb3e6282078, flags=0x7ffdbea285f0,
    locals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
    globals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
    filename='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', mod=0x26c1cc8)
    at Python/pythonrun.c:1035
#118 PyRun_FileExFlags (fp=0x26f05c0, filename_str=<optimized out>, start=<optimized out>,
    globals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
    locals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>}, closeit=1, flags=0x7ffdbea285f0) at Python/pythonrun.c:988
#119 0x00000000005267dd in PyRun_SimpleFileExFlags (fp=0x26f05c0, filename=<optimized out>, closeit=1, flags=0x7ffdbea285f0)
    at Python/pythonrun.c:429
#120 0x000000000042fdfd in pymain_run_file (p_cf=0x7ffdbea285f0,
    filename=0x2655d10 L"/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery",
    fp=0x26f05c0) at Modules/main.c:427
#121 pymain_run_filename (cf=0x7ffdbea285f0, pymain=0x7ffdbea286d0) at Modules/main.c:1627
#122 pymain_run_python (pymain=0x7ffdbea286d0) at Modules/main.c:2877
#123 pymain_main (pymain=pymain@entry=0x7ffdbea286d0) at Modules/main.c:3038
#124 0x00000000004300cd in _Py_UnixMain (argc=<optimized out>, argv=<optimized out>) at Modules/main.c:3073
#125 0x00007fb3e516c495 in __libc_start_main () from /lib64/libc.so.6
#126 0x0000000000429df4 in _start ()

py-bt stack trace of the forked process:

 Traceback (most recent call first):
  <built-in method read of module object at remote 0x7fb3e61f33b8>
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py", line 422, in _recv
    chunk = read(handle, remaining)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py", line 456, in _recv_bytes
    buf = self._recv(4)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py", line 243, in recv_bytes
    buf = self._recv_bytes(maxlength)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/queues.py", line 355, in get_payload
    return self._reader.recv_bytes()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 445, in _recv
    return True, loads(get_payload())
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 473, in receive
    ready, req = _receive(1.0)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 351, in workloop
    req = wait_for_job()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 292, in __call__
    sys.exit(self.workloop(pid=pid))
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/process.py", line 327, in _bootstrap
    self.run()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/popen_fork.py", line 79, in _launch
    code = process_obj._bootstrap()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/popen_fork.py", line 24, in __init__
    self._launch(process_obj)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/context.py", line 333, in _Popen
    return Popen(process_obj)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/process.py", line 124, in start
    self._popen = self._Popen(self)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 1158, in _create_worker_process
    w.start()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py", line 445, in _create_worker_process
    return super(AsynPool, self)._create_worker_process(i)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 1328, in _repopulate_pool
    self._create_worker_process(self._avail_index())
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 1343, in _maintain_pool
    self._repopulate_pool(joined)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py", line 1351, in maintain_pool
    self._maintain_pool()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py", line 130, in _reschedules
    return fun(*args, **kwargs)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py", line 68, in __call__
    return self.fun(*self.args, **self.kwargs)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/hub.py", line 145, in fire_timers
    entry()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/hub.py", line 301, in create_loop
    poll_timeout = fire_timers(propagate=propagate) if scheduled else 1
  <built-in method next of module object at remote 0x7fb3e627ac28>
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py", line 91, in asynloop
    next(loop)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 596, in start
    c.loop(*c.loop_args())
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/consumer/consumer.py", line 318, in start
    blueprint.start(self)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 369, in start
    return self.obj.start()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/worker.py", line 205, in start
    self.blueprint.start(self)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py", line 258, in run
    worker.start()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py", line 252, in __call__
    ret = self.run(*args, **kwargs)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py", line 223, in run_from_argv
    return self(*args, **options)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 420, in execute
    ).run_from_argv(self.prog_name, argv[1:], command=argv[0])
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 488, in handle_argv
    return self.execute(command, argv)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py", line 298, in execute_from_commandline
    return self.handle_argv(self.prog_name, argv[1:])
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 496, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py", line 322, in main
    cmd.execute_from_commandline(argv)
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/__main__.py", line 16, in main
    _main()
  File "/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery", line 8, in <module>
    sys.exit(main())

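For context on where this stack is stuck: the innermost frames are billiard's `Connection._recv`, which first reads a 4-byte length prefix and only then the payload, so a worker waiting in `wait_for_job()` blocks indefinitely inside `os.read` if nothing ever arrives on the pipe. A minimal sketch of that blocking length-prefixed read pattern (helper names are hypothetical; billiard's real implementation is in `billiard/connection.py`):

```python
import os
import struct

def recv_exact(fd, n):
    # Mimics Connection._recv: loop over os.read until exactly n bytes
    # arrive. os.read blocks, so if the peer never writes, this hangs
    # forever -- the state seen in the traceback above.
    buf = b""
    while len(buf) < n:
        chunk = os.read(fd, n - len(buf))
        if not chunk:
            raise EOFError("peer closed the pipe")
        buf += chunk
    return buf

def recv_payload(fd):
    # The get_payload() pattern: a 4-byte big-endian size header,
    # then that many payload bytes. The hang is on these 4 bytes.
    (size,) = struct.unpack("!i", recv_exact(fd, 4))
    return recv_exact(fd, size)

# Demo on a local pipe where the writer does cooperate:
r, w = os.pipe()
payload = b"job"
os.write(w, struct.pack("!i", len(payload)) + payload)
print(recv_payload(r))  # b'job'
```

If the master process consumes the header (or part of it) from the shared descriptor and the child then calls `recv_exact`, the child can wait forever on bytes that will never come, which matches the shared-`read()` deadlock described in this issue.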
bt stack trace of the forked process:

#0  0x00007fb3e5c2e6e0 in __read_nocancel () from /lib64/libpthread.so.0
#1  0x0000000000542a85 in _Py_read (fd=fd@entry=12, buf=0x7fb3cf30b410, count=count@entry=4) at Python/fileutils.c:1407
#2  0x000000000054b1a9 in os_read_impl (module=0x0, length=4, fd=12) at ./Modules/posixmodule.c:8043
#3  os_read (module=module@entry=<module at remote 0x7fb3e61f33b8>, args=args@entry=0x3873df0, nargs=<optimized out>)
    at ./Modules/clinic/posixmodule.c.h:3625
#4  0x000000000043bddd in _PyMethodDef_RawFastCallKeywords (kwnames=0x0, nargs=0, args=0x3873df0,
    self=<module at remote 0x7fb3e61f33b8>, method=0x8c7ec0 <posix_methods+2848>) at Objects/call.c:651
#5  _PyCFunction_FastCallKeywords (func=<built-in method read of module object at remote 0x7fb3e61f33b8>, args=args@entry=0x3873df0,
    nargs=nargs@entry=2, kwnames=kwnames@entry=0x0) at Objects/call.c:730
#6  0x00000000004280bd in call_function (kwnames=0x0, oparg=2, pp_stack=<synthetic pointer>) at Python/ceval.c:4568
#7  _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#8  0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x3873c38, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py, line 422, in _recv (self=<Connection(_handle=12, _readable=True, _writable=False) at remote 0x7fb3ced17fd0>, size=4, read=<built-in method read of module object at remote 0x7fb3e61f33b8>, buf=<_io.BytesIO at remote 0x7fb3cecccba0>, handle=12, remaining=4)) at Python/ceval.c:547
#9  _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf53ff60>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x7fb3cec4f1e8, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3cf54e488, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='_recv',
    qualname='Connection._recv') at Python/ceval.c:3930
#10 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#11 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#12 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#13 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3cec4f048, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py, line 456, in _recv_bytes (self=<Connection(_handle=12, _readable=True, _writable=False) at remote 0x7fb3ced17fd0>, maxsize=None)) at Python/ceval.c:547
#14 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf5420c0>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x7fb3cec87a78, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3cf543108, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
    name='_recv_bytes', qualname='Connection._recv_bytes') at Python/ceval.c:3930
#15 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#16 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#17 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#18 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3cec878e0, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/connection.py, line 243, in recv_bytes (self=<Connection(_handle=12, _readable=True, _writable=False) at remote 0x7fb3ced17fd0>, maxlength=None)) at Python/ceval.c:547
#19 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf53f300>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=1, kwnames=0x0, kwargs=kwargs@entry=0x7fb3cec9e9e8, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3cf538bf8, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
    name='recv_bytes', qualname='_ConnectionBase.recv_bytes') at Python/ceval.c:3930
#20 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#21 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#22 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#23 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#24 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#25 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#26 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#27 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3cec87728, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py, line 445, in _recv (timeout=<float at remote 0x7fb3d01417c8>, loads=<function at remote 0x7fb3cf5d72f0>)) at Python/ceval.c:547
#28 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf598780>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=1, kwnames=0x0, kwargs=kwargs@entry=0x3346fa0, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3ced2c878, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0,
    closure=closure@entry=(<cell at remote 0x7fb3cec534f8>,), name='_recv', qualname='Worker._make_recv_method.<locals>._recv')
    at Python/ceval.c:3930
#29 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#30 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#31 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#32 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x3346df8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py, line 473, in receive (debug=<function at remote 0x7fb3d970b0d0>)) at Python/ceval.c:547
#33 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf598a50>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=0, kwnames=0x0, kwargs=kwargs@entry=0x386d2c0, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3cec222c8, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0,
    closure=closure@entry=(<cell at remote 0x7fb3ceb62708>, <cell at remote 0x7fb3ceb62fd8>), name='receive',
    qualname='Worker._make_protected_receive.<locals>.receive') at Python/ceval.c:3930
#34 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#35 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#36 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#37 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x386d058, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py, line 351, in workloop (debug=<function at remote 0x7fb3d970b0d0>, now=<built-in method monotonic of module object at remote 0x7fb3de980d68>, pid=111793, put=<method at remote 0x7fb3cecf55c8>, inqW_fd=13, synqW_fd=None, maxtasks=1, max_memory_per_child=0, prepare_result=<method at remote 0x7fb3cec86548>, wait_for_job=<function at remote 0x7fb3cec7b268>, wait_for_syn=<function at remote 0x7fb3cec3ff28>, completed=0)) at Python/ceval.c:547
#38 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf5985d0>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=1, kwnames=0x7fb3cf599060, kwargs=kwargs@entry=0x3490068, kwcount=1, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3cf536f90, defcount=defcount@entry=3, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='workloop',
    qualname='Worker.workloop') at Python/ceval.c:3930
#39 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#40 0x0000000000428c25 in call_function (kwnames=('pid',), oparg=<optimized out>, pp_stack=<synthetic pointer>)
    at Python/ceval.c:4616
#41 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3139
#42 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x348feb8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py, line 292, in __call__ (self=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898...(truncated)) at Python/ceval.c:547
#43 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3cf598270>,
    globals=globals@entry={'__name__': 'billiard.pool', '__doc__': None, '__package__': 'billiard', '__loader__': <SourceFileLoader(name='billiard.pool', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py') at remote 0x7fb3cf5916d8>, '__spec__': <ModuleSpec(name='billiard.pool', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/__pycache__/pool.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf591710>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py', '__cached__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVE...(truncated),
    locals=locals@entry=0x0, args=args@entry=0x7ffdbea23e20, argcount=argcount@entry=1, kwnames=kwnames@entry=0x0,
    kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0, kwstep=kwstep@entry=2, defs=0x0, defcount=0, kwdefs=0x0, closure=0x0,
    name='__call__', qualname='Worker.__call__') at Python/ceval.c:3930
#44 0x000000000043b027 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3cf53db70>,
    args=args@entry=0x7ffdbea23e20, nargs=nargs@entry=1, kwargs=kwargs@entry={}) at Objects/call.c:376
#45 0x000000000043e749 in _PyObject_FastCallDict (kwargs={}, nargs=1, args=0x7ffdbea23e20,
    callable=<function at remote 0x7fb3cf53db70>) at Objects/call.c:98
#46 _PyObject_Call_Prepend (callable=callable@entry=<function at remote 0x7fb3cf53db70>,
    obj=obj@entry=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _us...(truncated),
    args=args@entry=(), kwargs=kwargs@entry={}) at Objects/call.c:904
#47 0x0000000000494665 in slot_tp_call (
    self=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _us...(truncated), args=(), kwds={})
    at Objects/typeobject.c:6379
#48 0x000000000043c638 in PyObject_Call (
    callable=callable@entry=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _us...(truncated),
    args=args@entry=(), kwargs=kwargs@entry={}) at Objects/call.c:245
#49 0x0000000000422479 in do_call_core (kwdict={}, callargs=(),
    func=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _us...(truncated)) at Python/ceval.c:4645
#50 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#51 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#52 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#53 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#54 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#55 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#56 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#57 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#58 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#59 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
    at Objects/call.c:283
#60 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#61 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#62 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#63 0x00000000004200a8 in function_code_fastcall (co=co@entry=0x7fb3cf1f5b70, args=<optimized out>, args@entry=0x7ffdbea24610,
    nargs=nargs@entry=2,
    globals=globals@entry={'__name__': 'billiard.popen_fork', '__doc__': None, '__package__': 'billiard', '__loader__': <SourceFileLoader(name='billiard.popen_fork', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/popen_fork.py') at remote 0x7fb3cf201748>, '__spec__': <ModuleSpec(name='billiard.popen_fork', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/popen_fork.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/__pycache__/popen_fork.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf201780>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/popen_fork.py', '__cached__': '/app/aep/.local/share/virtualenvs/a02e6b...(truncated))
    at Objects/call.c:283
#64 0x000000000043b176 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3cf205c80>,
    args=args@entry=0x7ffdbea24610, nargs=nargs@entry=2, kwargs=kwargs@entry=0x0) at Objects/call.c:322
#65 0x000000000043e749 in _PyObject_FastCallDict (kwargs=0x0, nargs=2, args=0x7ffdbea24610,
    callable=<function at remote 0x7fb3cf205c80>) at Objects/call.c:98
#66 _PyObject_Call_Prepend (callable=callable@entry=<function at remote 0x7fb3cf205c80>,
    obj=obj@entry=<Popen(returncode=None, pid=0) at remote 0x7fb3cec22da0>,
    args=args@entry=(<ForkProcess(_identity=(3733,), _config={'authkey': <AuthenticationString at remote 0x7fb3ddedfba8>, 'semprefix': '/mp', 'daemon': True}, _parent_pid=94981, _popen=None, _target=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, ...(truncated),
    kwargs=kwargs@entry=0x0) at Objects/call.c:904
#67 0x000000000048f059 in slot_tp_init (self=<Popen(returncode=None, pid=0) at remote 0x7fb3cec22da0>,
    args=(<ForkProcess(_identity=(3733,), _config={'authkey': <AuthenticationString at remote 0x7fb3ddedfba8>, 'semprefix': '/mp', 'daemon': True}, _parent_pid=94981, _popen=None, _target=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, ...(truncated), kwds=0x0)
    at Objects/typeobject.c:6613
#68 0x000000000048a332 in type_call (type=<optimized out>, type@entry=0x336fdb8,
    args=args@entry=(<ForkProcess(_identity=(3733,), _config={'authkey': <AuthenticationString at remote 0x7fb3ddedfba8>, 'semprefix': '/mp', 'daemon': True}, _parent_pid=94981, _popen=None, _target=<Worker(initializer=<function at remote 0x7fb3cf5752f0>, initargs=(<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, ...(truncated),
    kwds=kwds@entry=0x0) at Objects/typeobject.c:949
#69 0x000000000043b861 in _PyObject_FastCallKeywords (callable=callable@entry=<type at remote 0x336fdb8>, stack=<optimized out>,
    nargs=nargs@entry=1, kwnames=kwnames@entry=0x0) at Objects/call.c:199
#70 0x0000000000424098 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4619
#71 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#72 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#73 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#74 0x0000000000428b80 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#75 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3093
#76 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#77 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#78 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#79 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#80 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
    at Objects/call.c:283
#81 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#82 0x0000000000428b80 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#83 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3093
#84 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3cf1f0570, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/concurrency/asynpool.py, line 445, in _create_worker_process (self=<AsynPool(sched_strategy=4, synack=False, _queues={(<_SimpleQueue(_reader=<Connection(_handle=31, _readable=True, _writable=False) at remote 0x7fb3cecdecf8>, _writer=<Connection(_handle=32, _readable=False, _writable=True) at remote 0x7fb3cecde2b0>, _poll=<method at remote 0x7fb3cf23cec8>, _rlock=None, _wlock=None) at remote 0x7fb3ced47748>, <_SimpleQueue(_reader=<Connection(_handle=33, _readable=True, _writable=False) at remote 0x7fb3cecde2e8>, _writer=<Connection(_handle=34, _readable=False, _writable=True) at remote 0x7fb3cecde828>, _poll=<method at remote 0x7fb3ceb4a188>, _rlock=None, _wlock=None) at remote 0x7fb3cecde160>, None): <ForkProcess(_identity=(3731,), _config={'authkey': <AuthenticationString at remote 0x7fb3ddedfba8>, 'semprefix': '/mp', 'daemon': True}, _parent_pid=94981, _pope...(truncated)) at Python/ceval.c:547
#85 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf2ed8a0>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x7fb3ced34988, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x0, defcount=defcount@entry=0, kwdefs=kwdefs@entry=0x0,
    closure=closure@entry=(<cell at remote 0x7fb3cf2e5eb8>,), name='_create_worker_process',
    qualname='AsynPool._create_worker_process') at Python/ceval.c:3930
#86 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#87 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#88 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#89 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
    at Objects/call.c:283
#90 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#91 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#92 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#93 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#94 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#95 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#96 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#97 0x00000000004200a8 in function_code_fastcall (co=co@entry=0x7fb3cf5299c0, args=<optimized out>, args@entry=0x7ffdbea25400,
    nargs=nargs@entry=1,
    globals=globals@entry={'__name__': 'billiard.pool', '__doc__': None, '__package__': 'billiard', '__loader__': <SourceFileLoader(name='billiard.pool', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py') at remote 0x7fb3cf5916d8>, '__spec__': <ModuleSpec(name='billiard.pool', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/__pycache__/pool.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf591710>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/billiard/pool.py', '__cached__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVE...(truncated))
    at Objects/call.c:283
#98 0x000000000043b176 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3cf2d61e0>,
    args=args@entry=0x7ffdbea25400, nargs=nargs@entry=1, kwargs=kwargs@entry={}) at Objects/call.c:322
#99 0x000000000043e749 in _PyObject_FastCallDict (kwargs={}, nargs=1, args=0x7ffdbea25400,
    callable=<function at remote 0x7fb3cf2d61e0>) at Objects/call.c:98
#100 _PyObject_Call_Prepend (callable=<function at remote 0x7fb3cf2d61e0>, obj=<optimized out>, args=(), kwargs={})
    at Objects/call.c:904
#101 0x000000000043c638 in PyObject_Call (callable=callable@entry=<method at remote 0x7fb3cecd5048>, args=args@entry=(),
    kwargs=kwargs@entry={}) at Objects/call.c:245
#102 0x0000000000422479 in do_call_core (kwdict={}, callargs=(), func=<method at remote 0x7fb3cecd5048>) at Python/ceval.c:4645
#103 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#104 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x347e4f8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py, line 130, in _reschedules (args=(), kwargs={}, last=None, now=<float at remote 0x7fb3ceb52108>, lsince=<float at remote 0x7fb3d01419f0>)) at Python/ceval.c:547
#105 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3cf2f05d0>,
    globals=globals@entry={'__name__': 'kombu.asynchronous.timer', '__doc__': 'Timer scheduling Python callbacks.', '__package__': 'kombu.asynchronous', '__loader__': <SourceFileLoader(name='kombu.asynchronous.timer', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py') at remote 0x7fb3cf2e3c50>, '__spec__': <ModuleSpec(name='kombu.asynchronous.timer', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/__pycache__/timer.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf2e3c88>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/as...(truncated),
    locals=locals@entry=0x0, args=args@entry=0x7fb3e6246060, argcount=argcount@entry=0, kwnames=kwnames@entry=0x0,
    kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0, kwstep=kwstep@entry=2, defs=0x0, defcount=0, kwdefs=0x0,
    closure=(<cell at remote 0x7fb3cec65be8>, <cell at remote 0x7fb3cec30d08>, <cell at remote 0x7fb3cec30d38>, <cell at remote 0x7fb3cec30d98>, <cell at remote 0x7fb3cec30e28>), name='maintain_pool', qualname='Pool.maintain_pool') at Python/ceval.c:3930
#106 0x000000000043b027 in _PyFunction_FastCallDict (func=<function at remote 0x7fb3cec6a620>, args=args@entry=0x7fb3e6246060,
    nargs=0, kwargs=kwargs@entry={}) at Objects/call.c:376
#107 0x000000000043c713 in PyObject_Call (callable=callable@entry=<function at remote 0x7fb3cec6a620>, args=args@entry=(),
    kwargs=kwargs@entry={}) at Objects/call.c:226
#108 0x0000000000422479 in do_call_core (kwdict={}, callargs=(), func=<function at remote 0x7fb3cec6a620>) at Python/ceval.c:4645
#109 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#110 0x00000000004200a8 in function_code_fastcall (co=co@entry=0x7fb3cf2e69c0, args=<optimized out>, args@entry=0x7ffdbea259c0,
    nargs=nargs@entry=1,
    globals=globals@entry={'__name__': 'kombu.asynchronous.timer', '__doc__': 'Timer scheduling Python callbacks.', '__package__': 'kombu.asynchronous', '__loader__': <SourceFileLoader(name='kombu.asynchronous.timer', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py') at remote 0x7fb3cf2e3c50>, '__spec__': <ModuleSpec(name='kombu.asynchronous.timer', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/timer.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/__pycache__/timer.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf2e3c88>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/as...(truncated))
    at Objects/call.c:283
#111 0x000000000043b176 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3cf2ee730>,
    args=args@entry=0x7ffdbea259c0, nargs=nargs@entry=1, kwargs=kwargs@entry=0x0) at Objects/call.c:322
#112 0x000000000043e749 in _PyObject_FastCallDict (kwargs=0x0, nargs=1, args=0x7ffdbea259c0,
    callable=<function at remote 0x7fb3cf2ee730>) at Objects/call.c:98
#113 _PyObject_Call_Prepend (callable=callable@entry=<function at remote 0x7fb3cf2ee730>,
    obj=obj@entry=<Entry at remote 0x7fb3cebb3768>, args=args@entry=(), kwargs=kwargs@entry=0x0) at Objects/call.c:904
#114 0x0000000000494665 in slot_tp_call (self=self@entry=<Entry at remote 0x7fb3cebb3768>, args=args@entry=(), kwds=kwds@entry=0x0)
    at Objects/typeobject.c:6379
#115 0x000000000043b861 in _PyObject_FastCallKeywords (callable=callable@entry=<Entry at remote 0x7fb3cebb3768>,
    stack=<optimized out>, nargs=nargs@entry=0, kwnames=kwnames@entry=0x0) at Objects/call.c:199
#116 0x0000000000424098 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4619
#117 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#118 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x3480618, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/kombu/asynchronous/hub.py, line 145, in fire_timers (self=<Hub(timer=<Timer(max_interval=<float at remote 0x7fb3d0141a08>, on_error=None, _queue=[<scheduled at remote 0x7fb3cebf0d18>, <scheduled at remote 0x7fb3cebf0a48>, <scheduled at remote 0x7fb3cecea3b8>, <scheduled at remote 0x7fb3ceceabd8>, <scheduled at remote 0x7fb3ceb60368>, <scheduled at remote 0x7fb3cec05d18>, <scheduled at remote 0x7fb3ceb606d8>, <scheduled at remote 0x7fb3cec8f7c8>, <scheduled at remote 0x7fb3cec8f688>]) at remote 0x7fb3cf2b04e0>, readers={71: (<method at remote 0x7fb3cec80b48>, (<Connection(_connection_id='ed843206bfeb424292688d00071ac41e', authentication=(<AMQPLAIN(username='aep', password='F8P7uYpFbxyWQSJF4psFc52P6') at remote 0x7fb3ceb98a20>,), client_properties={'product': 'py-amqp', 'product_version': '2.5.1', 'capabilities': {'consumer_cancel_notify': True, 'connection.blocked': True, ...(truncated)) at Python/ceval.c:547
#119 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3cf550780>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=1, kwnames=0x7fb3cf2dfd80, kwargs=kwargs@entry=0x347c960, kwcount=1, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3cf549ab0, defcount=defcount@entry=4, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
    name='fire_timers', qualname='Hub.fire_timers') at Python/ceval.c:3930
#120 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#121 0x0000000000428c25 in call_function (kwnames=('propagate',), oparg=<optimized out>, pp_stack=<synthetic pointer>)
    at Python/ceval.c:4616
#122 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3139
#123 0x00000000005c043d in gen_send_ex (closing=0, exc=0, arg=0x0, gen=0x7fb3cf275de0) at Objects/genobject.c:221
#124 gen_iternext (gen=0x7fb3cf275de0) at Objects/genobject.c:542
#125 0x00000000004e703b in builtin_next (self=self@entry=<module at remote 0x7fb3e627ac28>, args=args@entry=0x347abd0,
    nargs=<optimized out>) at Python/bltinmodule.c:1426
#126 0x000000000043bddd in _PyMethodDef_RawFastCallKeywords (kwnames=0x0, nargs=0, args=0x347abd0,
    self=<module at remote 0x7fb3e627ac28>, method=0x8ba020 <builtin_methods+992>) at Objects/call.c:651
#127 _PyCFunction_FastCallKeywords (func=<built-in method next of module object at remote 0x7fb3e627ac28>,
    args=args@entry=0x347abd0, nargs=nargs@entry=1, kwnames=kwnames@entry=0x0) at Objects/call.c:730
#128 0x00000000004280bd in call_function (kwnames=0x0, oparg=1, pp_stack=<synthetic pointer>) at Python/ceval.c:4568
#129 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#130 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x347a9d8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py, line 91, in asynloop (obj=<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.l...(truncated)) at Python/ceval.c:547
#131 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3cf2c6d20>,
    globals=globals@entry={'__name__': 'celery.worker.loops', '__doc__': 'The consumers highly-optimized inner loop.', '__package__': 'celery.worker', '__loader__': <SourceFileLoader(name='celery.worker.loops', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py') at remote 0x7fb3cf2cd3c8>, '__spec__': <ModuleSpec(name='celery.worker.loops', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/__pycache__/loops.cpython-37.pyc', _initializing=False) at remote 0x7fb3cf2cd400>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/worker/loops.py', '__cached_...(truncated),
    locals=locals@entry=0x0, args=args@entry=0x7fb3cebfba38, argcount=argcount@entry=9, kwnames=kwnames@entry=0x0,
    kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0, kwstep=kwstep@entry=2, defs=0x7fb3cf2cd5d8, defcount=1, kwdefs=0x0,
    closure=0x0, name='asynloop', qualname='asynloop') at Python/ceval.c:3930
#132 0x000000000043b027 in _PyFunction_FastCallDict (func=<function at remote 0x7fb3cf2b1c80>, args=args@entry=0x7fb3cebfba38,
    nargs=9, kwargs=kwargs@entry=0x0) at Objects/call.c:376
#133 0x000000000043c713 in PyObject_Call (callable=callable@entry=<function at remote 0x7fb3cf2b1c80>,
    args=args@entry=(<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker...(truncated),
    kwargs=kwargs@entry=0x0) at Objects/call.c:226
#134 0x0000000000422479 in do_call_core (kwdict=0x0,
    callargs=(<Consumer(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker...(truncated),
    func=<function at remote 0x7fb3cf2b1c80>) at Python/ceval.c:4645
#135 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#136 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
    at Objects/call.c:283
#137 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#138 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#139 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#140 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
    at Objects/call.c:283
#141 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#142 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#143 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#144 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#145 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#146 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#147 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#148 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
    at Objects/call.c:283
#149 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#150 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#151 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#152 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=2, globals=<optimized out>)
    at Objects/call.c:283
#153 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#154 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#155 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#156 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#157 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#158 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#159 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#160 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x2c6b3a8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py, line 258, in run (self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at ...(truncated)) at Python/ceval.c:547
#161 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3d89706f0>,
    globals=globals@entry={'__name__': 'celery.bin.worker', '__doc__': "Program used to start a Celery worker instance.\n\nThe :program:`celery worker` command (previously known as ``celeryd``)\n\n.. program:: celery worker\n\n.. seealso::\n\n    See :ref:`preload-options`.\n\n.. cmdoption:: -c, --concurrency\n\n    Number of child processes processing the queue.  The default\n    is the number of CPUs available on your system.\n\n.. cmdoption:: -P, --pool\n\n    Pool implementation:\n\n    prefork (default), eventlet, gevent or solo.\n\n.. cmdoption:: -n, --hostname\n\n    Set custom hostname (e.g., 'w1@%%h').  Expands: %%h (hostname),\n    %%n (name) and %%d, (domain).\n\n.. cmdoption:: -B, --beat\n\n    Also run the `celery beat` periodic task scheduler.  Please note that\n    there must only be one instance of this service.\n\n    .. note::\n\n        ``-B`` is meant to be used for development purposes. For production\n        environment, you need to start :program:`celery beat` separately.\n\n.. cmdoption:: -Q, --queues\n\n    L...(truncated),
    locals=locals@entry=0x0, args=args@entry=0x7ffdbea26e60, argcount=argcount@entry=1, kwnames=kwnames@entry=0x2c6b830,
    kwargs=kwargs@entry=0x2c6b838, kwcount=78, kwcount@entry=39, kwstep=kwstep@entry=2, defs=0x7fb3d89cf8d0, defcount=9, kwdefs=0x0,
    closure=0x0, name='run', qualname='worker.run') at Python/ceval.c:3930
#162 0x000000000043b027 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3d8979620>,
    args=args@entry=0x7ffdbea26e60, nargs=nargs@entry=1,
    kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:376
#163 0x000000000043e749 in _PyObject_FastCallDict (
    kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, nargs=1, args=0x7ffdbea26e60, callable=<function at remote 0x7fb3d8979620>) at Objects/call.c:98
#164 _PyObject_Call_Prepend (callable=<function at remote 0x7fb3d8979620>, obj=<optimized out>, args=(),
    kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None})
    at Objects/call.c:904
#165 0x000000000043c638 in PyObject_Call (callable=callable@entry=<method at remote 0x7fb3e6238d08>, args=args@entry=(),
    kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:245
#166 0x0000000000422479 in do_call_core (
    kwdict={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, callargs=(), func=<method at remote 0x7fb3e6238d08>) at Python/ceval.c:4645
#167 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#168 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3d5414a48, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py, line 252, in __call__ (self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread....(truncated)) at Python/ceval.c:547
#169 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3e6172300>,
    globals=globals@entry={'__name__': 'celery.bin.base', '__doc__': 'Base command-line interface.', '__package__': 'celery.bin', '__loader__': <SourceFileLoader(name='celery.bin.base', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py') at remote 0x7fb3e6154748>, '__spec__': <ModuleSpec(name='celery.bin.base', loader=<...>, origin='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py', loader_state=None, submodule_search_locations=None, _set_fileattr=True, _cached='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/__pycache__/base.cpython-37.pyc', _initializing=False) at remote 0x7fb3e6154780>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py', '__cached__': '/app/aep/.local/share/virtualenvs/a02e6b...(truncated),
    locals=locals@entry=0x0, args=args@entry=0x7ffdbea271d0, argcount=argcount@entry=1, kwnames=kwnames@entry=0x2c69670,
    kwargs=kwargs@entry=0x2c69678, kwcount=78, kwcount@entry=39, kwstep=kwstep@entry=2, defs=0x0, defcount=0, kwdefs=0x0,
    closure=0x0, name='__call__', qualname='Command.__call__') at Python/ceval.c:3930
#170 0x000000000043b027 in _PyFunction_FastCallDict (func=func@entry=<function at remote 0x7fb3d89a1a60>,
    args=args@entry=0x7ffdbea271d0, nargs=nargs@entry=1,
    kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:376
#171 0x000000000043e749 in _PyObject_FastCallDict (
    kwargs={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, nargs=1, args=0x7ffdbea271d0, callable=<function at remote 0x7fb3d89a1a60>) at Objects/call.c:98
#172 _PyObject_Call_Prepend (callable=callable@entry=<function at remote 0x7fb3d89a1a60>,
    obj=obj@entry=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated),
    args=args@entry=(),
    kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:904
#173 0x0000000000494665 in slot_tp_call (
    self=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated), args=(),
    kwds={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None})
    at Objects/typeobject.c:6379
#174 0x000000000043c638 in PyObject_Call (
    callable=callable@entry=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated),
    args=args@entry=(),
    kwargs=kwargs@entry={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}) at Objects/call.c:245
#175 0x0000000000422479 in do_call_core (
    kwdict={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child': 1, 'max_memory_per_child': None, 'purge': False, 'queues': 'bgl', 'exclude_queues': [], 'include': [], 'without_gossip': False, 'without_mingle': False, 'without_heartbeat': False, 'heartbeat_interval': None, 'autoscale': '10,3', 'logfile': '/var/log/ae-worker/ae-worker-3%I.log', 'pidfile': '/run/celery/ae-worker-3.pid', 'uid': None, 'gid': None, 'umask': None, 'executable': None, 'beat': False, 'schedule_filename': 'celerybeat-schedule', 'scheduler': None}, callargs=(),
    func=<worker(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_thread.lock at remote 0x7fb3d85114e0>, _pending=<collections.deque at remote 0x7fb3d85766c8>, _tasks=<TaskRegistry at remote 0x7fb3d855ab48>, _using_v1_reduce=None, _preconf={'include': ['ae_worker.ta...(truncated)) at Python/ceval.c:4645
#176 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3191
#177 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3d7a13808, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/worker.py, line 223, in run_from_argv (prog_name='celery', argv=['-n', 'ae-worker-3.%h', '-A', 'ae_worker.app', '--pidfile=/run/celery/ae-worker-3.pid', '--autoscale=10,3', '--max-tasks-per-child', '1', '-Q', 'bgl', '--loglevel=INFO', '--logfile=/var/log/ae-worker/ae-worker-3%I.log', '--soft-time-limit=7200', '--time-limit=7300', '-Ofair', '--max-tasks-per-child', '1', '-E'], command='worker', options={'app': 'ae_worker.app', 'broker': None, 'result_backend': None, 'loader': None, 'config': None, 'workdir': None, 'no_color': None, 'quiet': False, 'hostname': 'ae-worker-3.%h', 'detach': False, 'statedb': None, 'loglevel': 'INFO', 'optimization': 'fair', 'prefetch_multiplier': 1, 'concurrency': 0, 'pool': 'prefork', 'task_events': True, 'time_limit': <float at remote 0x7fb3d7a32078>, 'soft_time_limit': <float at remote 0x7fb3d9031d38>, 'max_tasks_per_child':...(truncated)) at Python/ceval.c:547
#178 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8970300>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=3, kwnames=0x7fb3d8a12f78, kwargs=kwargs@entry=0x7fb3d54143f0, kwcount=1, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3d8973660, defcount=defcount@entry=2, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
    name='run_from_argv', qualname='worker.run_from_argv') at Python/ceval.c:3930
#179 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#180 0x0000000000428c25 in call_function (kwnames=('command',), oparg=<optimized out>, pp_stack=<synthetic pointer>)
    at Python/ceval.c:4616
#181 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3139
#182 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3d5414248, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 420, in execute (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<...(truncated)) at Python/ceval.c:547
#183 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d89f7540>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=3, kwnames=0x0, kwargs=kwargs@entry=0x2957570, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3d89a54f8, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='execute',
    qualname='CeleryCommand.execute') at Python/ceval.c:3930
#184 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#185 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#186 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#187 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x29573b8, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 488, in handle_argv (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _finalize_mutex=<_...(truncated)) at Python/ceval.c:547
#188 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8a045d0>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=3, kwnames=0x0, kwargs=kwargs@entry=0x7fb3d897c1f0, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x0, defcount=defcount@entry=0, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0, name='handle_argv',
    qualname='CeleryCommand.handle_argv') at Python/ceval.c:3930
#189 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#190 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#191 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#192 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3d897c048, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/base.py, line 298, in execute_from_commandline (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, _f...(truncated)) at Python/ceval.c:547
#193 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3e6172420>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x7fb3e60f0f18, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3e6174bc0, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
    name='execute_from_commandline', qualname='Command.execute_from_commandline') at Python/ceval.c:3930
#194 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#195 0x0000000000428b80 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#196 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3093
#197 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x7fb3e60f0d68, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/lib/python3.7/site-packages/celery/bin/celery.py, line 496, in execute_from_commandline (self=<CeleryCommand(app=<Celery(clock=<LamportClock(value=33276810, mutex=<DummyLock at remote 0x7fb3cf2b0358>) at remote 0x7fb3d85155c0>, main='ae_worker', amqp_cls='celery.app.amqp:AMQP', events_cls='celery.app.events:Events', loader_cls='celery.loaders.app:AppLoader', log_cls='celery.app.log:Logging', control_cls='celery.app.control:Control', task_cls='celery.app.task:Task', set_as_current=True, registry_cls=<type at remote 0x2918d18>, user_options={'preload': set(), 'worker': set()}, steps={'worker': set(), 'consumer': set()}, autofinalize=True, namespace=None, strict_typing=True, configured=True, _config_source='ae_worker.config.production.CeleryConfig', _pending_defaults=<collections.deque at remote 0x7fb3d898a730>, _pending_periodic_tasks=<collections.deque at remote 0x7fb3d898a3f0>, finalized=True, ...(truncated)) at Python/ceval.c:547
#198 _PyEval_EvalCodeWithName (_co=<code at remote 0x7fb3d8a06780>, globals=<optimized out>, locals=locals@entry=0x0,
    args=<optimized out>, argcount=2, kwnames=0x0, kwargs=kwargs@entry=0x274a150, kwcount=0, kwstep=kwstep@entry=1,
    defs=defs@entry=0x7fb3d89a55a0, defcount=defcount@entry=1, kwdefs=kwdefs@entry=0x0,
    closure=closure@entry=(<cell at remote 0x7fb3d895e5e8>,), name='execute_from_commandline',
    qualname='CeleryCommand.execute_from_commandline') at Python/ceval.c:3930
#199 0x000000000043b238 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:433
#200 0x00000000004280f2 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#201 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3110
#202 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=1, globals=<optimized out>)
    at Objects/call.c:283
#203 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#204 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#205 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#206 0x00000000004200a8 in function_code_fastcall (co=<optimized out>, args=<optimized out>, nargs=0, globals=<optimized out>)
    at Objects/call.c:283
#207 0x000000000043b306 in _PyFunction_FastCallKeywords (func=<optimized out>, stack=<optimized out>, nargs=<optimized out>,
    kwnames=<optimized out>) at Objects/call.c:415
#208 0x0000000000427d48 in call_function (kwnames=0x0, oparg=<optimized out>, pp_stack=<synthetic pointer>) at Python/ceval.c:4616
#209 _PyEval_EvalFrameDefault (f=<optimized out>, throwflag=<optimized out>) at Python/ceval.c:3124
#210 0x00000000004eba97 in PyEval_EvalFrameEx (throwflag=0,
    f=Frame 0x26c8708, for file /app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery, line 8, in <module> ()) at Python/ceval.c:547
#211 _PyEval_EvalCodeWithName (_co=_co@entry=<code at remote 0x7fb3de9fb300>,
    globals=globals@entry=<unknown at remote 0x7fb3de9e4648>,
    locals=locals@entry='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery',
    args=args@entry=0x0, argcount=argcount@entry=0, kwnames=kwnames@entry=0x0, kwargs=kwargs@entry=0x0, kwcount=kwcount@entry=0,
    kwstep=kwstep@entry=2, defs=defs@entry=0x0, defcount=defcount@entry=0, kwdefs=kwdefs@entry=0x0, closure=closure@entry=0x0,
    name=name@entry=0x0, qualname=qualname@entry=0x0) at Python/ceval.c:3930
#212 0x00000000004ebbc0 in PyEval_EvalCodeEx (closure=0x0, kwdefs=0x0, defcount=0, defs=0x0, kwcount=0, kws=0x0, argcount=0,
    args=0x0,
    locals=locals@entry='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery',
    globals=globals@entry=<unknown at remote 0x7fb3de9e4648>, _co=_co@entry=<code at remote 0x7fb3de9fb300>) at Python/ceval.c:3959
#213 PyEval_EvalCode (co=co@entry=<code at remote 0x7fb3de9fb300>,
    globals=globals@entry={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
    locals=locals@entry={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>}) at Python/ceval.c:524
#214 0x00000000005265f8 in run_mod (arena=0x7fb3e6282078, flags=0x7ffdbea285f0,
    locals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
    globals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
    filename='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', mod=0x26c1cc8)
    at Python/pythonrun.c:1035
#215 PyRun_FileExFlags (fp=0x26f05c0, filename_str=<optimized out>, start=<optimized out>,
    globals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>},
    locals={'__name__': '__main__', '__doc__': None, '__package__': None, '__loader__': <SourceFileLoader(name='__main__', path='/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery') at remote 0x7fb3de9f2710>, '__spec__': None, '__annotations__': {}, '__builtins__': <module at remote 0x7fb3e627ac28>, '__file__': '/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery', '__cached__': None, 're': <module at remote 0x7fb3de9ac728>, 'sys': <module at remote 0x7fb3e6274f98>, 'main': <function at remote 0x7fb3de9b9ea0>}, closeit=1, flags=0x7ffdbea285f0) at Python/pythonrun.c:988
#216 0x00000000005267dd in PyRun_SimpleFileExFlags (fp=0x26f05c0, filename=<optimized out>, closeit=1, flags=0x7ffdbea285f0)
    at Python/pythonrun.c:429
#217 0x000000000042fdfd in pymain_run_file (p_cf=0x7ffdbea285f0,
    filename=0x2655d10 L"/app/aep/.local/share/virtualenvs/a02e6b6588a005e3a7627e805171154cf77d15b0-n-emIVEOn1/bin/celery",
    fp=0x26f05c0) at Modules/main.c:427
#218 pymain_run_filename (cf=0x7ffdbea285f0, pymain=0x7ffdbea286d0) at Modules/main.c:1627
#219 pymain_run_python (pymain=0x7ffdbea286d0) at Modules/main.c:2877
#220 pymain_main (pymain=pymain@entry=0x7ffdbea286d0) at Modules/main.c:3038
#221 0x00000000004300cd in _Py_UnixMain (argc=<optimized out>, argv=<optimized out>) at Modules/main.c:3073
#222 0x00007fb3e516c495 in __libc_start_main () from /lib64/libc.so.6
#223 0x0000000000429df4 in _start ()

Dependencies:

celery==4.3.0
kombu==4.6.5
billiard==3.6.1.0
amqp==2.5.1
python==3.7.3

can you try celery==4.4.0rc4?

It will take me a week to vet it in staging, but I'll try to get some load running against it in production.

@wakemaster39: did this help?

Also curious whether this seems to have fixed it, @wakemaster39:

celery==4.4.0rc5?

I was pulled onto different problems for the last two weeks, and the site we are having issues with goes offline over the holidays, so I don't want to make any changes there. I'm vetting it in staging over the next week and a half, hoping to deploy it for the new-year push and get some load against it.

Do we know whether this issue is a problem with celery workers specifically, and not with other parts of celery such as beat? That is, if I switch to something like go-celery but keep using celery.beat, does that theoretically fix my problem? Or have people seen beat stall for whatever reason too?

In my case it isn't even beat that dies; its celery threads hang. I fixed it for myself by adding a timeout, but every now and then the thread still stalls. It happened a lot more when I had some non-atomic database transactions; fixing those cured some of my headaches.

tl;dr: it's not beat, but the worker that "hangs randomly in IPC".

Seen similar things running celery==4.4.0. Beat works fine, but celery gets stuck for some reason. Possibly a deadlock, but very hard to tell.

I had a similar experience to @vincent31337. I believe this is avoided by not using the celery multi start command.

I'm facing the same problem. Python 2.7, Celery 4.1.0. I use the SQS broker. Celery processes tasks for some time and then abruptly hangs.

you should try the latest versions.

It's fixed after upgrading.

I'd like to know whether this is still the case with celery >= 4.4.x

I'm dealing with this same problem:

python 3.8
Django 3.0.3
celery 4.4.0
redis 3.4.1 (for broker and results)

I run five workers. I can run five tasks that complete, then the worker hangs on read([different pipe value for each]. After that, tasks are accepted and show up in Inspect(app=app).active(), but their worker_pid is not one of the workers' pids. Often the pids have low values, like 50. Using ps -p [low pid], I don't see those pids actually running. I haven't seen that mentioned in this thread yet.
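
A quick way to cross-check whether the worker_pid values reported by Inspect(app=app).active() correspond to live processes is to probe them with signal 0. This is a stdlib-only sketch; the `reported_pids` list is a hypothetical stand-in for the pids pulled from the inspect output, and 999999999 is used as a pid that cannot exist:

```python
import os

def pid_alive(pid: int) -> bool:
    """Return True if `pid` refers to a live process (signal 0 sends nothing)."""
    try:
        os.kill(pid, 0)
    except ProcessLookupError:
        return False
    except PermissionError:
        return True  # process exists, but is owned by another user
    return True

# Hypothetical stand-in for the worker_pid values from the inspect output.
reported_pids = [os.getpid(), 999999999]
stale = [p for p in reported_pids if not pid_alive(p)]
print(stale)  # → [999999999]
```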

even with celery==4.4.2, right?

We are facing exactly the same problem, with celery==4.3 and also when trying the latest celery==4.4.2.
We are using rabbitmq as the backend (broker and results).

kombu==4.6.8
celery==4.4.2
amqp==2.5.2
billiard==3.6.3.0
librabbitmq==2.0.0
Python 3.7.4

celery gets stuck on the read and cannot process any other tasks.

$ sudo strace -p 8553
strace: Process 8553 attached
read(10, ^Cstrace: Process 8553 detached
 <detached ...>

$ sudo strace -p 8557
strace: Process 8557 attached
read(14, ^Cstrace: Process 8557 detached
 <detached ...>

$ sudo strace -p 8561
strace: Process 8561 attached
read(18, ^Cstrace: Process 8561 detached
 <detached ...>

As an example, here is our lsof entry for process 8553:
celery 8553 ubuntu 10r FIFO 0,10 0t0 43059 pipe

FWIW, it looks like we've hit the same problem with a raw use of billiard==3.6.3.0 (independently of celery):
parent:

(gdb) py-bt                                                                                
Traceback (most recent call first):             
  <built-in method acquire of _thread.lock object at remote 0x7f838895e620>
  File "/usr/lib/python3.6/threading.py", line 295, in wait                           
    waiter.acquire()
  File "/usr/local/lib/python3.6/dist-packages/billiard/pool.py", line 1957, in next
    self._cond.wait(timeout)
  File "xxx.py", line 1063, in main
    for res in self.pool.imap_unordered(_partial, itertools.islice(a_generator, nb_processes)):
...

stuck child process:

0x00007f8393cc11c4 in __GI___libc_read (fd=21, buf=0x7f838892c578, nbytes=4) at ../sysdeps/unix/sysv/linux/read.c:27
Traceback (most recent call first):
  <built-in method read of module object at remote 0x7f8392d3de58>
  File "/usr/local/lib/python3.6/dist-packages/billiard/connection.py", line 422, in _recv
    chunk = read(handle, remaining)
  File "/usr/local/lib/python3.6/dist-packages/billiard/connection.py", line 456, in _recv_bytes
    buf = self._recv(4)
  File "/usr/local/lib/python3.6/dist-packages/billiard/connection.py", line 243, in recv_bytes
    buf = self._recv_bytes(maxlength)
  File "/usr/local/lib/python3.6/dist-packages/billiard/queues.py", line 395, in get_payload
    return self._reader.recv_bytes()
  File "/usr/local/lib/python3.6/dist-packages/billiard/pool.py", line 445, in _recv
    return True, loads(get_payload())
...
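
The child's stack above is blocked in `_recv(4)`, i.e. reading the 4-byte length header of a framed message. A minimal stdlib sketch of that style of length-prefixed pipe protocol (illustrative names, not billiard's actual code) shows why such a read can block forever when no writer ever delivers the header:

```python
import os
import struct

def send_framed(wfd: int, payload: bytes) -> None:
    # Length-prefixed framing: a 4-byte big-endian size header, then the body.
    os.write(wfd, struct.pack("!i", len(payload)) + payload)

def recv_framed(rfd: int) -> bytes:
    # Blocks until the 4 header bytes arrive. If the peer never writes, or
    # dies after the message was routed to it, this read blocks forever:
    # the same state the stuck workers above are strace'd in.
    header = os.read(rfd, 4)
    (size,) = struct.unpack("!i", header)
    # A real implementation loops until `size` bytes have been read.
    return os.read(rfd, size)

rfd, wfd = os.pipe()
send_framed(wfd, b"task-result")
print(recv_framed(rfd))  # → b'task-result'
```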

This is the same stack as the py-bt stack trace of the forked process in https://github.com/celery/celery/issues/4185#issuecomment-554828564

all the other child processes are in a semlock:

Traceback (most recent call first):
  <built-in method __enter__ of _multiprocessing.SemLock object at remote 0x7f838896ff10>
  File "/usr/local/lib/python3.6/dist-packages/billiard/synchronize.py", line 116, in __enter__
    return self._semlock.__enter__()
  File "/usr/local/lib/python3.6/dist-packages/billiard/queues.py", line 394, in get_payload
    with self._rlock:
  File "/usr/local/lib/python3.6/dist-packages/billiard/pool.py", line 445, in _recv
    return True, loads(get_payload())
...

I observed the problem of workers going stale on several OpenWISP instances.

celery==4.4.7
kombu==4.6.11
billiard==3.6.3.0

Command:

celery -A openwisp2 worker -l info --pool=gevent --concurrency=10 -Ofair

The only difference I can see between a working worker and a non-working one is that the worker that is not processing any tasks shows this in the strace output:

select(22, [21], [], [21], {tv_sec=0, tv_usec=0}) = 0 (Timeout)

Even if the worker is processing tasks, I see these two kinds of lines in the strace output:

epoll_ctl(4, EPOLL_CTL_ADD, 20, {EPOLLIN, {u32=20, u64=12884901908}}) = -1 EEXIST (File exists)
recvfrom(20, 0x377a460, 65536, 0, NULL, NULL) = -1 EAGAIN (Resource temporarily unavailable)

So I don't think these last two log lines indicate the underlying problem, or at least they don't seem to be symptoms of it.

# strace -p 81
strace: Process 81 attached
read(24, ^Cstrace: Process 81 detached
 <detached ...>
# lsof -p 81 | grep 24
celery   81 root  mem       REG              252,1             535124 /usr/local/lib/python3.6/lib-dynload/_ctypes.cpython-36m-x86_64-linux-gnu.so (path dev=0,112)
celery   81 root   24r     FIFO               0,13      0t0 311875006 pipe






python version: 3.6
celery==4.4.7
kombu==4.6.11
billiard==3.6.3.0

Hi,

I've been dealing with the same problem here:

celery inspect report:

software -> celery:4.3.0 (rhubarb) kombu:4.6.4 py:3.6.6
                    billiard:3.6.1.0 py-amqp:2.5.1
        platform -> system:Linux arch:64bit
                    kernel version:3.10.0-1062.18.1.el7.x86_64 imp:CPython
        loader   -> celery.loaders.app.AppLoader
        settings -> transport:amqp results:amqp

        broker_url: 'amqp://nextgen:********@*:5672/ng_vhost'
        result_backend: 'amqp'
        task_queues: 
            (<unbound Queue default -> <unbound Exchange default(direct)> -> default>,
         <unbound Queue runForecasts -> <unbound Exchange runForecasts(direct)> -> runForecasts>,
         <unbound Queue runEnsembles -> <unbound Exchange runEnsembles(direct)> -> runEnsembles>)
        task_default_queue: 'default'
        task_default_exchange: 'default'
        task_default_routing_key: '********'
        broker_heartbeat: None
        task_acks_late: True
        worker_prefetch_multiplier: 1

CELERYD_OPTS:
--time-limit=3600 -c:worker1 3 -Q:worker1 runForecasts -c:worker2 2 -Q:worker2 runEnsembles -c:worker3 2 -Q:worker3 default

Only the runForecasts queue seems to get stuck, and here is the strace output for the child processes:

$ sudo strace -p 17426
strace: Process 17426 attached
read(28, ^Cstrace: Process 17426 detached
 <detached ...>
$ sudo ls -l /proc/17426/fd/28
lr-x------. 1 wind wind 64 Nov 10 22:45 /proc/17426/fd/28 -> pipe:[284399927]

I found the pipe it is stuck on, but I don't know how to debug further. I'd appreciate it if someone could shed some light on this.
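
One way to dig further, sketched below under the assumption of a Linux /proc filesystem: resolve the stuck fd to its pipe inode, then scan every process's fd table for the other holders of that inode. Whichever process holds the write end is the peer the hung read() is waiting on. The helper names are illustrative:

```python
import os

def pipe_inode(pid: int, fd: int) -> str:
    # For pipes, /proc/<pid>/fd/<fd> is a symlink such as 'pipe:[284399927]'.
    return os.readlink(f"/proc/{pid}/fd/{fd}")

def pids_holding(target: str):
    # Scan every process's fd table for the same pipe inode.
    holders = []
    for entry in os.listdir("/proc"):
        if not entry.isdigit():
            continue
        fd_dir = f"/proc/{entry}/fd"
        try:
            fds = os.listdir(fd_dir)
        except (PermissionError, FileNotFoundError):
            continue
        for fd in fds:
            try:
                if os.readlink(f"{fd_dir}/{fd}") == target:
                    holders.append((int(entry), int(fd)))
            except (FileNotFoundError, PermissionError):
                continue
    return holders

# Demo on a pipe we own; in the field, use the hung worker's pid and fd.
r, w = os.pipe()
target = pipe_inode(os.getpid(), r)
print(pids_holding(target))  # contains (our pid, r) and (our pid, w)
```

Running the scan as root lets it see fd tables of other users' processes, which is usually needed for a production celery deployment.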

# strace -p 32
strace: Process 32 attached
epoll_wait(7, [], 1023, 787)            = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
wait4(631, 0x7ffe2bd33c3c, WNOHANG, NULL) = 0
epoll_wait(7, [], 1023, 4050)           = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
epoll_wait(7, [{EPOLLIN, {u32=15, u64=139972984176655}}], 1023, 275) = 1
ioctl(15, FIONBIO, [1])                 = 0
recvfrom(15, "\10\0\0\0\0\0\0", 7, 0, NULL, NULL) = 7
recvfrom(15, "\316", 1, 0, NULL, NULL)  = 1
ioctl(15, FIONBIO, [0])                 = 0
ioctl(15, FIONBIO, [1])                 = 0
recvfrom(15, 0x7f4e96911aa0, 7, 0, NULL, NULL) = -1 EAGAIN (Resource temporarily unavailable)
ioctl(15, FIONBIO, [0])                 = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
epoll_wait(7, [], 1023, 19)             = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
epoll_wait(7, [], 1023, 670)            = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
wait4(631, 0x7ffe2bd33c3c, WNOHANG, NULL) = 0
epoll_wait(7, [], 1023, 5000)           = 0
epoll_ctl(7, EPOLL_CTL_DEL, 43, 0x7ffe2bd34534) = -1 ENOENT (No such file or directory)
wait4(631, 0x7ffe2bd33c3c, WNOHANG, NULL) = 0
epoll_wait(7, [], 1023, 5000)           = 0
lrwx------. 1 root root 64 Dec  3 08:21 7 -> 'anon_inode:[eventpoll]'

celery==4.4.7

worker arguments: --autoscale and --max-tasks-per-child

I'll try this.

I use --autoscale 20,1 and --max-tasks-per-child 1

task:

@shared_task
def echo():
    print(1)

# wait for the processes to scale down, then run the next batch
for i in range(100):
    echo.delay()

# one large batch of calls
for i in range(10000):
    echo.delay()

I got another error:

[2020-12-03 18:03:42,137: CRITICAL/MainProcess] Unrecoverable error: AssertionError()
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/celery/worker/worker.py", line 208, in start
    self.blueprint.start(self)
  File "/usr/local/lib/python3.6/site-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/usr/local/lib/python3.6/site-packages/celery/bootsteps.py", line 369, in start
    return self.obj.start()
  File "/usr/local/lib/python3.6/site-packages/celery/worker/consumer/consumer.py", line 318, in start
    blueprint.start(self)
  File "/usr/local/lib/python3.6/site-packages/celery/bootsteps.py", line 119, in start
    step.start(parent)
  File "/usr/local/lib/python3.6/site-packages/celery/worker/consumer/consumer.py", line 599, in start
    c.loop(*c.loop_args())
  File "/usr/local/lib/python3.6/site-packages/celery/worker/loops.py", line 83, in asynloop
    next(loop)
  File "/usr/local/lib/python3.6/site-packages/kombu/asynchronous/hub.py", line 303, in create_loop
    poll_timeout = fire_timers(propagate=propagate) if scheduled else 1
  File "/usr/local/lib/python3.6/site-packages/kombu/asynchronous/hub.py", line 145, in fire_timers
    entry()
  File "/usr/local/lib/python3.6/site-packages/kombu/asynchronous/timer.py", line 68, in __call__
    return self.fun(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.6/site-packages/celery/concurrency/asynpool.py", line 636, in verify_process_alive
    assert proc.outqR_fd in hub.readers
AssertionError

From https://github.com/celery/celery/issues/4185#issuecomment-694179347:
I checked billiard/pool.py line 1957 and found:

    self._cond = threading.Condition(threading.Lock())
    ......
    self._lost_worker_timeout = lost_worker_timeout
    ......
    def next(self, timeout=None):
        with self._cond:
                ......
                self._cond.wait(timeout)
                ......

self._cond.wait is called without a timeout. Is it getting stuck here?
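
The concern above can be demonstrated with plain threading (a stdlib sketch, not billiard's code): wait() with no timeout blocks until a notify arrives, so a missed notify hangs the caller forever, whereas a timeout lets it wake up and re-check its condition:

```python
import threading
import time

cond = threading.Condition()

# Nothing in this example ever calls cond.notify().
with cond:
    start = time.monotonic()
    # With a timeout, wait() returns False when the timeout expires, so the
    # caller can re-check state instead of sleeping forever on a missed notify.
    notified = cond.wait(timeout=0.2)

print(notified)  # → False
# With cond.wait() and no timeout, the line above would never be reached.
```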

For those facing this, can you try https://github.com/celery/billiard/pull/325?

https://github.com/celery/billiard/pull/325 may not work.

And I give up. I'm now using --pool threads.
