How to keep spawning tasks until one of them satisfies a condition
Hi community! I need some help understanding the logic of Celery.
I want to create a kind of batch processor that accepts a file, imports it, and then spawns multiple processing workers that report processing progress. Any of those workers may return a "job is done" result, after which a final task should be launched (to export the result).
As I understand it, the pipeline should begin with a chain of a process initializer and an importer. But after that, it seems there should be some kind of spawner that controls the results of the spawned tasks... or however this is supposed to look in the Celery paradigm.
I am planning to trigger the process via an API:
routes.py
@routes.route('/upload', methods=['POST'])
def upload():
    ...
    chain(
        c_app.signature('workers.init', args=[path_to_write]),
        c_app.signature('workers.imp'),
        # task_spawner or what?
    ).apply_async()
    ...
workers.py
app = Celery(
    'workers',
    broker=f'redis://{brocker_host}:6379/0',
    backend=f'redis://{brocker_host}:6379/0',
)

@app.task()
def init(related_file_path: str):
    return InitialWorker(related_file_path).do()

@app.task()
def imp(job_id: int):
    Importer(job_id).do()
    return job_id

@app.task()
def spawn_processors(job_id: int):
    # what here?
    res = Processor(job_id).do()
    return res
Thanks.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow