Flask on Azure App Service: Run a Celery worker in the background for a task queue

I am running a Flask application on Azure App Service and want to incorporate a task queue for long-running actions. My setup includes the App Service application and an Azure Cache for Redis instance that I already use for several purposes (caching, sessions, etc.), so I know the Redis instance works and is accessible from the application.

My task queue uses Celery with Redis as both the broker and the result backend. The basic flow is as follows:

A JS function sends an AJAX request to a Flask endpoint, which creates a task for the long-running action and returns a task ID. In the AJAX call's success callback, I poll the server with the task ID to get the status and, eventually, the result. All of this works fine locally (Windows with eventlet).

Problem:
I am unable to get a Celery worker running on the server.

My startup command is in an azure-startup-command.txt file in the application root, which I have set as the startup command using az webapp config set --startup-file "azure-startup-command.txt". I found an example of running Django on App Service, based on which I have tried the following variations (changing the order and adding the --detach option) in this startup file:

  1. celery -A application.celery worker -l DEBUG & gunicorn -t 600 -w 5 -b :$PORT application:application
  2. celery -A application.celery worker -l DEBUG --detach & gunicorn -t 600 -w 5 -b :$PORT application:application
  3. gunicorn -t 600 -w 5 -b :$PORT application:application & celery -A application.celery worker -l DEBUG --detach
  4. gunicorn -t 600 -w 5 -b :$PORT application:application & celery -A application.celery worker -l DEBUG

In all cases, the application runs and is accessible, but the task never starts. I get a task ID from the task request, but when polling with that task ID, I keep getting a PENDING status. (Celery reports PENDING for any task ID it does not recognize, so this is the same response I get locally if I don't run the Celery worker at all.)

Questions:

  1. What is the right way to run the Celery worker in the background on the server using the startup command/file method? I couldn't find any documentation for this particular setup (Flask + Celery on Azure App Service).
  2. If this can't be done with the startup file method, what alternatives should I look into? Options I have considered are:
    • Put the celery command into a separate script file and run it via the PRE_BUILD_COMMAND or POST_BUILD_COMMAND settings.
    • Use supervisord to daemonize the celery worker.
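For the supervisord option, a minimal program section might look like the following. This is only a sketch; the file location, working directory, and log paths are assumptions, not something I have verified on App Service:

```ini
; celery.conf — hypothetical supervisord program section
[program:celery]
command=celery -A application.celery worker -l INFO
directory=/home/site/wwwroot
autostart=true
autorestart=true
stdout_logfile=/home/LogFiles/celery.log
redirect_stderr=true
```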

I'm not comfortable with Unix scripts, so I haven't explored these other options in detail yet. I'm trying to see whether the startup command/file method (which seems simpler than the other two options) can work with some changes to the command I am using. If that's not possible, I guess I'll have no choice but to delve into the supervisord method.
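For reference, one pattern that is often suggested for App Service on Linux is to point the startup command at a small shell script that launches the worker in the background and then execs gunicorn as the foreground process, so App Service treats gunicorn as the main process. This is a sketch only; the script name and log path are assumptions:

```shell
#!/bin/sh
# startup.sh — hypothetical startup script, set via:
#   az webapp config set --startup-file "startup.sh"
# Assumes application.py exposes both the Flask app ("application")
# and the Celery instance ("celery"), as in the commands above.

# Start the Celery worker in the background; log to /home so the
# output persists and can be inspected via the Kudu console.
celery -A application.celery worker -l INFO >> /home/LogFiles/celery.log 2>&1 &

# exec replaces the shell with gunicorn so it runs as the container's
# main process; if it exits, App Service recycles the container.
exec gunicorn -t 600 -w 5 -b :$PORT application:application
```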



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
