Airflow runs a connections check against the metadata DB before "db init" creates the tables

I am running Airflow locally from a Dockerfile, .env, docker-compose.yaml and entrypoint.sh,

as "docker-compose -f docker-compose.yaml up"

Just after "airflow db init" runs in entrypoint.sh, I get the following errors:

After that, everything is fine and I can run Airflow, but this drives me crazy. Can anyone help me resolve it, please?

What is strange is that the service queries the connection table even before the DB has been initialized:

airflow_webserver  | initiating db
airflow_webserver  | DB: postgresql://airflow:***@airflow_metadb:5432/airflow
airflow_webserver  | [2022-02-22 13:52:26,318] {db.py:929} INFO - Dropping tables that exist
airflow_webserver  | [2022-02-22 13:52:26,570] {migration.py:201} INFO - Context impl PostgresqlImpl.
airflow_webserver  | [2022-02-22 13:52:26,570] {migration.py:204} INFO - Will assume transactional DDL.
airflow_metadb     | 2022-02-22 13:52:26.712 UTC [71] ERROR:  relation "connection" does not exist at character 55
airflow_metadb     | 2022-02-22 13:52:26.712 UTC [71] STATEMENT:  SELECT connection.conn_id AS connection_conn_id 
airflow_metadb     |    FROM connection GROUP BY connection.conn_id 
airflow_metadb     |    HAVING count(*) > 1
airflow_metadb     | 2022-02-22 13:52:26.714 UTC [72] ERROR:  relation "connection" does not exist at character 55
airflow_metadb     | 2022-02-22 13:52:26.714 UTC [72] STATEMENT:  SELECT connection.conn_id AS connection_conn_id 
airflow_metadb     |    FROM connection 
airflow_metadb     |    WHERE connection.conn_type IS NULL
airflow_webserver  | [2022-02-22 13:52:26,733] {db.py:921} INFO - Creating tables

Airflow 2.2.3
Postgres 13

In the Dockerfile:

ENTRYPOINT ["/entrypoint.sh"]
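
entrypoint.sh itself is not shown here; reconstructed from the log output, it presumably does something like the following. Only the "initiating db" echo and the init call are confirmed by the logs; the rest is an assumed pattern, and the "Dropping tables that exist" log line suggests the script may actually call "airflow db reset -y" rather than a plain "airflow db init":

#!/usr/bin/env bash
set -e

echo "initiating db"   # matches the first log line above
airflow db init        # create/upgrade the metadata tables

# Hand off to the Airflow CLI; with "command: webserver" in
# docker-compose.yaml this becomes "airflow webserver".
exec airflow "$@"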

In docker-compose.yaml:

webserver:
    env_file: ./.env
    image: airflow
    container_name: airflow_webserver
    restart: always
    depends_on:
      - postgres
    environment:
      <<: *env_common
      AIRFLOW__CORE__LOAD_EXAMPLES: ${AIRFLOW__CORE__LOAD_EXAMPLES}
      AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: ${AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION}
      EXECUTOR: ${EXECUTOR}
      _AIRFLOW_DB_UPGRADE: ${_AIRFLOW_DB_UPGRADE}
      _AIRFLOW_WWW_USER_CREATE: ${_AIRFLOW_WWW_USER_CREATE}
      _AIRFLOW_WWW_USER_USERNAME: ${_AIRFLOW_WWW_USER_USERNAME}
      _AIRFLOW_WWW_USER_PASSWORD: ${_AIRFLOW_WWW_USER_PASSWORD}
      _AIRFLOW_WWW_USER_ROLE: ${_AIRFLOW_WWW_USER_ROLE}
      _AIRFLOW_WWW_USER_EMAIL: ${_AIRFLOW_WWW_USER_EMAIL}
    logging:
      options:
        max-size: 10m
        max-file: "3"
    volumes:
      - ./dags:bla-bla
      - ./logs:bla-bla
    ports:
      - ${AIRFLOW_WEBSERVER_PORT}:${AIRFLOW_WEBSERVER_PORT}
    command: webserver
    healthcheck:
      test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3
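
For completeness, the compose file also defines the postgres service referenced by depends_on. It is not shown above, but from the container name and connection string in the logs it presumably looks roughly like this (the image tag and the credential variables are assumptions; the actual password is masked in the logs):

postgres:
    env_file: ./.env
    image: postgres:13
    container_name: airflow_metadb
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}  # assumed variable name
      POSTGRES_DB: airflow
    ports:
      - 5432:5432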


