Django and PostgreSQL in Docker - how to run custom SQL to create views after django migrate?
I'm looking to create SQL views after the Django tables are built, since the views rely on tables created by Django models.
The problem is that when a Python script runs via the Dockerfile CMD calling entrypoint.sh,
the connection to the PostgreSQL database from create_views.py fails on the hostname.
I've tried the following hostname options: localhost, db, 0.0.0.0 and 127.0.0.1, to no avail.
e.g.
- psycopg2.OperationalError: connection to server at "0.0.0.0", port 5432 failed: Connection refused
- could not translate host name "db" to address: Temporary failure in name resolution
- connection to server at "localhost" (127.0.0.1), port 5432 failed: Connection refused
I can't use the container's IP address either, as every time you run docker-compose up the containers get different IPs...
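As an aside, rather than hardcoding a host in create_views.py, the connection parameters can be built from the same environment variables the compose file already injects into the app container (DB_HOST, DB_NAME, DB_USER, DB_PWD). A minimal sketch, assuming psycopg2-style keyword arguments; the helper name connection_params and the DB_PORT fallback are mine, not from the original script:

```python
import os

def connection_params(env):
    """Build psycopg2.connect() keyword arguments from an environment mapping.

    Inside the compose network, DB_HOST should name the database *service*
    (or a network alias of it), not localhost or 0.0.0.0.
    """
    return {
        "host": env.get("DB_HOST", "localhost"),
        "dbname": env.get("DB_NAME", "devdb"),
        "user": env.get("DB_USER", "devuser"),
        "password": env.get("DB_PWD", ""),
        "port": int(env.get("DB_PORT", "5432")),
    }

params = connection_params(os.environ)  # then: psycopg2.connect(**params)
```

This way the script and docker-compose.yml agree on a single source of truth for the host name.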
docker-compose.yml
```yaml
services:
  app:
    container_name: django-mhb-0.3.1
    build:
      context: .
    ports:
      - "8000:8000"
    volumes:
      - ./myproject/:/app/
    environment:
      - DB_HOST=db
      - DB_NAME=${POSTGRES_DB}
      - DB_USER=${POSTGRES_USER}
      - DB_PWD=${POSTGRES_PASSWORD}
    depends_on:
      - "postgres"
  postgres:
    container_name: postgres-mhb-0.1.1
    image: postgres:14
    volumes:
      - postgres_data:/var/lib/postgresql/data/
      # The following works. However, it runs before the Dockerfile entrypoint
      # script, so in this case it's trying to create views before the tables exist.
      #- ./myproject/sql/:/docker-entrypoint-initdb.d/
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_DB=${POSTGRES_DB}
      - POSTGRES_USER=${POSTGRES_USER}
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}

volumes:
  postgres_data:
```
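To see which of those candidate hostnames actually resolve from inside the app container, a quick diagnostic can be run there (e.g. via docker compose exec app python). This is a sketch; the hostname list just mirrors the ones tried above:

```python
import socket

# Compose's embedded DNS typically registers each *service* name (here
# "postgres"); a name like "db" only resolves if a service or network
# alias with that name exists in the compose file.
for host in ("localhost", "db", "postgres", "0.0.0.0"):
    try:
        print(host, "->", socket.gethostbyname(host))
    except socket.gaierror as err:
        print(host, "-> unresolvable:", err)
```

Whichever name resolves to the database container's network address is the one create_views.py should use.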
Docker environment variables are in a .env file in the same directory as the Dockerfile and docker-compose.yml.
Django secrets are in a secrets.json file in the Django project directory.
Dockerfile
```dockerfile
### Dockerfile for Django applications ###
# Pull base image
FROM python:3.9-slim-buster AS myapp

# Set work directory
WORKDIR /app

# Set environment variables
ENV PYTHONUNBUFFERED=1
ENV PYTHONDONTWRITEBYTECODE=1

# Compiler and OS libraries
RUN apt-get update \
    && apt-get install -y --no-install-recommends build-essential curl libpq-dev \
    && rm -rf /var/lib/apt/lists/* /usr/share/doc /usr/share/man \
    && apt-get clean \
    && useradd --create-home python

# Install dependencies
COPY --chown=python:python ./requirements.txt /tmp/requirements.txt
COPY --chown=python:python ./scripts /scripts
ENV PATH="/home/python/.local/bin:$PATH"
RUN pip install --upgrade pip \
    && pip install -r /tmp/requirements.txt \
    && rm -rf /tmp/requirements.txt
USER python

# Code and user setup
ENV PATH="/scripts:$PATH"
CMD ["entrypoint.sh"]
```
entrypoint.sh
```sh
#!/bin/sh
set -e
echo "start of entrypoint"
whoami
pwd

python manage.py wait_for_db
python manage.py makemigrations
python manage.py migrate
python manage.py djstripe_sync_models product plan
python manage.py shell < docs/build-sample-data.py
## The issue arises when running this script ##
python manage.py shell < docs/create_views.py
python manage.py runserver 0.0.0.0:8000
```
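For reference, wait_for_db above is a custom management command. The retry idea behind such a command can be sketched as a plain helper; the names wait_for, attempts and delay are illustrative, not the author's actual command:

```python
import time

def wait_for(check, attempts=10, delay=1.0):
    """Call `check` repeatedly until it returns without raising.

    Re-raises the last exception once all attempts are exhausted, so a
    database that never comes up still fails the entrypoint loudly.
    """
    last = None
    for _ in range(attempts):
        try:
            return check()
        except Exception as exc:  # e.g. psycopg2.OperationalError
            last = exc
            time.sleep(delay)
    raise last

# Usage idea: wait_for(lambda: psycopg2.connect(**params))
```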
create_views.py
```python
#!/usr/bin/env python
import psycopg2 as db_connect

def get_connection():
    try:
        return db_connect.connect(
            database="devdb",
            user="devuser",
            password="devpassword",
            host="0.0.0.0",
            port=5432,
        )
    except (db_connect.Error, db_connect.OperationalError) as e:
        print("Database connection error:", e)
        return False

try:
    conn = get_connection()
    ...
```
I've removed the rest of the script as it's unnecessary
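One hedged alternative: because create_views.py is piped into manage.py shell, Django has already loaded its database settings, so the views could be created through Django's own connection (django.db.connection) instead of opening a second psycopg2 connection, which sidesteps the hostname question entirely. A sketch; the view name order_totals and its query are placeholders, not from the real script:

```python
def create_views(connection):
    """Execute CREATE VIEW statements on an open DB-API style connection.

    When run via `manage.py shell`, pass `django.db.connection` so the
    script reuses the host and credentials Django is already configured with.
    """
    views = {
        "order_totals": "SELECT ...",  # placeholder query
    }
    statements = [
        "CREATE OR REPLACE VIEW {} AS {}".format(name, query)
        for name, query in views.items()
    ]
    with connection.cursor() as cursor:
        for statement in statements:
            cursor.execute(statement)
    return statements

# In create_views.py this would be:
#   from django.db import connection
#   create_views(connection)
```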
When I run Django/PostgreSQL outside of Docker on my local machine, localhost works fine, as you would expect.
Hoping someone can help; it's doing my head in and I've spent a few days looking for a possible answer.
Thanks
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow