PostgreSQL and MySQL Docker Containers

DB, docker, mysql, postgresql

To quickly get instances of PostgreSQL and MySQL up and running, use the following Docker Compose setup:

Create a subdirectory to hold the docker-compose.yml and, optionally, the data files for the DBs:

Windows

cd %USERPROFILE%
mkdir dbs\data\mysql
mkdir dbs\data\psql
cd dbs

Others

cd ~
mkdir -p dbs/data/mysql
mkdir -p dbs/data/psql
cd dbs

Add this docker-compose.yml to start with:

version: '3.1'
services:
  mysql:
    image: mysql
    command: --default-authentication-plugin=mysql_native_password
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: password
    ports:
      - "3306:3306"
    expose:
      - "3306"
    volumes:
      - ./data/mysql:/var/lib/mysql
  postgresql:
    image: postgres
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=password
    ports:
      - "5432:5432"
    expose:
      - "5432"
    volumes:
      - ./data/psql:/var/lib/postgresql/data

To bring them up:

docker-compose up

By default:

  • The PostgreSQL instance uses postgres / password as the admin credentials.
  • The MySQL instance uses root / password as the admin credentials.
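
To verify that both are up, connect with each DB’s CLI client inside its container (the container names below assume the compose project directory is named dbs; check docker ps for the actual names):

docker exec -it dbs_postgresql_1 psql -U postgres
docker exec -it dbs_mysql_1 mysql -u root -p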

Django Dev Env w/ Docker Compose

django, docker, postgresql, programming

A few years back I ran into a problem when working with Django on Windows while my colleagues were on Mac OS: a datetime routine (I forget which one) behaved differently between our machines. Even after we synced our Python and Django versions, the discrepancy remained. It turned out to be due to differences between Python on Windows and Python on Mac OS. We ended up working around it by not using that routine.

Thinking back now, I guess the problem could’ve been avoided if we had used Docker or Vagrant or something similar, so that we would at least all be on the same environment. It’s the kind of thing a “real” work environment would’ve enforced. But since we were working on that project on our own as a hobby, we didn’t think too much about it.

ALSO: Docker Desktop (or even Linux on Windows Home via WSL) was not available at the time, so I most likely would’ve had to wrestle with Docker Toolbox and VirtualBox, which still had problems with host volumes.

UPDATE: this post was updated in 2022-05 based on new learnings.

Setting Up Environment in Docker

If I were to do it now, this is how I would do it:

  • Create a subdirectory for DB data. We were using PostgreSQL, so I would create something like C:\dbdata\ and use a host volume to mount it to the container’s /var/lib/postgresql/data.
  • Use the postgres and python:3 base images from Docker Hub.

Step-by-step, here’s how I would set it up:

Project scaffold

NOTE: the following uses “myproject” as the name of the Django project. Replace it with the name of your Django project as appropriate.

cd dev/projects
mkdir dj
cd dj

Create two starter versions of Dockerfile and docker-compose.yml:

Dockerfile

FROM python:3.7-buster
ENV PYTHONUNBUFFERED 1

WORKDIR /code
#COPY Pipfile Pipfile.lock /code/
#
RUN pip install pipenv
#RUN pipenv install

docker-compose.yml

version: '3'
services:
  app:
    build: .
#    command: >
#      sh -c "pipenv run python manage.py migrate &&
#             pipenv run python manage.py runserver 0.0.0.0:8000"
    ports:
      - "8000:8000"
    expose:
      - "8000"
    volumes:
      - ./:/code
    tty: true
    stdin_open: true

Then build and start up the containers:

docker-compose build
docker-compose run --rm app /bin/bash
pipenv install
pipenv install django 
pipenv install <other stuff as needed>

pipenv run django-admin startproject myproject .
pipenv run django-admin startapp myapp
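
At this point, the project directory should look roughly like this (Pipfile and Pipfile.lock are generated by pipenv; manage.py, myapp/, and myproject/ by django-admin):

dj/
├── Dockerfile
├── docker-compose.yml
├── Pipfile
├── Pipfile.lock
├── manage.py
├── myapp/
└── myproject/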

Now uncomment the lines previously commented in Dockerfile and docker-compose.yml.
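
For reference, after uncommenting, Dockerfile should read:

FROM python:3.7-buster
ENV PYTHONUNBUFFERED 1

WORKDIR /code
COPY Pipfile Pipfile.lock /code/

RUN pip install pipenv
RUN pipenv install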

PostgreSQL Setup

Modify myproject/settings.py to use PostgreSQL:

...
DATABASES = {
    #'default': {
    #    'ENGINE': 'django.db.backends.sqlite3',
    #    'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    #}
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',  # MUST match the service name for the DB
        'PORT': 5432,
    }
}
...
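
Optionally, these settings can read from environment variables instead of hardcoding credentials. A minimal sketch, assuming DB_* variables (hypothetical names) are passed to the app container via docker-compose.yml:

import os  # at the top of settings.py

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        # Fall back to the hardcoded values above when the variables are unset
        'NAME': os.environ.get('DB_NAME', 'postgres'),
        'USER': os.environ.get('DB_USER', 'postgres'),
        'PASSWORD': os.environ.get('DB_PASSWORD', 'postgres'),
        'HOST': os.environ.get('DB_HOST', 'db'),
        'PORT': int(os.environ.get('DB_PORT', '5432')),
    }
}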

All pipenv-related operations should be done inside the container:

docker-compose run --rm app /bin/bash
pipenv install psycopg2-binary

Modify docker-compose.yml to bring up the DB and app containers:

version: '3'
services:
  # service name must match the HOST in myproject/settings.py's DATABASES
  db:
    image: postgres
    environment:
      # Must match the values in myproject/settings.py's DATABASES
      - POSTGRES_DB=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      # Put the DB data for myproject under myproject_db 
      # so that I can add more projects later
      - PGDATA=/var/lib/postgresql/data/myproject_db
    ports:
      - "5432:5432"
    expose:
      - "5432"
    volumes:
      # host volume where DB data are actually stored
      - c:/dbdata:/var/lib/postgresql/data
  app:
    build: .
    command: >
      sh -c "pipenv run python manage.py migrate &&
             pipenv run python manage.py runserver 0.0.0.0:8000"
    ports:
      - "8000:8000"
    expose:
      - "8000"
    volumes:
      - ./:/code
    depends_on:
      - db

The above:

  • sets up two “services” (containers): a “db” service for the DB and an “app” service for the app.
  • sets up a host mount (for the “db” service) of c:\dbdata to the container’s /var/lib/postgresql/data, where PostgreSQL stores the data for the DBs. This allows the data to persist beyond the container’s lifetime.
  • sets the PGDATA environment variable, which tells PostgreSQL to use /var/lib/postgresql/data/myproject_db as its data subdirectory; because of the mount, that ends up as c:\dbdata\myproject_db on my Windows host. This allows c:\dbdata to serve as a parent directory for multiple project DBs (see the example below).
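
For instance, a second project could reuse the same c:\dbdata host mount and just point PGDATA at a different subdirectory (otherproject_db is a hypothetical name):

      - PGDATA=/var/lib/postgresql/data/otherproject_db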

Bring Up The Environment

Just run:

docker-compose up --build app

The above will:

  • Build the images and start the containers for the db and app services.
  • Initialize a new empty PostgreSQL database.
  • Run the Django migrations to prime the database for Django.
  • Run the app and have it listen on port 8000.
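
Once everything is up, the app should be reachable from the host at http://localhost:8000/ thanks to the 8000:8000 port mapping.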

NOTE: there may be a race condition on the first run, where the DB is still being built/initialized when the app service starts.

This error happens in that case:

app_1 | psycopg2.OperationalError: could not connect to server: Connection refused
app_1 | Is the server running on host "db" (172.19.0.2) and accepting
app_1 | TCP/IP connections on port 5432?

Just wait until the “db” service has finished initializing, hit CTRL-C, and run the

docker-compose up --build app

command again. It should now work fine.

Optionally, start up the “db” service first in the background, then start up the “app” service:

docker-compose up -d db
docker-compose up app
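
Alternatively, a healthcheck on the “db” service lets Compose hold the “app” service back until PostgreSQL is actually accepting connections. A sketch, assuming a Compose version that supports the long-form depends_on (Compose V2 / the Compose Specification; compose file format 3.x dropped condition support):

  db:
    image: postgres
    # ... rest of the db config as above ...
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 5s
      timeout: 5s
      retries: 5
  app:
    build: .
    # ... rest of the app config as above ...
    depends_on:
      db:
        condition: service_healthy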