Very often, I find myself needing to run postgres and pgadmin locally for side projects. Installing a server locally and then installing pgadmin on top of it is a pain, especially since I want to use the latest versions all the time. As a solution, I always run docker containers for both. To do that, I have a
`docker-compose.yml` file with the following:
```yaml
version: "3.7"

services:
  db:
    image: postgres
    restart: always
    environment:
      POSTGRES_DB: postgres
      POSTGRES_USER: admin
      POSTGRES_PASSWORD: password
      PGDATA: /var/lib/postgresql/data
    volumes:
      - db-data:/var/lib/postgresql/data
    ports:
      - "5432:5432"
    networks:
      - postgres

  pgadmin:
    image: dpage/pgadmin4
    restart: always
    environment:
      PGADMIN_DEFAULT_EMAIL: [email protected]
      PGADMIN_DEFAULT_PASSWORD: password
      PGADMIN_LISTEN_PORT: 80
    ports:
      - "8080:80"
    volumes:
      - pgadmin-data:/var/lib/pgadmin
    links:
      - "db:pgsql-server"
    networks:
      - postgres

volumes:
  db-data:
  pgadmin-data:

networks:
  postgres:
    driver: bridge
```
Then I can run
`docker-compose up`. That will start an instance of postgres and one of pgadmin4, and will enable connectivity between them. Because I declare volumes, anything I store in postgres (and pgadmin's configuration) will persist even if a container crashes or is removed. I definitely recommend this setup.
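For a quick sanity check after bringing the stack up, something like the following works, assuming the compose file above is saved as `docker-compose.yml` in the current directory (the service name `db`, the credentials, and the `pgsql-server` alias all come from that file):

```shell
# Start both containers in the background
docker-compose up -d

# Open a psql session inside the running postgres container,
# using the POSTGRES_USER and POSTGRES_DB values from the compose file
docker-compose exec db psql -U admin -d postgres

# pgadmin4 is now reachable at http://localhost:8080 — log in with
# [email protected] / password, then register a new server using
# host "pgsql-server" (the link alias) or "db", port 5432
```

When registering the server in pgadmin, note that the hostname is resolved inside the docker network, so `localhost` won't work there; use the service name or link alias instead.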