How to Connect Django to PostgreSQL in Production
Problem statement
Connecting Django to PostgreSQL in production is more than changing the DATABASES setting. You need a database user with the right privileges, a supported PostgreSQL driver in the deploy environment, secure secret handling, correct SSL settings where required, a safe migration sequence, and a way to verify that production traffic is actually hitting PostgreSQL.
This guide shows a practical Django PostgreSQL production setup for Linux servers and container-based deployments. It focuses on connection settings, secret handling, SSL, migrations, verification, and rollback planning.
If you are replacing SQLite in an existing app, there is an extra risk: creating the PostgreSQL schema is not the same as moving existing data. This guide calls that out explicitly so you do not cut over production with an empty database.
Quick answer
To connect Django to PostgreSQL in production safely:
- Create a dedicated PostgreSQL database and app user.
- Install a supported PostgreSQL driver in the deploy environment.
- Configure Django `DATABASES` to use PostgreSQL explicitly.
- Load credentials from environment variables or a secrets manager.
- Enable SSL settings that match your provider, preferably with certificate verification for remote databases.
- Test connectivity from the same environment your app server uses.
- Run migrations carefully, knowing that `migrate` creates schema but does not transfer existing SQLite data.
- Restart the app server and verify real reads and writes.
Step-by-step solution
Step 1 — Prepare PostgreSQL for production use
Create the production database and application user
Create a dedicated database and role for the app instead of using the PostgreSQL superuser.
CREATE DATABASE myapp_prod;
CREATE USER myapp_user WITH PASSWORD 'replace-with-strong-password';
GRANT CONNECT ON DATABASE myapp_prod TO myapp_user;
If you use the default public schema, connect to the database and grant schema privileges:
\c myapp_prod
GRANT USAGE, CREATE ON SCHEMA public TO myapp_user;
For many Django deployments, making the app user the owner of the database and the objects it creates is simpler operationally than managing granular DDL privileges manually.
ALTER DATABASE myapp_prod OWNER TO myapp_user;
That avoids common migration failures caused by missing permissions for schema changes.
Grant only the required privileges
Use a dedicated role per app and avoid reusing an admin account. The app usually needs to read, write, and run migrations, but it should not have broader cluster-level privileges than necessary.
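To confirm the new role can authenticate and reach the right database end to end, a quick check with `psql` from the app host looks like this; the database and user names follow the examples above, and `DB_HOST` is a placeholder for your actual host:

```shell
# Connects as the application user and runs a trivial query.
# psql prompts for the password; add sslmode=require if your provider needs TLS.
psql "host=DB_HOST port=5432 dbname=myapp_prod user=myapp_user" -c "SELECT 1;"
```

Unlike a raw TCP check, this proves authentication and basic connect privileges for the app role.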
Confirm host, port, database name, username, and SSL requirements
Before changing Django settings, confirm:
- database host
- port, usually `5432`
- database name
- username
- password
- whether SSL is required
- whether your provider gives you a CA certificate for verification
This matters especially for managed PostgreSQL services, which often require TLS and may expect certificate validation.
Verification check
From the app host, confirm basic TCP reachability:
nc -zv DB_HOST 5432
This only confirms that the port is reachable. It does not prove PostgreSQL authentication, SSL, or Django settings are correct.
If it fails, check firewall rules, security groups, PostgreSQL listen_addresses, and host-based access rules.
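If `nc` is not installed on the app host, the same TCP-level reachability check can be sketched with Python's standard library:

```python
import socket


def can_reach(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Example: can_reach("DB_HOST", 5432)
```

Like `nc -zv`, this confirms only that the port accepts connections; it says nothing about authentication, SSL, or Django settings.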
Step 2 — Install the PostgreSQL driver in Django
Choose psycopg or psycopg2-binary appropriately
For current Django deployments, prefer psycopg (version 3), the modern PostgreSQL adapter; Django 4.2 and later support it natively. psycopg2-binary is still widely used and works well in many existing deployments.
Install one of these, not both unless you have a specific reason.
pip install "psycopg[binary]"
Or:
pip install psycopg2-binary
Add the dependency to requirements and rebuild the environment
pip freeze > requirements.txt
If you deploy with Docker, rebuild the image. If you deploy to a virtualenv on a server, reinstall dependencies there.
Verify the driver imports successfully
python -c "import psycopg; print('psycopg ok')"
Or for psycopg2:
python -c "import psycopg2; print('psycopg2 ok')"
Verification check
Run the import test in the same environment your app server uses. A successful local import does not help if Gunicorn or Uvicorn runs from a different virtualenv, image, or container.
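If you are unsure which adapter a given environment actually has, a small stdlib-only sketch can probe for whichever one is importable:

```python
import importlib


def detect_pg_driver():
    """Return the name of the first installed PostgreSQL adapter, or None."""
    for name in ("psycopg", "psycopg2"):
        try:
            importlib.import_module(name)
            return name
        except ImportError:
            continue
    return None
```

Run it inside the deploy environment (container, virtualenv) to catch the case where the driver is installed locally but missing where Gunicorn or Uvicorn runs.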
Step 3 — Configure Django database settings for PostgreSQL
Replace SQLite defaults with a PostgreSQL DATABASES configuration
In production settings, switch the engine to PostgreSQL explicitly.
```python
import os

db_options = {}
db_sslmode = os.environ.get("DB_SSLMODE")
db_sslrootcert = os.environ.get("DB_SSLROOTCERT")

if db_sslmode:
    db_options["sslmode"] = db_sslmode
if db_sslrootcert:
    db_options["sslrootcert"] = db_sslrootcert

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ["DB_NAME"],
        "USER": os.environ["DB_USER"],
        "PASSWORD": os.environ["DB_PASSWORD"],
        "HOST": os.environ["DB_HOST"],
        "PORT": os.environ.get("DB_PORT", "5432"),
        "CONN_MAX_AGE": int(os.environ.get("DB_CONN_MAX_AGE", "60")),
        "CONN_HEALTH_CHECKS": True,
        "OPTIONS": db_options,
    }
}
```
Read credentials from environment variables
Do not hardcode credentials in settings.py. A simple environment contract looks like this:
DB_NAME=myapp_prod
DB_USER=myapp_user
DB_PASSWORD=replace-me
DB_HOST=10.0.0.15
DB_PORT=5432
DB_SSLMODE=verify-full
DB_SSLROOTCERT=/etc/ssl/certs/my-postgres-ca.pem
DJANGO_SETTINGS_MODULE=config.settings.production
Do not commit .env files with real credentials to version control.
Set connection options such as CONN_MAX_AGE, health checks, and SSL mode
If the PostgreSQL server is remote or managed, use the SSL settings required by your provider. Over untrusted networks, prefer certificate verification over encryption without verification.
`sslmode=require` encrypts the connection but does not verify server identity. `sslmode=verify-full` with `sslrootcert` is stronger when your provider supplies a CA certificate.
If your provider documents a different SSL combination, follow that documentation.
Separate development and production settings cleanly
Keep SQLite in development if that fits your workflow, but isolate it from production settings. Do not leave fallback logic that silently uses SQLite in production when env vars are missing.
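One way to enforce that rule is to validate the environment before building `DATABASES`. This is a sketch; the variable names match the environment contract above:

```python
import os

REQUIRED_DB_VARS = ("DB_NAME", "DB_USER", "DB_PASSWORD", "DB_HOST")


def missing_db_vars(environ=None):
    """Return the names of required database variables that are unset or empty."""
    environ = os.environ if environ is None else environ
    return [name for name in REQUIRED_DB_VARS if not environ.get(name)]


def validate_db_env():
    """Fail loudly at settings import time instead of falling back to SQLite."""
    missing = missing_db_vars()
    if missing:
        raise RuntimeError("Missing database settings: " + ", ".join(missing))
```

Calling `validate_db_env()` at the top of the production settings module turns a misconfigured deploy into an immediate, obvious startup failure.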
Verification check
Run:
python manage.py check --deploy
Step 4 — Store secrets safely in production
Use environment variables or a secrets manager
For a Linux server with systemd, secrets are commonly injected through an environment file or service environment settings. For containers, use runtime environment variables or your orchestrator’s secret mechanism.
Example systemd snippet:
[Service]
EnvironmentFile=/etc/myapp/myapp.env
Make sure that file is readable only by the appropriate account or group:
sudo chown root:myapp /etc/myapp/myapp.env
sudo chmod 640 /etc/myapp/myapp.env
Adjust the group to match the user or group your service actually runs as.
Avoid committing database credentials to the repository
Keep secrets out of:
- `settings.py`
- committed `.env` files
- CI logs
- shell history where possible
File permission and process environment considerations
Anyone who can read the deployed environment can often read database credentials. Limit shell access on the host, lock down secret files, and confirm the process manager loads the same environment that you test manually.
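As part of a deployment check, the file-permission rule above can be verified programmatically. This is a POSIX-only sketch; the path is the example env file from earlier:

```python
import os
import stat


def world_readable(path):
    """Return True if the file grants read access to 'other' users (POSIX)."""
    return bool(os.stat(path).st_mode & stat.S_IROTH)


# Example deployment guard (path is the env file used in this guide):
# if world_readable("/etc/myapp/myapp.env"):
#     raise SystemExit("env file is world-readable; run chmod 640 on it")
```

A check like this in a release script catches a secrets file that was recreated with loose permissions.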
Step 5 — Test the database connection before deployment
Run Django system checks
python manage.py check --deploy
Open a Django shell and verify a live database query
python manage.py shell
Then:
```python
from django.db import connection

print(connection.vendor)

with connection.cursor() as cursor:
    cursor.execute("SELECT current_database(), current_user;")
    print(cursor.fetchone())
```
You should see postgresql as the vendor and the expected database and user.
Check application startup logs for connection errors
After updating environment or settings, inspect app server logs before routing traffic:
sudo journalctl -u gunicorn -n 100 --no-pager
Use your actual service name if it is not gunicorn. In containers, use the equivalent container log command.
Rollback note
Do not remove the old release or old environment file until the new process starts successfully and database connectivity is confirmed.
Step 6 — Run migrations safely in production
Important: schema migration is not data migration
If you are switching an existing app from SQLite to PostgreSQL, python manage.py migrate creates the PostgreSQL schema but does not copy data from SQLite.
You need a separate export/import or data migration plan before cutting over production traffic. Keep the old database available until the new PostgreSQL-backed deployment is verified.
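One possible export/import path for small to medium datasets uses Django's built-in fixture commands. This is a sketch: the production settings module matches the contract earlier in this guide, while `config.settings.development` stands in for whatever settings module still points at SQLite. For large datasets, use a dedicated, tested migration pipeline instead.

```shell
# Export from the old SQLite-backed settings.
python manage.py dumpdata \
    --natural-foreign --natural-primary \
    --exclude contenttypes --exclude auth.permission \
    --output data.json \
    --settings=config.settings.development

# Create the schema in PostgreSQL, then import.
python manage.py migrate --settings=config.settings.production
python manage.py loaddata data.json --settings=config.settings.production
```

Whatever approach you use, verify row counts and critical records in PostgreSQL before cutting over traffic.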
Back up the current database before schema changes
Before production migrations, take a backup or create a managed snapshot. This matters for normal PostgreSQL schema changes and also for any cutover where you are moving from SQLite or another source database.
Apply migrations with the correct settings module
python manage.py showmigrations
python manage.py migrate --noinput
If you use an explicit settings module:
python manage.py migrate --noinput --settings=config.settings.production
Verify migration status after deployment
python manage.py showmigrations
All expected migrations should be marked as applied.
Notes on zero-downtime and migration ordering
For safer releases, use backward-compatible migrations where possible:
- deploy code that is compatible with both old and new schema
- run migrations
- restart app processes
- verify health
If a migration is destructive or not backward-compatible, use an explicit multi-step rollout instead of a simple in-place deploy.
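As an illustration of a backward-compatible change, adding a nullable field lets old and new code run side by side; the app, model, field, and migration names below are hypothetical:

```python
# myapp/migrations/0008_add_notes.py (hypothetical names throughout)
from django.db import migrations, models


class Migration(migrations.Migration):
    # Adding a nullable column is backward-compatible: old code that never
    # reads or writes "notes" keeps working against the new schema.
    dependencies = [("myapp", "0007_previous")]

    operations = [
        migrations.AddField(
            model_name="order",
            name="notes",
            field=models.TextField(null=True, blank=True),
        ),
    ]
```

Tightening the column later (for example, making it non-nullable with a default) would be a separate migration shipped after all app processes run code that writes the field.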
Step 7 — Integrate with the app server and release process
Ensure Gunicorn or Uvicorn inherits the correct environment
If the app server does not receive DB_* variables, Django may fail with ImproperlyConfigured or connection errors. Confirm the process manager loads the same environment you tested manually.
Order of operations in a release
A safe release sequence is:
- upload new code
- install dependencies
- load updated environment
- run checks
- verify PostgreSQL connectivity
- back up the database
- run migrations
- restart Gunicorn or Uvicorn
- verify application health
Avoid restarting incompatible code against a changed schema
Keep code and schema compatibility in mind. If a migration is destructive, plan the rollout so old and new app versions cannot conflict.
When to turn this into a reusable script or template
Once you are repeating the same environment validation, backup trigger, migration commands, and service restart on every release, this workflow is a good candidate for a script or CI/CD job. A reusable template is especially useful for validating required DB_* variables and enforcing release order before the app restarts.
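A minimal sketch of such a release script is below, assuming bash, a project at `/srv/myapp` with a virtualenv, a Gunicorn systemd unit, and password handling via `.pgpass` or `PGPASSWORD` (all of these are assumptions, not requirements):

```shell
#!/usr/bin/env bash
# Sketch of a release script enforcing the order above; paths and names are assumptions.
set -euo pipefail

cd /srv/myapp
source venv/bin/activate

# Fail fast if required database variables are missing.
for var in DB_NAME DB_USER DB_PASSWORD DB_HOST; do
    if [ -z "${!var:-}" ]; then
        echo "Missing required variable: $var" >&2
        exit 1
    fi
done

python manage.py check --deploy

# Back up before schema changes (custom-format dump, restorable with pg_restore).
pg_dump "$DB_NAME" -h "$DB_HOST" -U "$DB_USER" -Fc -f "backup-$(date +%F).dump"

python manage.py migrate --noinput
sudo systemctl restart gunicorn
```

Running the same script from CI/CD removes the chance of a human skipping the backup or restarting before migrations finish.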
Step 8 — Verify the production setup
Confirm Django is using PostgreSQL instead of SQLite
In a Django shell:
from django.conf import settings
print(settings.DATABASES["default"]["ENGINE"])
It should print:
django.db.backends.postgresql
Validate read and write behavior through the app
Test a real application path that reads from and writes to the database. Do not rely only on startup success.
Check connection stability
If you set CONN_MAX_AGE, monitor for connection drops or stale connections in logs. Setting CONN_HEALTH_CHECKS to True helps in environments where connections may be interrupted by failover or idle timeouts.
Explanation
This setup works because it separates deployment concerns clearly:
- Django uses PostgreSQL through a supported adapter.
- Secrets are injected at runtime instead of stored in code.
- Production settings are explicit, so SQLite cannot be used accidentally.
- SSL options are configurable for managed or remote PostgreSQL.
- Connectivity is verified before the app is restarted.
- Migrations are treated as a release step with backup and rollback planning.
- Data movement is handled separately from schema creation when moving off SQLite.
Use a dedicated app role to reduce blast radius if credentials leak. Use verified TLS settings for remote databases when your provider supports them. Keep production settings isolated so the deploy fails fast if required database variables are missing.
Edge cases and notes
Switching from SQLite to PostgreSQL
This page covers connection and cutover safety, not full data migration tooling. If your production app already has data in SQLite, do not assume Django migrations will move it. Plan and test a separate export/import process, verify row counts and critical records, and keep the old database available until cutover is complete.
Connecting to managed PostgreSQL providers
Managed services often require SSL and may use DNS endpoints that change during failover. Use the provider’s recommended connection settings and CA certificate path if certificate verification is supported.
Using Unix sockets vs TCP host connections
If PostgreSQL runs on the same host, Unix sockets can work well. In that case, HOST may be a socket directory rather than an IP or hostname. For remote databases, use TCP.
Handling firewall or security group restrictions
If Django cannot connect, check host reachability first, then PostgreSQL listen settings, then access rules. Many database connection failures are really network policy issues.
Missing env vars and startup ordering
Watch for startup failures caused by missing env vars in systemd, containers starting before secrets are mounted, or old processes keeping stale credentials after secret rotation.
Internal links
For broader production configuration, see Django production settings best practices.
To complete the web stack after database setup, follow Deploy Django with Gunicorn and Nginx on Ubuntu.
For an ASGI deployment path, see Deploy Django ASGI with Uvicorn and Nginx.
For safer releases around schema changes, read how to run Django migrations in production.
If the app still cannot connect, use this runbook to troubleshoot Django database connection errors in production.
FAQ
Should I use psycopg or psycopg2 for Django in production?
For new deployments, psycopg is the modern choice if it fits your Django and platform versions. psycopg2-binary is still common and works well in many deployments. The important part is to use one supported adapter consistently in the same environment as your app server.
How do I store PostgreSQL credentials safely for a Django deployment?
Use environment variables or a secrets manager provided by your platform. Do not commit credentials to the repository or hardcode them in Django settings.
Do I need SSL between Django and PostgreSQL in production?
If the database is remote, managed, or on an untrusted network path, yes. Prefer certificate verification when your provider gives you a CA certificate. sslmode=require encrypts traffic, but verify-full with a trusted CA is stronger.
Does python manage.py migrate move my existing SQLite data into PostgreSQL?
No. migrate creates and updates schema in PostgreSQL, but it does not copy existing SQLite data. If you are switching databases, plan a separate data migration or export/import step before cutover.
What is the safest order for switching Django from SQLite to PostgreSQL in production?
Provision PostgreSQL first, configure Django settings and secrets, verify connectivity, create the PostgreSQL schema, migrate or import application data separately, test the new environment, then cut over traffic. Keep the old database and previous release available until the new deployment is verified.
What should I back up before running Django migrations against PostgreSQL?
Back up the production PostgreSQL database or create a provider snapshot before schema changes. If you are transitioning from SQLite, preserve the SQLite database as well until the PostgreSQL deployment and imported data are fully verified.