Self-Hosting with Docker
Docker is the easiest way to get started with self-hosted Supabase.
Before you begin#
You need the following installed on your system:
- Docker and docker compose
- Git
Quick Start#
Get the code#
Check out the docker directory in the Supabase repo:
# Get the code
git clone --depth 1 https://github.com/supabase/supabase
# Go to the docker folder
cd supabase/docker
# Copy the fake env vars
cp .env.example .env
# Start
docker compose up
Now visit http://localhost:3000 to start using Supabase Studio.
Securing your setup#
While we provide example secrets to get you started, you should NEVER deploy your Supabase setup using these defaults.
Follow these steps to secure your Docker setup. We strongly recommend using a secrets manager when deploying to production.
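For example, a strong JWT_SECRET (commonly recommended to be at least 32 characters) and a database password can be generated on the command line. This is just a sketch; any cryptographically secure generator works:
# Generate a 40-character hex value suitable for JWT_SECRET
openssl rand -hex 20
# Generate a random database password
openssl rand -hex 16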
Generate API Keys#
Use your JWT_SECRET to generate anon and service API keys using the JWT generator.
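If you prefer not to paste your JWT_SECRET into a browser, the keys can also be signed locally. The sketch below uses openssl to produce an HS256 JWT for the anon role; the payload fields (role, iss, iat, exp) mirror what the generator typically produces, so compare the result against the generator's output before relying on it:
# Sketch: sign an HS256 JWT for the anon role (swap "anon" for "service_role"
# to produce the service key). JWT_SECRET must match the value in your .env file.
JWT_SECRET="your-jwt-secret"
NOW=$(date +%s)
EXP=$((NOW + 5 * 365 * 24 * 60 * 60))   # roughly 5 years from now
b64url() { openssl base64 -A | tr '+/' '-_' | tr -d '='; }
HEADER=$(printf '{"alg":"HS256","typ":"JWT"}' | b64url)
PAYLOAD=$(printf '{"role":"anon","iss":"supabase","iat":%s,"exp":%s}' "$NOW" "$EXP" | b64url)
SIG=$(printf '%s.%s' "$HEADER" "$PAYLOAD" | openssl dgst -sha256 -hmac "$JWT_SECRET" -binary | b64url)
echo "$HEADER.$PAYLOAD.$SIG"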
Replace the values in these files:
- .env:
  - ANON_KEY - replace with an anon key
  - SERVICE_ROLE_KEY - replace with a service key
- volumes/api/kong.yml:
  - anon - replace with an anon key
  - service_role - replace with a service key
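For reference, the keys live under the consumer definitions in volumes/api/kong.yml. The relevant section looks roughly like the following sketch; check your copy of the file, since the exact layout can differ between releases:
consumers:
  - username: anon
    keyauth_credentials:
      - key: your-anon-key
  - username: service_role
    keyauth_credentials:
      - key: your-service-role-key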
Update Secrets#
Update the .env file with your own secrets. In particular, these are required:
- POSTGRES_PASSWORD: the password for the postgres role.
- JWT_SECRET: used by PostgREST and GoTrue, among others.
- SITE_URL: the base URL of your site.
- SMTP_*: mail server credentials. You can use any SMTP server.
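As a sketch, the updated portion of the .env file would look something like the following; all values are placeholders, and the SMTP variable names should match the ones already present in your .env.example:
POSTGRES_PASSWORD=your-super-secret-and-long-postgres-password
JWT_SECRET=your-super-secret-jwt-token-with-at-least-32-characters-long
SITE_URL=https://example.com
SMTP_HOST=mail.example.com
SMTP_PORT=465
SMTP_USER=mailer@example.com
SMTP_PASS=your-mail-password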
Securing the Dashboard#
The Docker setup doesn't include a management database for managing users and logins. If you plan to expose Studio to the web, we suggest putting it behind a reverse proxy with Basic Auth or hiding it behind a VPN.
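As a sketch, a reverse proxy such as nginx can front Studio with Basic Auth. The hostname and credentials file below are placeholders, and Studio is assumed to be listening on its default port 3000:
server {
    listen 80;
    server_name studio.example.com;                 # placeholder hostname

    location / {
        auth_basic           "Supabase Studio";     # prompt shown by the browser
        auth_basic_user_file /etc/nginx/.htpasswd;  # created with htpasswd
        proxy_pass           http://localhost:3000; # Studio's default port
        proxy_set_header     Host $host;
    }
}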
Configuration#
Each system can be configured to suit your particular use-case.
To keep the setup simple, we made some choices that may not be optimal for production:
- the database is on the same machine as the servers
- Storage uses the filesystem backend instead of S3
- Auth should be configured with a production-ready SMTP server
Using an external database#
We strongly recommend decoupling your database from docker-compose
before deploying.
The middleware will run with any PostgreSQL database that has logical replication enabled. The following environment variables should be updated in the .env file to point to your external database:
POSTGRES_PASSWORD=your-super-secret-and-long-postgres-password
POSTGRES_HOST=db
POSTGRES_DB=postgres
POSTGRES_USER=postgres
POSTGRES_PORT=5432
Once you have done this, you can safely comment out the db section of the docker-compose file and remove any depends_on references to the db service from the other services.
Supabase services require your external database to be initialized with a specific schema. Refer to our postgres/migrations repository for instructions on running these migrations.
Note that you need superuser permission on the postgres role to perform the initial schema migration. Once completed, the postgres role will be demoted to non-superuser to prevent abuse.
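The exact procedure is documented in that repository, but as a rough sketch each migration file is applied to the external database with psql while connected as a superuser; the connection string and file path below are placeholders:
# Apply a migration file to the external database (illustrative only)
psql "postgresql://postgres:your-password@your-db-host:5432/postgres" \
  -f path/to/migration.sql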
Setting the database's log_min_messages#
By default, docker compose sets the database's log_min_messages configuration to fatal to prevent redundant logs generated by Realtime. However, you might miss important log messages such as database errors. Configure log_min_messages based on your needs.
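For example, to raise the level so that warnings and errors are logged again, you can run the following against the database as a superuser (for instance via psql); the change takes effect after a configuration reload:
ALTER SYSTEM SET log_min_messages = 'warning';
SELECT pg_reload_conf();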
File storage backend on macOS#
By default, the Storage backend is set to file, which uses local files as the storage backend. To make it work on macOS, you need to choose VirtioFS as the Docker container file sharing implementation (in Docker Desktop -> Preferences -> General).
Deploying#
See the following guides to deploy the Docker Compose setup using your preferred tool and platform:
Next steps#
- Got a question? Ask here.
- Sign in: app.supabase.com