## Prerequisites
- Docker and Docker Compose
- An OpenAI API key (or compatible LLM provider)
## Quick start

The stack runs three services:
- PostgreSQL 16 on port 5432
- Redis 7 on port 6379
- Mengram API on port 8420
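With Docker and Compose installed (see prerequisites), starting the stack is the standard Compose invocation, run from the directory containing `docker-compose.yml`:

```shell
# Start all three services in the background
docker compose up -d

# Follow startup logs (the service name "api" is an assumption;
# check docker-compose.yml for the actual service names)
docker compose logs -f api
```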
## Verify it’s running
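The exact route isn't shown in this section; assuming a conventional health endpoint on the API port, a quick check might be:

```shell
# Expect an HTTP 200 response if the API is up (the /health path is an assumption)
curl -s http://localhost:8420/health
```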
## Configuration
Environment variables in `docker-compose.yml`:
| Variable | Default | Description |
|---|---|---|
| `DATABASE_URL` | `postgresql://mengram:mengram@postgres:5432/mengram` | PostgreSQL connection string |
| `REDIS_URL` | `redis://redis:6379` | Redis connection string |
| `OPENAI_API_KEY` | (required) | OpenAI API key for embeddings and LLM |
| `LLM_PROVIDER` | `openai` | LLM provider (`openai`, `anthropic`, `ollama`) |
| `LLM_MODEL` | `gpt-4o-mini` | Model for memory extraction |
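To override defaults without editing the compose file, Docker Compose reads a `.env` file from the project directory for variable substitution. This sketch assumes `docker-compose.yml` interpolates these variables:

```shell
# .env — picked up automatically by Docker Compose
OPENAI_API_KEY=sk-your-key-here
LLM_PROVIDER=openai
LLM_MODEL=gpt-4o-mini
```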
## Use with Ollama (fully local)
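Assuming `docker-compose.yml` reads the variables from the table above, pointing Mengram at a local Ollama server might look like this (the model name and placeholder API key are illustrative):

```shell
# Ollama listens on the host at port 11434 by default
export LLM_PROVIDER=ollama
export LLM_MODEL=llama3.1        # any model you've pulled with `ollama pull`
export OPENAI_API_KEY=unused     # placeholder; likely not needed with Ollama
docker compose up -d
```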
Run Mengram with a local LLM, with no external API calls.

## Use with LM Studio
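LM Studio serves an OpenAI-compatible API on port 1234 by default. This document doesn't show a base-URL setting, so `OPENAI_BASE_URL` below is a hypothetical variable name; adapt it to whatever knob Mengram actually reads:

```shell
# LM Studio's local server is OpenAI-compatible (default port 1234).
# OPENAI_BASE_URL is hypothetical — check Mengram's config for the real
# setting that points the openai provider at a custom endpoint.
export LLM_PROVIDER=openai
export OPENAI_BASE_URL=http://host.docker.internal:1234/v1
export OPENAI_API_KEY=lm-studio   # LM Studio ignores the key; any value works
docker compose up -d
```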
## Data persistence
PostgreSQL data is stored in a Docker volume (pgdata). Your memories persist across container restarts.
To back up:
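Assuming the Compose service is named `postgres` and the database and user are both `mengram` (as in the defaults above), a backup sketch:

```shell
# Dump the mengram database from the running postgres container
docker compose exec postgres pg_dump -U mengram mengram > mengram-backup.sql

# Restore later with:
# docker compose exec -T postgres psql -U mengram mengram < mengram-backup.sql
```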
## Production deployment
For production, consider:

- Use a managed PostgreSQL (e.g., Supabase, Neon, RDS)
- Use a managed Redis (e.g., Upstash, ElastiCache)
- Set strong database passwords
- Put behind a reverse proxy with TLS (nginx, Caddy)
- Set `GUNICORN_WORKERS` for concurrency
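As an illustration of the last point (assuming the API image launches Gunicorn and reads `GUNICORN_WORKERS` at startup), a common sizing heuristic is (2 × CPU cores) + 1:

```shell
# Size workers to the host: (2 × CPU cores) + 1 is a common Gunicorn heuristic
export GUNICORN_WORKERS=$((2 * $(nproc) + 1))
docker compose up -d
```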