Installation

This guide walks through cloning the deployment repository, configuring environment variables, and setting up external services before launching the platform.


Step 1: Clone the Repository

git clone https://github.com/zynomilabs/ctms.devops.git && cd ctms.devops

Step 2: Create Environment File

cp .env.example .env.production

Create a convenience symlink:

ln -sf .env.production .env
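As a quick sanity check (not part of the official steps), you can confirm the link resolves before any compose command reads it:

```shell
# Confirm .env resolves to .env.production before docker compose reads it.
target=$(readlink .env 2>/dev/null || true)
if [ "$target" = ".env.production" ]; then
  echo ".env -> .env.production (ok)"
else
  echo "warning: .env does not point at .env.production" >&2
fi
```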

Step 3: Configure External Services

3.1 Supabase (Authentication)

Self-Hosted Alternative

If you prefer to run Supabase on-premises instead of using Supabase Cloud, see Self-Hosted Vendor Stacks. The ctms-supabase-seed container handles all table creation and migration automatically for both cloud and self-hosted deployments.

  1. Create a project at supabase.com.
  2. Copy the Project URL and Anon key from Settings → API.
  3. Run this migration in the SQL Editor:
CREATE OR REPLACE FUNCTION public.handle_new_user()
RETURNS trigger LANGUAGE plpgsql SECURITY DEFINER AS $$
BEGIN
INSERT INTO public.profiles (id, email) VALUES (NEW.id, NEW.email)
ON CONFLICT (id) DO NOTHING;
RETURN NEW;
END;
$$;

DROP TRIGGER IF EXISTS on_auth_user_created ON auth.users;
CREATE TRIGGER on_auth_user_created
AFTER INSERT ON auth.users FOR EACH ROW
EXECUTE FUNCTION public.handle_new_user();
  4. Create your first user in the Authentication dashboard.
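Once the keys are in your environment, a quick smoke test confirms they are wired correctly. This is a hedged sketch: `/auth/v1/health` is the standard GoTrue health route exposed by Supabase Auth, and the variable names match the sample environment file below.

```shell
# Smoke-test the Supabase project credentials.
if [ -z "${SUPABASE_URL:-}" ] || [ -z "${SUPABASE_ANON_KEY:-}" ]; then
  echo "Set SUPABASE_URL and SUPABASE_ANON_KEY first (e.g.: set -a; . ./.env.production; set +a)"
else
  curl -fsS "${SUPABASE_URL}/auth/v1/health" -H "apikey: ${SUPABASE_ANON_KEY}" \
    && echo "Supabase auth reachable"
fi
```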

3.2 Frappe Cloud (Clinical Backend)

Self-Hosted Alternative

If you prefer to run Frappe on-premises instead of using Frappe Cloud, see Self-Hosted Vendor Stacks. The setup service automates the wizard, admin user creation, and API token generation.

  1. Create a site at frappecloud.com (e.g. mysite.frappe.cloud).
  2. Go to Settings → API Access and create an API token.
  3. Note the site URL as FRAPPE_URL and the token as FRAPPE_API_TOKEN (token key:secret).
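You can verify the token before wiring it into the stack. Frappe expects the full `token api_key:api_secret` string in the `Authorization` header, and the whitelisted method `frappe.auth.get_logged_user` returns the user the token maps to; the variable names below assume the sample environment file in Step 4.

```shell
# Confirm the Frappe API token authenticates against the site.
if [ -z "${FRAPPE_URL:-}" ] || [ -z "${FRAPPE_API_TOKEN:-}" ]; then
  echo "Set FRAPPE_URL and FRAPPE_API_TOKEN first."
else
  # FRAPPE_API_TOKEN already includes the leading "token " prefix.
  curl -fsS "${FRAPPE_URL}/api/method/frappe.auth.get_logged_user" \
    -H "Authorization: ${FRAPPE_API_TOKEN}"
fi
```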

3.3 Analytics Database

Cube.js and the ODM API require a PostgreSQL database populated with gold-layer data.

Option             Description
A (Recommended)    Use the local lakehouse-db Docker container — set CUBEJS_DB_HOST=lakehouse-db
B                  Use managed PostgreSQL (Neon, RDS) — set CUBEJS_DB_HOST to your remote host
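Either way, a one-row query is enough to prove connectivity. A minimal sketch, assuming the `CUBEJS_DB_*` variables from Step 4: for Option A run it from a container on the compose network, for Option B point it at your managed instance.

```shell
# Connectivity check for the analytics database.
if [ -z "${CUBEJS_DB_PASS:-}" ]; then
  echo "Set the CUBEJS_DB_* variables first."
else
  PGPASSWORD="$CUBEJS_DB_PASS" psql \
    -h "${CUBEJS_DB_HOST:-lakehouse-db}" -p "${CUBEJS_DB_PORT:-5432}" \
    -U "${CUBEJS_DB_USER:-ctms_user}" -d "${CUBEJS_DB_NAME:-ctms_dlh}" \
    -c 'SELECT 1;'
fi
```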

3.4 OpenAI (AI Features)

  1. Get an API key from platform.openai.com.
  2. Set OPENAI_API_KEY in your environment file.
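A cheap way to verify the key before first launch is an authenticated call to the models list endpoint:

```shell
# Verify the OpenAI key; /v1/models is a lightweight authenticated request.
if [ -z "${OPENAI_API_KEY:-}" ]; then
  echo "Set OPENAI_API_KEY first."
else
  curl -fsS https://api.openai.com/v1/models \
    -H "Authorization: Bearer ${OPENAI_API_KEY}" >/dev/null \
    && echo "OpenAI key accepted"
fi
```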

3.5 Firebase (Optional — Push Notifications)

  1. Create a project at Firebase Console.
  2. Enable Cloud Messaging.
  3. Download the service account key.

Step 4: Edit Environment Variables

Edit .env.production with your values:

# ─── Global ───────────────────────────────────────────────
APP_NAME=zynomi-life-sciences
NODE_ENV=production
DOMAIN=localhost

# ─── Supabase (Auth) ─────────────────────────────────────
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_KEY=your-anon-key
SUPABASE_ANON_KEY=your-anon-key

# ─── Frappe (Clinical Backend) ───────────────────────────
FRAPPE_URL=https://mysite.frappe.cloud
FRAPPE_API_TOKEN="token api_key:api_secret"
FRAPPE_CLOUD_IMAGE_BASE_URL=https://mysite.frappe.cloud

# ─── Analytics DB (local lakehouse) ──────────────────────
CUBEJS_DB_HOST=lakehouse-db
CUBEJS_DB_PORT=5432
CUBEJS_DB_NAME=ctms_dlh
CUBEJS_DB_USER=ctms_user
CUBEJS_DB_PASS=your-db-password
CUBEJS_DB_SSL=false
CUBEJS_DEV_MODE=true
CUBEJS_API_SECRET=change-me-in-production

# ─── AI / MCP ────────────────────────────────────────────
OPENAI_API_KEY=your-openai-api-key
OPENAI_MODEL=gpt-4o-mini

# ─── Data Pipeline ───────────────────────────────────────
TARGET_DB_USER=ctms_user
TARGET_DB_PASSWORD=your-db-password
TARGET_DB_NAME=ctms_dlh
LAKEHOUSE_INSTANCE=your-instance-name
LAKEHOUSE_DB_PORT=5433
INGESTER_IMAGE=zynomi/ctms-ingester:latest
DBT_IMAGE=zynomi/ctms-dbt:latest

# ─── Observability (optional) ────────────────────────────
OPENOBSERVE_ROOT_EMAIL=admin@your-domain.com
OPENOBSERVE_ROOT_PASSWORD=change-me

# ─── EC2 Production (IP-based access) ────────────────────
EC2_PUBLIC_IP=your-ec2-public-ip

For a complete variable reference, see Environment Variables.
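Before moving on, it is worth scanning the file for values still left at their sample placeholders. This is a sketch, not an official tool: the placeholder list mirrors the sample above and is an assumption; extend it if your .env.example uses different markers.

```shell
# Flag values still left at the sample placeholders in the env file.
ENV_FILE="${1:-.env.production}"
PLACEHOLDERS='your-anon-key|your-db-password|your-openai-api-key|your-instance-name|your-ec2-public-ip|change-me'
if [ ! -f "$ENV_FILE" ]; then
  echo "missing $ENV_FILE" >&2
elif grep -En "$PLACEHOLDERS" "$ENV_FILE"; then
  echo "Unconfigured placeholders found above; edit $ENV_FILE before deploying." >&2
else
  echo "No obvious placeholders left in $ENV_FILE."
fi
```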


Step 5: Build the API Gateway

The KrakenD API Gateway is built locally from the ctms-api-gateway/ directory. All other services use pre-built Docker Hub images.

docker compose --env-file .env.production build api-gateway

Tip: Rebuild whenever the KrakenD configuration changes:

docker compose --env-file .env.production build api-gateway && \
docker compose --env-file .env.production up -d api-gateway
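After a rebuild you can confirm the container came back on the new image. A hedged sketch: `/__health` is KrakenD's built-in health route, but the published port depends on your compose file (8080 here is an assumption).

```shell
# Check the rebuilt gateway: container status plus KrakenD's health endpoint.
if [ ! -f .env.production ]; then
  echo "Run this from the ctms.devops checkout (needs .env.production)."
else
  docker compose --env-file .env.production ps api-gateway
  curl -fsS http://localhost:8080/__health   # port is an assumption
fi
```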

Installation Checklist

Step   Action                                        Status
1      Clone the ctms.devops repository
2      Copy .env.example to .env.production
3a     Set up Supabase project + run migration SQL
3a′    (Alternative) Self-host Supabase
3b     Set up Frappe Cloud + get API token
3b′    (Alternative) Self-host Frappe
3c     Configure analytics database option
3d     Get OpenAI API key
4      Edit .env.production with all values
5      Build the KrakenD API Gateway image

Next Steps

  1. Docker Deployment — Launch the platform with Docker Compose (includes EC2 deploy scripts and data pipeline setup)
  2. Initial Setup & Configuration — Provision Frappe with DocTypes, RBAC, and seed data