Infrastructure Requirements

This document outlines the infrastructure requirements for deploying and running the Zynomi platform.

Deployment Architecture


Architecture Philosophy

Zynomi is built on a simple, modern technology stack. While there are multiple components, each serves a specific purpose in delivering a comprehensive clinical trial platform.

| Principle | Description |
| --- | --- |
| Loosely Coupled | Components are independent and can be updated separately |
| Plugin-Based | Features can be added without modifying core code |
| Horizontal Scaling | New capabilities added alongside existing ones |
| Microfrontend Ready | AI Chatbot installs as standalone widget |

Competitive Advantage

Our strong architectural foundation enables rapid feature development and easy scaling. The platform is designed for extensibility without compromising stability.


Deployment Options

Zynomi supports two deployment models based on data residency and compliance requirements.

| Option | Description | Best For |
| --- | --- | --- |
| Serverless (Recommended) | Fully managed cloud services | Most deployments, rapid scaling |
| On-Premises | Self-hosted infrastructure | Strict data residency requirements |

Recommendation

We highly recommend serverless deployment for simplified operations, automatic scaling, and reduced maintenance overhead. On-premises is available for organizations where data must not leave their infrastructure.


Cloud Infrastructure

Core Services

| Service | Provider | Purpose | Notes |
| --- | --- | --- | --- |
| Web App and Mobile Website | Vercel | Next.js hosting, edge functions, REST APIs | Serverless, auto-scaling |
| Healthcare Backend | Frappe Cloud | EHR, patient data, API backend | Managed by ERPNext, subscription-based |
| Database | Supabase | PostgreSQL, authentication, real-time | Included with Vercel integration |
| Cache Store | Upstash / Redis | Cube.dev cache, session management | Included with Vercel; in-memory for small deployments |
| Push Notifications | Firebase (Google Cloud) | Mobile push notifications | Free tier available |
| Source Control | GitHub | Code repository, CI/CD | Free tier available |

Analytics Infrastructure

| Service | Provider | Purpose | Notes |
| --- | --- | --- | --- |
| Data Ingestion | dlthub (Python) | OLTP to lakehouse pipeline | Python-based ETL |
| Data Transformation | dbt | Medallion architecture, data marts | OSS Core edition; Cloud available for scaling |
| Semantic Layer | Cube.dev | Governed metrics, REST/SQL APIs | OSS edition on Fly.io; Cloud available for enterprise |
| Data Lake | PostgreSQL (Supabase) | Default for small deployments | Can swap to Snowflake, Iceberg, Databricks |
| Container Hosting | Fly.io | Serverless containers for dbt and Cube | Pay-as-you-go |
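
To illustrate how the OSS analytics pieces fit together locally, the sketch below defines a helper that runs the Cube.dev OSS Docker image against a Supabase Postgres database. The `CUBEJS_*` environment variables are Cube's standard configuration options; the hostname, database name, and the `SUPABASE_DB_PASSWORD` variable are placeholders for your project's own settings.

```shell
# Sketch: run the Cube.dev OSS semantic layer locally in Docker,
# pointed at a Supabase Postgres database. All connection values
# below are placeholders -- substitute your own project's settings.
CUBE_IMAGE="cubejs/cube:latest"

run_cube() {
  docker run -d --name cube -p 4000:4000 \
    -e CUBEJS_DEV_MODE=true \
    -e CUBEJS_DB_TYPE=postgres \
    -e CUBEJS_DB_HOST="db.example.supabase.co" \
    -e CUBEJS_DB_NAME="postgres" \
    -e CUBEJS_DB_USER="postgres" \
    -e CUBEJS_DB_PASS="${SUPABASE_DB_PASSWORD:-}" \
    "$CUBE_IMAGE"
}

echo "run_cube defined; set SUPABASE_DB_PASSWORD and call run_cube to start"
```

Port 4000 matches the Cube.dev playground port listed under Network Requirements.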

AI and Integration Services

| Service | Provider | Purpose | Notes |
| --- | --- | --- | --- |
| API Gateway | KrakenD | Unified API endpoint, rate limiting | Self-hosted or Fly.io |
| MCP Server | Custom Built | Semantic Layer as MCP tools | Adheres to Cube.dev API/SDK standards |
| AI Agent / Chatbot | Custom Built | Agentic analytics interface | Not Cube.dev Cloud (enterprise-only) |
| LLM Provider | OpenAI (Default) | GPT-4o mini for tool selection | Any commercial or OSS LLM supported |

LLM Requirements

An LLM API subscription is required for the AI chatbot. The default model is GPT-4o mini; stronger models improve tool-selection accuracy. Hallucination risk is minimized because responses are grounded in data returned by MCP tool calls.
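
As a sketch of what the default provider integration involves, the snippet below builds a chat-completion request for OpenAI's public `/v1/chat/completions` endpoint with the `gpt-4o-mini` model. The prompt text and the `LLM_LIVE_CALL` guard variable are illustrative; no request is sent unless both the API key and the guard are explicitly set.

```shell
# Minimal sketch of a chat-completion request to the default LLM provider.
# The endpoint and model name are OpenAI's public API; the prompt contents
# and the LLM_LIVE_CALL guard are illustrative only.
PAYLOAD='{
  "model": "gpt-4o-mini",
  "messages": [
    {"role": "system", "content": "Select the right MCP tool for the question."},
    {"role": "user", "content": "How many patients enrolled this month?"}
  ]
}'

if [ -n "${OPENAI_API_KEY:-}" ] && [ "${LLM_LIVE_CALL:-0}" = "1" ]; then
  curl -sS https://api.openai.com/v1/chat/completions \
    -H "Authorization: Bearer $OPENAI_API_KEY" \
    -H "Content-Type: application/json" \
    -d "$PAYLOAD"
else
  echo "live call disabled; set OPENAI_API_KEY and LLM_LIVE_CALL=1 to send"
fi
```

Swapping providers means changing only the endpoint, model name, and auth header, since any commercial or OSS LLM is supported.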


Mobile Application

| Platform | Technology | Description |
| --- | --- | --- |
| iOS | Native Shell | WebView wrapper loading Ionic mobile website |
| Android | Native Shell | WebView wrapper loading Ionic mobile website |
| Mobile Website | Ionic + Vue.js | Progressive web app hosted on Vercel |

Security and Compliance

All services are configured with enterprise-grade security controls.

| Capability | Description |
| --- | --- |
| Audit Trail | Complete activity logging across all services |
| Data at Rest | Encrypted storage for all databases and files |
| Data in Transit | TLS 1.3 for all communications |
| HIPAA | Healthcare data protection compliance |
| GDPR | Data privacy and protection compliance |
| SOC 2 | Service organization controls |

Development Environment

Hardware Requirements

| Specification | Minimum | Recommended |
| --- | --- | --- |
| RAM | 8 GB | 16 GB |
| CPU | 4 cores | 8 cores |
| Storage | 50 GB SSD | 100 GB SSD |

Technology Stack

The platform is built with modern, widely-adopted technologies.

| Category | Technologies | Notes |
| --- | --- | --- |
| Primary Languages | TypeScript, JavaScript | 70% of codebase |
| Backend Languages | Python, SQL | dbt, data ingestion, analytics |
| Scripting | Shell/Bash | Automation and deployment |
| Frontend Framework | React, Next.js 14 | Web application |
| Mobile Framework | React Native | Mobile application |
| Styling | Tailwind CSS, shadcn/ui | Component library |
| Containerization | Docker | Development and deployment |

Software Requirements

| Software | Version | Required | Notes |
| --- | --- | --- | --- |
| Node.js | 20.x LTS | Yes | JavaScript runtime |
| Bun | Latest | Yes | Package manager (preferred over npm) |
| Python | 3.11+ | Yes | dbt, data ingestion |
| Git | 2.x | Yes | Version control |
| Docker | 20.x+ | Yes | Container runtime |
| VS Code | Latest | Recommended | IDE with extensions |
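
A quick preflight for the required toolchain can be scripted. This sketch checks that each tool from the table is on `PATH` using the standard `--version` flag; the binary names `node`, `bun`, `python3`, `git`, and `docker` are the usual ones but may differ on some systems.

```shell
# Preflight check: report which required development tools are installed.
# Prints "<tool>: <version>" when found, or "<tool>: MISSING" otherwise.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: $("$1" --version 2>/dev/null | head -n 1)"
  else
    echo "$1: MISSING"
  fi
}

for tool in node bun python3 git docker; do
  check_tool "$tool"
done
```

Running this before setup quickly surfaces any missing prerequisites without failing partway through an install.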

Supported Operating Systems

| OS | Version |
| --- | --- |
| macOS | 12 (Monterey) or later |
| Ubuntu | 20.04 LTS or later |
| Windows | 10/11 with WSL2 |

Network Requirements

Ports

| Port | Service | Direction |
| --- | --- | --- |
| 3000 | Next.js dev server | Inbound |
| 4000 | Cube.dev playground | Inbound |
| 5432 | PostgreSQL | Outbound |
| 443 | HTTPS APIs | Outbound |
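
To verify which local dev ports are in use, a bash-only probe (no netcat required) can be used. Note that `/dev/tcp` is a bash feature, so this sketch will not work under plain `sh`.

```shell
# Check whether a TCP port accepts connections, using bash's /dev/tcp.
# Returns 0 (success) when the connection succeeds, nonzero otherwise.
port_open() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

# Probe the local dev-server ports from the table above:
for port in 3000 4000; do
  if port_open 127.0.0.1 "$port"; then
    echo "port $port: open"
  else
    echo "port $port: closed"
  fi
done
```

The same helper works for the outbound checks, e.g. `port_open db.example.supabase.co 5432` against your own database host.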

Firewall Allowlist

| Domain Pattern | Service |
| --- | --- |
| *.vercel.app | Web hosting |
| *.supabase.co | Database |
| *.frappe.cloud | Backend |
| *.fly.dev | Container hosting |
| *.googleapis.com | Firebase |
| github.com | Source control |
| api.openai.com | LLM (default) |

Scaling Options

| Component | Default | Scaled Option |
| --- | --- | --- |
| dbt | OSS Core on Fly.io | dbt Cloud |
| Cube.dev | OSS on Fly.io | Cube Cloud |
| Cache | Upstash / In-memory | Dedicated Redis |
| Database | Supabase (PostgreSQL) | Dedicated PostgreSQL |
| Data Lake | PostgreSQL | Snowflake, Iceberg, Databricks |

Microfrontend Architecture

The AI Chatbot is built as a microfrontend widget that can be installed independently.

| Feature | Description |
| --- | --- |
| Standalone Widget | Deploys separately from host application |
| No Code Changes | Integrates without modifying main codebase |
| Independent Updates | Can be versioned and updated separately |
| Embeddable | Works in any web application |

Roadmap

| Feature | Status |
| --- | --- |
| Internationalization (i18n) | Planned |
| WCAG Accessibility | Planned |