
System Architecture

Thesis AI is a full-stack platform built on three primary layers: an API layer that handles requests and orchestration, an AI layer that runs multi-agent research, and a data layer that manages market data, user state, and caching.

High-Level Overview

```
┌─────────────────────────────────────────────────────┐
│                  Mobile App (iOS)                   │
│          Expo · React Native · TypeScript           │
└──────────────────────┬──────────────────────────────┘
                       │ HTTPS / SSE streaming
┌──────────────────────▼──────────────────────────────┐
│                      API Layer                      │
│                FastAPI · Python 3.11                │
│   /v1/ai/chat · /v1/market-data · /v1/portfolios    │
└─────────────┬───────────────────────────────────────┘
              │
┌─────────────▼───────────────────────────────────────┐
│                      AI Layer                       │
│         Investment Manager (Orchestrator)           │
│                                                     │
│  ┌─────────┐ ┌──────────┐ ┌──────┐ ┌───────┐        │
│  │  Macro  │ │ Fundmtl  │ │ News │ │ Price │        │
│  │  Agent  │ │  Agent   │ │Agent │ │ Agent │        │
│  └─────────┘ └──────────┘ └──────┘ └───────┘        │
│                                                     │
│            Evidence Layer · Synthesis               │
└──────────────────────┬──────────────────────────────┘
                       │
┌──────────────────────▼──────────────────────────────┐
│                     Data Layer                      │
│   PostgreSQL (pgvector) · Redis · Celery Workers    │
└────────────────┬────────────────────────────────────┘
                 │
┌────────────────▼────────────────────────────────────┐
│               External Integrations                 │
│        Massive API · FRED API · Anthropic API       │
└─────────────────────────────────────────────────────┘
```

API Layer

The backend is a FastAPI application running on Python 3.11. It exposes a versioned REST API at /v1/ with streaming support via Server-Sent Events (SSE) for real-time AI output.

Key Endpoints

| Endpoint | Description |
| --- | --- |
| `POST /v1/ai/chat` | Submit a research query. Streams the AI response via SSE. |
| `GET /v1/market-data/:symbol` | Fetch real-time quote, OHLC bars, and fundamentals for a symbol. |
| `GET /v1/macro/snapshot` | Returns the current macro snapshot: Fed rate, CPI, unemployment, yields. |
| `GET /v1/dashboard` | Aggregated portfolio and watchlist context for the authenticated user. |
| `GET /v1/insights` | Cached insight cards for the user's holdings and watchlist. |
| `GET /v1/news` | Recent headlines filtered to the user's watchlist and portfolio symbols. |
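A client consumes the `/v1/ai/chat` stream by reading SSE frames as they arrive. The sketch below shows a minimal parser for the `data:` framing; the JSON payload shape (`token` / `done` events) is illustrative, not the documented wire format:

```python
import json
from typing import Iterator


def parse_sse(lines: Iterator[str]) -> Iterator[dict]:
    """Parse Server-Sent Events `data:` lines into JSON payloads.

    Assumes each event carries one JSON object in its `data:` field,
    terminated by a blank line, per the SSE framing rules.
    """
    buffer = []
    for line in lines:
        line = line.rstrip("\n")
        if line.startswith("data:"):
            buffer.append(line[len("data:"):].strip())
        elif line == "" and buffer:
            yield json.loads("\n".join(buffer))
            buffer = []


# Example: two streamed token events followed by a done event
# (a hypothetical payload shape for illustration only).
raw = [
    'data: {"type": "token", "text": "AAPL looks"}\n',
    '\n',
    'data: {"type": "token", "text": " fairly valued."}\n',
    '\n',
    'data: {"type": "done"}\n',
    '\n',
]
events = list(parse_sse(iter(raw)))
```

In practice the line iterator would come from a streaming HTTP client (e.g. `httpx`'s `iter_lines()`) pointed at the endpoint.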

AI Layer

The AI layer is the core of Thesis. When a research query arrives, an Investment Manager orchestrator selects the appropriate specialist agents, runs them (in parallel where possible), aggregates their outputs into an evidence layer, and synthesizes a final thesis.
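The fan-out/fan-in flow can be sketched with `asyncio.gather`. The agent functions and evidence shape below are hypothetical stand-ins, assuming the real agents call market-data and news tools before answering:

```python
import asyncio


# Hypothetical stand-ins for two specialist agents.
async def macro_agent(query: str) -> dict:
    return {"agent": "macro", "finding": f"rates context for {query}"}


async def news_agent(query: str) -> dict:
    return {"agent": "news", "finding": f"headlines for {query}"}


async def run_research(query: str) -> dict:
    # Fan out to the selected agents concurrently, then collect
    # their outputs into a single evidence layer for synthesis.
    results = await asyncio.gather(macro_agent(query), news_agent(query))
    evidence = {r["agent"]: r["finding"] for r in results}
    return {"query": query, "evidence": evidence}


report = asyncio.run(run_research("NVDA"))
```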

The LLM backend uses Anthropic Claude (claude-sonnet-4-6) by default. For local development, Ollama serves as a drop-in fallback, enabling offline iteration without API costs.

LLM Routing

If ANTHROPIC_API_KEY is set, the system uses Claude. Otherwise it falls back to Ollama at http://localhost:11434. The model, context window, and prompt structure are identical in both modes.
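The routing rule reduces to a single environment check. A minimal sketch, assuming illustrative config keys (the service's actual settings object will differ):

```python
import os

OLLAMA_URL = "http://localhost:11434"


def select_llm_backend(env=os.environ) -> dict:
    """Pick Claude when an Anthropic key is present, else local Ollama."""
    if env.get("ANTHROPIC_API_KEY"):
        return {"provider": "anthropic", "model": "claude-sonnet-4-6"}
    return {"provider": "ollama", "base_url": OLLAMA_URL}


# With a key set, requests route to Claude; without one, to Ollama.
with_key = select_llm_backend({"ANTHROPIC_API_KEY": "sk-test"})
without_key = select_llm_backend({})
```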

Data Layer

PostgreSQL with pgvector

The primary database is PostgreSQL 16 with the pgvector extension for semantic vector embeddings. It stores user accounts, watchlists, portfolio holdings, and generated insight cards with source citations.
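A semantic lookup against stored embeddings uses pgvector's distance operators. The table and column names below are hypothetical, shown only to illustrate the query shape:

```sql
-- Hypothetical schema: insight_cards(id, symbol, body, embedding vector(1536)).
-- Fetch the five stored insights closest to a query embedding,
-- using pgvector's cosine-distance operator (<=>).
SELECT id, symbol, body
FROM insight_cards
ORDER BY embedding <=> :query_embedding
LIMIT 5;
```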

Redis

Redis serves two purposes: an in-memory cache for market data responses (reducing vendor API calls) and the message broker for Celery background task workers.
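The market-data cache follows the cache-aside pattern: check Redis first, and only hit the vendor on a miss. A sketch with an injected client and fetcher (the key format and 30-second TTL are illustrative, not the service's actual values):

```python
import json


def get_quote_cached(symbol: str, cache, fetch, ttl: int = 30) -> dict:
    """Cache-aside lookup for market-data responses.

    `cache` is any object with get/setex (redis.Redis fits);
    `fetch` hits the upstream vendor on a cache miss.
    """
    key = f"quote:{symbol}"
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)
    quote = fetch(symbol)
    cache.setex(key, ttl, json.dumps(quote))  # expire after `ttl` seconds
    return quote


# A tiny in-memory stand-in for Redis, enough for get/setex:
class FakeRedis:
    def __init__(self):
        self.store = {}

    def get(self, key):
        return self.store.get(key)

    def setex(self, key, ttl, value):
        self.store[key] = value


cache = FakeRedis()
calls = []


def fetch(symbol):
    calls.append(symbol)
    return {"symbol": symbol, "price": 123.45}


first = get_quote_cached("AAPL", cache, fetch)
second = get_quote_cached("AAPL", cache, fetch)  # served from cache
```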

Celery Workers

Background jobs — including scheduled macro snapshot refreshes and async insight generation — are managed by Celery with a Redis broker and a beat scheduler for periodic tasks.
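The periodic refresh is wired through Celery beat. A configuration sketch, assuming an illustrative broker URL, task path, and interval (none of these are confirmed by the source):

```python
from celery import Celery

# Illustrative broker URL and task name; the real module paths differ.
app = Celery("thesis", broker="redis://localhost:6379/0")

app.conf.beat_schedule = {
    # Refresh the macro snapshot (Fed rate, CPI, yields) every hour.
    "refresh-macro-snapshot": {
        "task": "tasks.refresh_macro_snapshot",
        "schedule": 3600.0,  # seconds
    },
}
```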

Tech Stack Summary

| Layer | Technology |
| --- | --- |
| Mobile App | Expo · React Native 0.81 · TypeScript · Expo Router |
| Web / Marketing | Next.js 14 · React 19 · TypeScript |
| API | FastAPI · Uvicorn · Pydantic 2 · SQLAlchemy 2 (async) |
| AI / LLM | Anthropic Claude (primary) · Ollama (local fallback) |
| Database | PostgreSQL 16 + pgvector |
| Cache / Queue | Redis · Celery |
| Market Data | Massive API · FRED API |
| Infrastructure | Docker · docker-compose · Terraform |
| Monorepo | pnpm workspaces |