
AlphaForge – Core Component View (C3)

Source: Notion | Last edited: 2025-11-21 | ID: 2b22d2dc-3ef...


This view zooms into the core services inside AlphaForge / QuantOS:

  • DSL and Compiler Service
  • Orchestrator Service
  • Data and Feature Service
  • Execution Gateway
  • Experiment and Metric Store

You can turn each section into a toggle block in Notion if you want a progressive-disclosure layout.

DSL and Compiler Service – Components

DSL Parser

  • Parses strategy DSL specs (YAML / JSON / custom syntax).

  • Produces an initial AST with syntactic and basic semantic checks.

IR Builder

  • Converts AST into an intermediate representation (IR).

  • IR is DAG-friendly and explicit about:

    • Data dependencies
    • Execution steps

Canonicalizer
  • Normalizes IR into a canonical form:

    • Deterministic node ordering
    • Normalized parameter representation
    • Removal of no-ops and redundant nodes where possible

Fingerprinter
  • Computes stable identifiers for:

    • Entire strategies and experiments
    • Individual DAG nodes (data transforms, features, signals, portfolio steps)
  • Used for:

    • Run de-duplication
    • Caching
    • Lineage tracking

Vectorizer
  • Embeds canonical IR into a vector space.

  • Enables:

    • Similarity search
    • Clustering
    • Novelty scoring for strategies and experiments

Plugin Resolver
  • Resolves DSL references to concrete plugin implementations from capability packs.

  • Validates compatibility:

    • Data types
    • Frequencies
    • Venues
    • Required parameters
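The Canonicalizer and Fingerprinter stages described above can be sketched as follows. The IR shape (nested dicts with an `op` field and a `children` list) is an illustrative assumption, not the actual AlphaForge schema:

```python
import hashlib
import json

def canonicalize(ir_node: dict) -> dict:
    """Normalize an IR node: drop no-ops, recurse into children,
    and impose a deterministic child ordering."""
    children = [canonicalize(c) for c in ir_node.get("children", [])
                if c.get("op") != "noop"]
    # Deterministic ordering: sort children by their canonical JSON form.
    children.sort(key=lambda c: json.dumps(c, sort_keys=True))
    node = {k: v for k, v in ir_node.items() if k != "children"}
    if children:
        node["children"] = children
    return node

def fingerprint(ir_node: dict) -> str:
    """Stable identifier: SHA-256 over the canonical JSON serialization."""
    canonical = json.dumps(canonicalize(ir_node), sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()
```

Two structurally identical DAGs with reordered children (or extra no-ops) hash to the same identifier, which is what makes run de-duplication, caching, and lineage tracking possible downstream.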

DSL and Compiler – Internal Flow (Optional Diagram)


In Notion, add a Code block, set language to mermaid, and paste:

flowchart LR
DSL[DSL Spec]
PARSER[DSL Parser]
AST[AST]
IRB[IR Builder]
IR[Intermediate Representation]
CAN[Canonicalizer]
CIR[Canonical IR]
FP[Fingerprinter]
VE[Vectorizer]
PLUG[Plugin Resolver]
DSL --> PARSER --> AST --> IRB --> IR --> CAN --> CIR
CIR --> FP
CIR --> VE
CIR --> PLUG
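The Plugin Resolver's compatibility checks can be sketched as a simple capability match. The `PluginSpec` fields mirror the validation list above (data types, frequencies, venues, required parameters); the field names and registry shape are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class PluginSpec:
    name: str
    data_types: set
    frequencies: set
    venues: set
    required_params: set

def resolve(ref: str, params: dict, context: dict, registry: list) -> PluginSpec:
    """Find the plugin matching a DSL reference and validate compatibility."""
    for plugin in registry:
        if plugin.name != ref:
            continue
        missing = plugin.required_params - params.keys()
        if missing:
            raise ValueError(f"{ref}: missing required params {sorted(missing)}")
        if context["data_type"] not in plugin.data_types:
            raise ValueError(f"{ref}: unsupported data type {context['data_type']}")
        if context["frequency"] not in plugin.frequencies:
            raise ValueError(f"{ref}: unsupported frequency {context['frequency']}")
        if context["venue"] not in plugin.venues:
            raise ValueError(f"{ref}: unsupported venue {context['venue']}")
        return plugin
    raise LookupError(f"no plugin found for reference {ref!r}")
```

Failing fast here, at compile time, keeps incompatible strategy specs from ever reaching the Orchestrator.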

Orchestrator Service – Components

Run Scheduler

  • Accepts:

    • Backtest requests
    • Simulation runs
    • Paper-trading runs
    • Live-run requests
  • Allocates compute resources and schedules jobs.

DAG Executor

  • Executes compiled DAGs stage by stage:

    • Data loading
    • Feature and factor computation
    • Signal generation
    • Portfolio and risk logic
    • Metrics and reporting
  • Interacts with:

    • Data and Feature Service
    • Execution Gateway

Run State Manager
  • Tracks:

    • Job status and progress
    • Logs
    • Intermediate artifacts
  • Handles:

    • Retries
    • Cancellations
    • Failure handling logic

Event Bus and Queue Adapter
  • Connects to underlying messaging systems:

    • e.g. Kafka, RabbitMQ, Redis Streams
  • Carries:

    • Run events
    • Status updates
    • Signals between components

Policy Engine
  • Enforces platform-level policies:

    • Resource quotas per user or project
    • SLAs and priorities
    • Guardrails for risk and exposure in live runs
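A minimal sketch of the Policy Engine's quota enforcement, assuming per-project limits tracked in memory (a real deployment would presumably back this with shared, persistent state):

```python
from dataclasses import dataclass

@dataclass
class Quota:
    max_concurrent_runs: int
    max_cpu_cores: int

class PolicyEngine:
    def __init__(self, quotas: dict):
        self.quotas = quotas  # project -> Quota
        self.usage = {}       # project -> (active runs, cores in use)

    def admit(self, project: str, cores: int) -> bool:
        """Admit a run only if it keeps the project inside its quota."""
        quota = self.quotas[project]
        runs, used = self.usage.get(project, (0, 0))
        if runs + 1 > quota.max_concurrent_runs or used + cores > quota.max_cpu_cores:
            return False
        self.usage[project] = (runs + 1, used + cores)
        return True
```

The Run Scheduler would consult `admit` before allocating compute, deferring or rejecting jobs that would breach a project's quota.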

Data and Feature Service – Components

Ingestion Pipelines

  • Connectors for:

    • Exchanges
    • Data vendors
    • On-chain data sources
  • Normalize and validate incoming data into internal schemas.

  • Handle:

    • Late data
    • Corrections
    • Backfills

Storage Layout Manager
  • Manages how data is persisted:

    • ClickHouse tables for OHLCV, trades, factor panels
    • Object storage for raw, unstructured, and large artifacts
  • Designs:

    • Partitions
    • Sort keys
    • Indices for efficient queries

Query Planner
  • Translates high-level requests from DSL / IR into concrete queries.

  • Handles:

    • Universe selection
    • Time windows
    • Frequency alignment
    • Joins across datasets

Feature Generator
  • Runs feature and factor pipelines defined in DSL or capability packs.

  • Produces factor and feature panels used by:

    • Backtests
    • Live strategies

Caching Layer
  • Caches frequently used derived datasets and panels.

  • Integrates with the Fingerprinter to use structural identity as cache keys.
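Using the compiler's structural fingerprints as cache keys can be sketched like this; `compute_panel` and the node shape are hypothetical stand-ins for the real pipeline:

```python
import hashlib
import json

class PanelCache:
    """Cache derived panels keyed by the structural fingerprint of the
    node that produces them, so identical computations are never repeated."""

    def __init__(self):
        self._store = {}

    @staticmethod
    def fingerprint(node: dict) -> str:
        # Structural identity: hash of the canonical JSON serialization.
        blob = json.dumps(node, sort_keys=True)
        return hashlib.sha256(blob.encode()).hexdigest()

    def get_or_compute(self, node: dict, compute):
        key = self.fingerprint(node)
        if key not in self._store:
            self._store[key] = compute(node)
        return self._store[key]
```

Because the key is derived from structure rather than from run IDs, two different experiments that request the same feature panel share one cached result.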


Execution Gateway – Components

Order Router

  • Accepts abstract orders:

    • Side, size
    • Constraints
    • Venue preferences
  • Routes orders to the appropriate venue adapter or execution engine.

Venue Adapters

  • One adapter per venue or engine:

    • e.g. Binance, OKX, Hyperliquid, Nautilus
  • Translates AlphaForge orders into venue-specific API calls.

  • Maps responses back to a common internal format:

    • Fills
    • Errors
    • Rejections
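The response-mapping role of a venue adapter can be sketched as a translation into one internal report type. The venue payload below is a made-up, Binance-style example, not a real exchange schema:

```python
from dataclasses import dataclass
from enum import Enum

class FillStatus(Enum):
    FILLED = "filled"
    REJECTED = "rejected"
    ERROR = "error"

@dataclass
class ExecutionReport:
    order_id: str
    status: FillStatus
    filled_qty: float
    reason: str = ""

def normalize_response(payload: dict) -> ExecutionReport:
    """Map a hypothetical venue response into the common internal format;
    anything unrecognized is surfaced as an error rather than dropped."""
    status_map = {"FILLED": FillStatus.FILLED, "REJECTED": FillStatus.REJECTED}
    return ExecutionReport(
        order_id=str(payload["orderId"]),
        status=status_map.get(payload.get("status"), FillStatus.ERROR),
        filled_qty=float(payload.get("executedQty", 0.0)),
        reason=payload.get("msg", ""),
    )
```

Each venue adapter owns one such mapping, so everything upstream of the gateway only ever sees `ExecutionReport`.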

Risk and Pre-Trade Checks

  • Performs platform-level risk checks before sending orders:

    • Max notional per symbol or portfolio
    • Leverage and margin constraints
    • Kill switches and circuit breakers

State Synchronizer
  • Maintains a consistent view of:

    • Positions
    • Balances
    • Open orders
  • Feeds this state back to:

    • Orchestrator Service
    • Data and Feature Service
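The pre-trade checks listed above, fed by the State Synchronizer's view of positions and balances, can be sketched as a pure function over order, state, and limits (all field names here are illustrative assumptions):

```python
def pre_trade_check(order: dict, state: dict, limits: dict) -> list:
    """Return a list of violations; an empty list means the order may pass."""
    violations = []
    # Kill switch: blocks everything regardless of other limits.
    if limits.get("kill_switch"):
        violations.append("kill switch engaged")
    notional = order["qty"] * order["price"]
    # Per-symbol notional cap, including existing exposure.
    sym_limit = limits["max_notional_per_symbol"].get(order["symbol"], float("inf"))
    if notional + state["notional"].get(order["symbol"], 0.0) > sym_limit:
        violations.append(f"max notional exceeded for {order['symbol']}")
    # Leverage constraint against current account equity.
    if state["leverage"] + notional / state["equity"] > limits["max_leverage"]:
        violations.append("leverage limit exceeded")
    return violations
```

Returning all violations rather than failing on the first makes rejections easier to diagnose from run logs.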

Experiment and Metric Store – Components


Experiment Registry

  • Stores experiment definitions and metadata.

  • Links to:

    • DSL specs
    • Compiler fingerprints
    • Configuration details
  • Supports search by:

    • Tags
    • Owner
    • Strategy type
    • Other filters

Run Log Store
  • Persists detailed run logs and artifacts:

    • Equity curves
    • Drawdown series
    • Factor exposures and diagnostics
  • Exposes them via APIs for:

    • Dashboards
    • Notebooks

Metrics Engine
  • Computes standard metrics:

    • Sharpe, Sortino
    • Drawdowns
    • Hit rates
    • Turnover, etc.
  • Supports custom metrics via plugins.

Similarity and Novelty Engine

  • Uses vector embeddings from the Vectorizer (compiler) to:

    • Find similar strategies and experiments
    • Identify clusters and overlaps
    • Suggest under-explored regions of the strategy space
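Novelty scoring over the Vectorizer's embeddings can be sketched as one minus the maximum cosine similarity to any known strategy; the toy vectors stand in for real embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two non-zero vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def novelty(candidate, existing):
    """1.0 = unlike anything seen; 0.0 = identical to a known strategy."""
    if not existing:
        return 1.0
    return 1.0 - max(cosine(candidate, e) for e in existing)
```

Ranking candidate experiments by this score is one way to steer search toward the under-explored regions mentioned above.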

  • DSL and Compiler: Turn “what we want to try” into canonical, comparable, searchable plans.

  • Orchestrator: Turn those plans into executed runs, safely and reproducibly.

  • Data and Feature Service: Ensure every run sees correct, consistent, well-defined data.

  • Execution Gateway: Connect research logic to real markets through pluggable engines.

  • Experiment and Metric Store: Remember everything that has been tried and make it navigable for humans and agents.