Ninety-one intelligence plugins process every bar, tier by tier, from raw indicators to regime models to Smart Money Concepts to AI narrative. Every signal is gated by six-bucket evidence convergence. Every outcome is recorded and fed back. The system doesn't need to be told what works. It learns from what it produces.
Layer 01
Data Foundation
Ingestion · Bar Building · Stream Distribution
< 10ms pipeline latency · feed-provider bound, not processing bound
Layer 02
Mathematical Intelligence
45 plugins
RSI 67.4 · MACD bullish crossover · GARCH: vol elevated · regime: trend
Layer 03
Pattern Intelligence
46 plugins + CIS aggregator
BOS confirmed · unfilled FVG 5235–5238 · CIS +0.71 · CHoCHReversal fired
Layer 04
AI Intelligence
3-tier LLM chain
"Bullish 5m ES setup: trend + SMC confluence. FVG entry 5236, target 5258, stop 5229."
Platform
The Bus is the Contract
IntelligenceEvent
Every tick, every signal, every AI narrative flows through one shared, durable, replayable event bus. No service calls another directly. Producers publish. Consumers subscribe. A new downstream system (execution engine, ML scorer, alert bot) subscribes to the existing stream with zero changes to the producers.
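The pub/sub decoupling described above can be sketched in a few lines. This is a minimal in-memory stand-in, not the real durable bus; the append-only log illustrates how a late consumer can replay history with zero producer changes. Class and method names here are hypothetical.

```python
from collections import defaultdict
from typing import Callable


class EventBus:
    """Minimal sketch of the bus contract: producers publish,
    consumers subscribe, no service calls another directly.
    A real implementation would be a durable, log-backed stream."""

    def __init__(self):
        self._subscribers = defaultdict(list)
        self._log = []  # append-only log makes the stream replayable

    def publish(self, topic: str, event: dict) -> None:
        self._log.append((topic, event))
        for handler in self._subscribers[topic]:
            handler(event)

    def subscribe(self, topic: str, handler: Callable, replay: bool = False) -> None:
        # A new downstream consumer can replay prior events without
        # any change to the producers that emitted them.
        if replay:
            for t, e in self._log:
                if t == topic:
                    handler(e)
        self._subscribers[topic].append(handler)


bus = EventBus()
bus.publish("signals", {"symbol": "ES", "score": 0.71})

seen = []
bus.subscribe("signals", seen.append, replay=True)  # late subscriber still sees it
```

A new execution engine, ML scorer, or alert bot is just another `subscribe` call against the same topic.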
Signal Lifecycle Tracking
8-class outcome
Every signal is tracked from fire to resolution: activation at entry zone, MAE/MFE per bar, and an 8-class outcome (including stopped at entry, stopped in trade, T1/T2/full target, and TTL expired). Shadow signals under regime suppression are tracked identically, building a labeled dataset for every market condition the system sees.
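The per-bar tracking can be sketched as a small state machine for a long signal: activate when price trades through the entry zone, then update MAE/MFE on every bar. This is an illustrative sketch, assuming a long position and using the example levels from the AI narrative above; the real tracker also resolves the 8-class outcome.

```python
from dataclasses import dataclass


@dataclass
class SignalTracker:
    """Sketch of lifecycle tracking for a long signal:
    activation at the entry zone, then MAE/MFE updated per bar."""
    entry: float
    stop: float
    target: float
    mae: float = 0.0       # max adverse excursion (points below entry)
    mfe: float = 0.0       # max favourable excursion (points above entry)
    activated: bool = False

    def on_bar(self, low: float, high: float) -> None:
        # Activate once price trades through the entry level.
        if not self.activated and low <= self.entry <= high:
            self.activated = True
        if self.activated:
            self.mae = max(self.mae, self.entry - low)
            self.mfe = max(self.mfe, high - self.entry)


# Levels from the example narrative: entry 5236, stop 5229, target 5258.
t = SignalTracker(entry=5236, stop=5229, target=5258)
t.on_bar(5234, 5240)
```

Because suppressed shadow signals run through the same tracker, the labeled dataset covers suppressed conditions too.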
Regime-Aware Intelligence
HMM · GARCH · Kalman
I4 classifies the current market regime using GARCH volatility modelling, a Kalman trend filter, HMM hidden state detection, and BOCPD changepoint detection. I7 setup plugins declare a regime_type (trend / mean-reversion / any) and are gated by the slow-clock regime of the next higher timeframe. Suppressed signals become regime_suppressed shadow signals, not dropped data.
Self-Improving Feedback Loop
Outcome → Weights → CIS
Every signal carries the CIS weight version that produced it. Every outcome is written back to the feature store alongside the full I1–I8 signal vector that triggered it. When 30+ samples accumulate per setup type, setup_performance rolls up win rate, avg pnl_r, and Sharpe, feeding back into CIS perf_multiplier weights without code changes.
AI Narrative Synthesis
3-tier LLM chain
I8 runs a 3-tier LLM chain: ZAI GLM-5 (primary, <70ms), OpenRouter with 100+ model fallback, and Ollama local for offline operation. Every signal with confidence > 0.70 gets a per-signal narrative. Every 60s, a 6-group cross-asset synthesis narrative is generated from the full active signal set. All LLM calls, including failures, are logged with outcome back-fill.
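The fallback behaviour reduces to trying providers in priority order and logging every attempt, including failures. A sketch, with hypothetical provider callables standing in for the real ZAI / OpenRouter / Ollama clients:

```python
def narrate(signal: dict, providers: list) -> tuple[str, list]:
    """Sketch of the tiered fallback: try each (name, callable) in
    priority order; every attempt, success or failure, is logged.
    Only signals above the confidence threshold get a narrative."""
    if signal.get("confidence", 0.0) <= 0.70:
        return "", [("skipped", "confidence <= 0.70")]
    log = []
    for name, call in providers:
        try:
            text = call(signal)
            log.append((name, "ok"))
            return text, log
        except Exception as exc:
            log.append((name, f"failed: {exc}"))
    return "", log  # all tiers down: logged, nothing published


# Hypothetical providers: the primary times out, the fallback answers.
def primary(signal):
    raise TimeoutError("timeout")

def fallback(signal):
    return "Bullish 5m ES setup: trend + SMC confluence."

text, log = narrate({"confidence": 0.8}, [("glm", primary), ("openrouter", fallback)])
```

The attempt log is what enables outcome back-fill later: every call, failed or not, leaves a record to join against.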
Feature Store as Training Set
TimescaleDB · Forever
The intelligence_features hypertable is the ground truth training dataset. Every bar across every instrument and timeframe writes the full I1–I8 feature vector including JSONB tiers. Signal outcomes JOIN back via (symbol, feature_ts, feature_tf). Nothing is discarded. Storage is cheap. Every output is a labeled training sample.
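The JOIN on `(symbol, feature_ts, feature_tf)` is plain SQL. A sketch using `sqlite3` in place of TimescaleDB (no hypertables, and a TEXT column standing in for JSONB); table and column names follow the text but are simplified:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE intelligence_features (
  symbol TEXT, feature_ts INTEGER, feature_tf TEXT, features TEXT);
CREATE TABLE signal_outcomes (
  symbol TEXT, feature_ts INTEGER, feature_tf TEXT, outcome TEXT, pnl_r REAL);
""")

# One bar's feature vector and the outcome of the signal it triggered.
con.execute("INSERT INTO intelligence_features VALUES (?,?,?,?)",
            ("ES", 1000, "5m", '{"rsi": 67.4}'))
con.execute("INSERT INTO signal_outcomes VALUES (?,?,?,?,?)",
            ("ES", 1000, "5m", "t1", 1.2))

# Outcomes JOIN back to the exact feature vector that produced them,
# yielding labeled training rows directly from the store.
rows = con.execute("""
  SELECT f.features, o.outcome, o.pnl_r
  FROM signal_outcomes o
  JOIN intelligence_features f USING (symbol, feature_ts, feature_tf)
""").fetchall()
```

Each returned row is a (features, label) pair — the "every output is a labeled training sample" property made concrete.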
Architecture
DAG · Topological sort · No polling
Every plugin declares its inputs. The engine runs Kahn's topological sort at startup. Execution order is a mathematical property of the dependency graph, not a convention anyone maintains. Cycles hard-crash at startup. Silent corruption is impossible. Adding a plugin means declaring its dependencies; ordering is inferred.
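Kahn's algorithm is what makes "ordering is inferred" literal: compute in-degrees from declared inputs, repeatedly emit zero-in-degree plugins, and if anything remains, there is a cycle and startup refuses to proceed. A sketch, assuming every declared input is itself a registered plugin:

```python
from collections import deque


def plugin_order(deps: dict[str, list[str]]) -> list[str]:
    """Kahn's topological sort over declared plugin inputs.
    A cycle raises at startup instead of producing a silently
    wrong execution order."""
    indegree = {p: 0 for p in deps}
    dependents = {p: [] for p in deps}
    for plugin, inputs in deps.items():
        for dep in inputs:                 # every input must be registered
            indegree[plugin] += 1
            dependents[dep].append(plugin)

    ready = deque(sorted(p for p, d in indegree.items() if d == 0))
    order = []
    while ready:
        p = ready.popleft()
        order.append(p)
        for q in dependents[p]:
            indegree[q] -= 1
            if indegree[q] == 0:
                ready.append(q)

    if len(order) != len(deps):
        raise RuntimeError("dependency cycle detected; refusing to start")
    return order
```

Adding a plugin is just a new key with its input list; its position in the run order falls out of the graph.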
CIS · 6-bucket convergence gate · Evidence required
When 5–8 setup plugins fire on the same bar, CIS decides what gets published and whether anything does. Six buckets (Trend, Momentum, Structure, Pattern, Institutional, Regime) must converge: the aggregate score must exceed 0.35 in magnitude (|score| > 0.35) AND at least 3 of the 6 buckets must agree on direction. One dominant bucket cannot override the rest. Discipline enforced by the architecture, not by policy.
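The convergence gate is two conditions over six bucket scores. A sketch, using a simple mean as the aggregate (the real aggregation is presumably weighted):

```python
def cis_gate(bucket_scores: dict[str, float],
             threshold: float = 0.35, min_agree: int = 3):
    """Sketch of the 6-bucket gate: aggregate score must clear the
    threshold in magnitude AND at least min_agree buckets must agree
    on direction. The mean aggregation here is illustrative."""
    score = sum(bucket_scores.values()) / len(bucket_scores)
    direction = 1 if score > 0 else -1
    agreeing = sum(1 for v in bucket_scores.values() if v * direction > 0)
    publish = abs(score) > threshold and agreeing >= min_agree
    return publish, score


# Broad agreement: four buckets lean the same way -> publishable.
converged, _ = cis_gate({"trend": 0.8, "momentum": 0.6, "structure": 0.5,
                         "pattern": 0.0, "institutional": -0.1, "regime": 0.6})

# One dominant bucket pushes the score past the threshold alone,
# but only 1 of 6 agrees on direction -> suppressed.
dominated, _ = cis_gate({"trend": 3.0, "momentum": -0.1, "structure": -0.1,
                         "pattern": -0.1, "institutional": 0.0, "regime": 0.0})
```

The second case is the point of the design: a single loud bucket cannot out-shout the other five.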
Feature store · Outcome tracing · Self-improving
Every signal is tagged with the weight version that produced it. Every outcome (stop hit, target reached, TTL expired) is written back to the feature store with the full signal vector. Nothing is dropped. When outcome data is sufficient, weights update from evidence. The pipeline captures what it produces and learns from what it captured.
IntelligenceEvent schema · Stream-native · API-first
Every output at every tier is encoded into a versioned IntelligenceEvent schema and published to the stream bus. Producers publish. Consumers subscribe. No service calls another directly. A new consumer (alert engine, execution system, ML scorer) subscribes to the existing stream without changing the producers. The bus is the API. Extension is additive.
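A versioned event envelope can be sketched as a frozen dataclass serialized to the wire. Field names beyond the ones the text names (schema versioning, tier, payload) are assumptions:

```python
from dataclasses import dataclass, asdict
import json


@dataclass(frozen=True)
class IntelligenceEvent:
    """Sketch of a versioned event envelope for the stream bus."""
    schema_version: str
    tier: str        # e.g. "I1".."I8"
    symbol: str
    ts: float
    payload: dict    # tier-specific body; evolves additively


def encode(event: IntelligenceEvent) -> str:
    # Consumers check schema_version before decoding, so producers can
    # add payload fields without breaking existing subscribers.
    return json.dumps(asdict(event))


evt = IntelligenceEvent("1.0", "I7", "ES", 1700000000.0, {"cis": 0.71})
wire = encode(evt)
```

Extension stays additive: a new consumer only needs the envelope contract, never a producer's internals.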
Real-time signals across futures, forex, and crypto markets