
Janus Trading System

A pocket hedge fund

August 2025 to present. In active development.

Janus is a modular trading system that combines news analysis, machine learning predictions, and price action based trading strategies into a unified signal pipeline. The central idea is to test whether combining event driven information with quantitative market structure can produce stronger trading decisions than relying on either one alone.

System Architecture

Interactive architecture map

This embedded view shows the current Janus system flow, including data ingestion, feature enrichment, strategy logic, execution, and evaluation.

Project Overview

Janus is designed as a modular algorithmic trading research system. At its core, it acts as an orchestrated pipeline intended to evaluate how different types of market information interact. The system runs price ingestion, news scoring, signal generation, and execution logic through a structured framework built for repeatable testing.

Rather than relying solely on pure mathematical models or isolated sentiment feeds, the pipeline is built to combine event driven news signals with quantitative trading strategies and machine learning layers. This structure enables a research workflow where independent signal layers can be evaluated on their own and then tested together in composite strategies.

Problem and Motivation

A persistent question in the trading algorithm space is whether markets are better modeled through pure quantitative rules, through event driven interpretation of news, or through a hybrid approach. Many existing systems tend to optimize heavily for just one paradigm.

Janus is built around the premise that both math and narrative drive price action, and that an informed system should structurally incorporate both. The engineering motivation is to build an environment that can cleanly capture disparate inputs such as time series features and news sentiment, fuse them probabilistically, and route them through logic gates to output controlled execution commands.

System Architecture

The architecture follows a deterministic data flow pipeline to isolate concerns between ingestion, analysis, and execution.

01
Data Ingestion & Storage

Market and news data are captured and stored locally using DuckDB and Parquet for rapid sequential reading.

02
Feature Enrichment

Raw time series and text data are processed into distinct numerical features representing market structure and news dimensions.
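A minimal illustration of this step, using generic market-structure features (log returns, rolling volatility, momentum) rather than the project's actual feature set:

```python
import numpy as np

def enrich(closes, window=5):
    """Turn a raw close-price series into simple numerical features.
    Illustrative features only -- the actual Janus feature set is not public."""
    closes = np.asarray(closes, dtype=float)
    log_ret = np.diff(np.log(closes))                       # 1-bar log returns
    vol = np.array([log_ret[max(0, i - window + 1):i + 1].std()
                    for i in range(len(log_ret))])          # rolling volatility
    momentum = closes[window:] / closes[:-window] - 1.0     # window-bar momentum
    return {"log_ret": log_ret, "vol": vol, "momentum": momentum}

feats = enrich([100, 101, 100.5, 102, 101.8, 103, 102.5])
```

Each feature is aligned to the bar it describes, so downstream layers can consume them as columns without further bookkeeping.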

03
Strategy & Signal Orchestration

Independent predictive layers generate signals which are then combined in a confluence strategy to filter and weight potential trades.

04
Execution

Signals are passed to an execution routine responsible for risk sizing, trade logging, and portfolio tracking before connecting to external APIs.

05
Backtesting

A custom backtesting pipeline mirrors the live execution path, enabling strategy validation over historical data slices.
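A toy long/flat backtest loop illustrates the shape of such a pipeline; the accounting here is deliberately simplified and is not the project's implementation:

```python
def backtest(closes, signals, capital=10_000.0):
    """Minimal long/flat backtest: fully invested when the prior bar's
    signal is 1, in cash when it is 0. Returns the equity curve."""
    equity, curve = capital, [capital]
    for i in range(1, len(closes)):
        if signals[i - 1] == 1:                   # position set on the prior bar
            equity *= closes[i] / closes[i - 1]   # apply this bar's return
        curve.append(equity)
    return curve

curve = backtest([100.0, 102.0, 101.0, 103.0], [1, 1, 0, 0])
```

Using the prior bar's signal to decide the current bar's position avoids lookahead bias, which is the main correctness property a backtest must share with the live path.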

Key Technical Components

News Scoring Layer

A processing layer that blends sentiment, relevance, novelty, velocity, surprise, and scale into probabilistic signals to quantify event driven impacts.
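One plausible way to fuse such dimensions is a weighted sum passed through a logistic squash; the weights below are assumptions for illustration, not the production formula:

```python
import math

# Illustrative weights for each news dimension; the real blend is not public.
WEIGHTS = {"sentiment": 1.2, "relevance": 0.8, "novelty": 0.6,
           "velocity": 0.5, "surprise": 0.9, "scale": 0.4}

def news_score(dims):
    """Blend per-dimension scores in [-1, 1] into a probability-like
    value in (0, 1) via a weighted sum and logistic squash."""
    z = sum(WEIGHTS[k] * dims.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

p = news_score({"sentiment": 0.7, "relevance": 0.9, "novelty": 0.5,
                "velocity": 0.1, "surprise": 0.2, "scale": 0.3})
```

The logistic output keeps the score interpretable as a probability-like signal, which makes it directly composable with the ML layers downstream.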

Pattern Forecasting Layer

An initial predictive module that estimates short horizon price direction based on engineered time series features and classical market structure.

Confluence Strategy Layer

The core logic center that combines the market structure signals with machine learning outputs. Multiple strategy variants currently exist to test different combination weights.
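A sketch of what a confluence gate could look like; the weights and threshold are exactly the kind of placeholder knobs the strategy variants would sweep, not the actual values:

```python
def confluence(structure_sig, ml_prob, w_structure=0.5, w_ml=0.5,
               threshold=0.6):
    """Blend a market-structure signal in [-1, 1] with an ML up-move
    probability in [0, 1]; emit a trade only when the weighted blend
    clears a confidence threshold in either direction."""
    blended = w_structure * (structure_sig + 1.0) / 2.0 + w_ml * ml_prob
    if blended >= threshold:
        return "long", blended
    if blended <= 1.0 - threshold:
        return "short", blended
    return "flat", blended
```

Requiring both layers to agree before trading is the filtering behavior described above: a strong signal from one layer alone is dampened unless the other layer confirms it.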

Temporal Convolutional Network

Integration in progress. An advanced modeling component being developed to capture longer term pattern dependencies within the time series data.
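The building block of a TCN is a dilated causal convolution, in which the output at time t depends only on current and past inputs, with dilation letting stacked layers see exponentially longer histories. A minimal NumPy illustration of that one building block (not the project's implementation):

```python
import numpy as np

def causal_conv(x, kernel, dilation):
    """Dilated causal 1-D convolution: y[t] = sum_j kernel[j] * x[t - j*dilation],
    with zero left-padding so no future value leaks into the output."""
    k = len(kernel)
    pad = dilation * (k - 1)
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    return np.array([sum(kernel[j] * xp[pad + t - j * dilation]
                         for j in range(k))
                     for t in range(len(x))])

x = np.arange(8, dtype=float)
y = causal_conv(x, kernel=[0.5, 0.5], dilation=2)  # y[t] = (x[t] + x[t-2]) / 2
```

A full TCN stacks such layers with dilations 1, 2, 4, ... plus residual connections, which is what captures the longer-term dependencies mentioned above.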

Risk Map

Development stage. A system intended to dynamically overlay overall market volatility and regime shifts onto individual trade sizing decisions.
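One common form such an overlay takes is volatility-targeted sizing; this sketch is a guess at the general idea, not the actual design:

```python
def regime_scaled_size(base_size, realized_vol, target_vol=0.01,
                       max_scale=2.0):
    """Volatility-targeted sizing: scale positions down when realized
    volatility exceeds the target, and cap the boost in calm regimes.
    All parameter values here are illustrative."""
    scale = min(target_vol / max(realized_vol, 1e-9), max_scale)
    return base_size * scale
```

The cap matters: without it, a quiet regime would inflate positions without bound just before volatility mean-reverts.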

Bayesian Optimization

Development stage. A framework to systematically tune the parameters of the strategy and forecasting layers without resorting to brute-force search over historical parameter ranges.
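A toy one-dimensional sketch of the technique: fit a Gaussian-process surrogate to past evaluations, then evaluate next where expected improvement is highest. The objective below is a stand-in for a backtest metric, and nothing here reflects the project's actual framework:

```python
from math import erf, pi, sqrt

import numpy as np

def rbf(a, b, ls=0.2):
    """Squared-exponential kernel with unit variance."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    """GP posterior mean/variance at test points Xs given data (X, y)."""
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)
    mu = Ks.T @ Kinv @ np.asarray(y)
    var = 1.0 - np.sum(Ks * (Kinv @ Ks), axis=0)
    return mu, np.maximum(var, 1e-12)

def expected_improvement(mu, var, best):
    """EI acquisition for maximization."""
    sd = np.sqrt(var)
    z = (mu - best) / sd
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))
    pdf = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)
    return (mu - best) * cdf + sd * pdf

def objective(thresh):
    # Stand-in for "backtest score as a function of one strategy
    # threshold"; peaks at thresh = 0.6.
    return -(thresh - 0.6) ** 2

grid = np.linspace(0.0, 1.0, 101)
X, y = [0.1, 0.9], [objective(0.1), objective(0.9)]    # two seed evaluations
for _ in range(6):                                     # BO loop
    mu, var = gp_posterior(X, y, grid)
    nxt = float(grid[int(np.argmax(expected_improvement(mu, var, max(y))))])
    X.append(nxt)
    y.append(objective(nxt))
best = X[int(np.argmax(y))]
```

Because each backtest evaluation is expensive, spending compute on the surrogate to pick the next parameter point is the trade that makes this cheaper than brute-force sweeps.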

Backtesting Pipeline

A custom built evaluation tool tied directly to the broader system design, enabling repeatable performance testing across different strategy implementations.

Execution Engine

An operational layer handling trade logging, position checking, and profit tracking, currently configured for structured paper execution.
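A minimal paper-execution ledger shows the general shape of such a layer; all class and field names are illustrative, not the project's:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class PaperBook:
    """Minimal paper-execution ledger: records fills, tracks net
    positions and cash. Names here are illustrative only."""
    trades: list = field(default_factory=list)
    positions: dict = field(default_factory=dict)
    cash: float = 100_000.0

    def fill(self, symbol, qty, price):
        """Log a fill and update position and cash (negative qty = sell)."""
        self.trades.append({"ts": datetime.now(timezone.utc).isoformat(),
                            "symbol": symbol, "qty": qty, "price": price})
        self.positions[symbol] = self.positions.get(symbol, 0) + qty
        self.cash -= qty * price

book = PaperBook()
book.fill("SPY", 10, 550.0)    # buy 10 shares
book.fill("SPY", -10, 552.0)   # sell them back
```

Keeping the ledger behind a small interface like this is what lets the same strategy code later point at a live broker API instead of the paper book.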

Engineering Challenges

The primary difficulty in this project is not solely writing a predictive model, but managing the complexity of joining event driven news signals with market structure. Building a coherent system requires strict orchestration and modularity so that data flows deterministically from ingestion to decision.

Ensuring that the pipeline is robust enough for repeatable backtesting meant the logic routing had to be completely decoupled from the execution handler. This isolation allows individual components to evolve over time without breaking the core system architecture, maintaining a controlled environment for signal fusion research.

Current Status

  • System successfully connected to the Alpaca API
  • Core pipeline assembled and data flow validated
  • System is ready to begin paper trading in its current state
  • Signal blending and strategy logic remain under active validation and refinement

Next Steps

  • Validate and improve the probability outputs of the news scoring layer
  • Update and expand the range of event driven news sources
  • Run Janus for one full week on a paper trading account to verify execution logging
  • Continue iterative improvements on the Temporal Convolutional Network, risk map, and optimization framework

Private repo, details available on request