- Base Model Interface — Defines how your Coordinator Node interacts with Cruncher submissions
- Scoring Function — Provides a scoring function that participants can use for local testing
- Quickstarters — Notebooks that help participants join your challenge quickly
- Data — A dataset participants can use locally to test their models (both running and scoring)
- Implement the official interface
- Access helper utilities (e.g., data access)
- Run scoring locally
You will also need this PyPI package to allow Crunchers to import the Base Model Interface inside
the quickstarters.
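As a setup sketch, a quickstarter might install the package and verify the interface imports. The package name `condorgame` and the import path below are assumptions based on the repository name; check the challenge repository for the actual ones.

```shell
# Hypothetical package name -- confirm against the challenge's published package
pip install condorgame

# Verify the Base Model Interface is importable (import path assumed)
python -c "import condorgame"
```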
Example: Condor Game
We’ll learn how to set up a public repository by examining the Condor Game implementation.
Condor Game Public Repository
Public entry point for the Condor Game challenge
Model Interface
Participants build a Tracker that implements the TrackerBase interface. Its predictions must follow the density_pdf specification, ensuring a strict and
standardized interface.
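To make this concrete, here is a minimal sketch of what such a tracker could look like. The real TrackerBase ships with the challenge package; the stand-in base class and its method names below are assumptions for illustration only.

```python
from abc import ABC, abstractmethod
import math


class TrackerBase(ABC):
    """Illustrative stand-in for the official interface (method names assumed)."""

    @abstractmethod
    def update(self, price: float) -> None:
        """Ingest one new observed price."""

    @abstractmethod
    def density_pdf(self, x: float) -> float:
        """Return the predicted probability density at price x."""


class GaussianTracker(TrackerBase):
    """Toy tracker: running mean/variance with a Gaussian predictive density."""

    def __init__(self) -> None:
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, price: float) -> None:
        # Welford's online update for mean and sum of squared deviations
        self.n += 1
        delta = price - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (price - self.mean)

    def density_pdf(self, x: float) -> float:
        var = self.m2 / (self.n - 1) if self.n > 1 else 1.0
        var = max(var, 1e-12)  # avoid a degenerate zero-variance density
        return math.exp(-((x - self.mean) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)
```

The key point is the contract: whatever model you use internally, `density_pdf` must return a valid density so the scoring function can evaluate it.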
Data
To simplify local development, the Condor Game ecosystem provides two helper utilities. These functions fetch historical prices from a remote HTTP service at 1-minute resolution. Participants can load a time range and train or validate locally without needing to build their own data pipeline first.

Scoring
The repository includes a local evaluation workflow through TrackerEvaluator, which tracks:
- Overall likelihood score
- Recent likelihood score
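The two scores above can be sketched as follows. The real TrackerEvaluator lives in the repository; this minimal stand-in only illustrates the distinction between an overall average log-likelihood and a recent one computed over a rolling window (the class name, window size, and method names are assumptions).

```python
import math
from collections import deque


class LikelihoodEvaluator:
    """Illustrative stand-in: tracks overall and recent average log-likelihood."""

    def __init__(self, recent_window: int = 100) -> None:
        self.log_likelihoods: list[float] = []
        self.recent: deque[float] = deque(maxlen=recent_window)

    def record(self, pdf_value: float) -> None:
        # Log-likelihood of the realized price under the tracker's density;
        # clamp to avoid log(0) when the density underflows.
        ll = math.log(max(pdf_value, 1e-300))
        self.log_likelihoods.append(ll)
        self.recent.append(ll)

    @property
    def overall_score(self) -> float:
        return sum(self.log_likelihoods) / len(self.log_likelihoods)

    @property
    def recent_score(self) -> float:
        return sum(self.recent) / len(self.recent)
```

Tracking both quantities lets participants see whether a model that scored well historically is degrading on recent data.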
Quickstarters and Examples
In the condorgame/examples directory you can find:
- Quickstarter notebooks — Get started quickly with a working baseline
- Self-contained examples — Ready to copy and adapt for your own implementation