BankNifty Strategy Engine – README
A resumable, LLM-driven intraday engine that digests sentiment, expert transcripts, technicals (RSI/MACD), and news to produce trade plans for BankNifty (and Nifty for context). The engine simulates executions on 1-minute data, evaluates P&L, and keeps airtight checkpoints so you can resume exactly where you left off after interruptions.
Features
- LLM-assisted trade plans using structured JSON outputs (strict schema).
- News-aware decisions (hourly, plus the last 15 minutes at close).
- Technicals: RSI + MACD on hourly/daily series.
- 1-minute backtest execution with deterministic tiebreak rules.
- Resumable runs via checkpoint (safe to kill & rerun).
- Flip/No-trade exit enforcement: if the plan flips side or says "No trade" while holding, the engine exits at market price.
- A memory string summarizing the last completed trade is fed back into prompts.
- Excel & Parquet logs for analysis.
Project structure
```
banknifty_strategy/
├── app/
│   ├── __init__.py
│   ├── engine.py          # Core loop (09:15 → … → 15:30)
│   ├── models.py          # Pydantic v2 models
│   ├── prompts.py         # Prompt templates: morning / intrahour / closing
│   ├── simulator.py       # simulate_trade_from_signal, slice_intraday
│   ├── dataio.py          # load_data() → reads/normalizes data frames
│   ├── checkpoint.py      # CheckpointManager → resume & append parquet logs
│   ├── logging_setup.py   # Rotating file logger
│   ├── config.py          # AppConfig, Paths, LLMConfig
│   ├── llm.py             # OpenAI client wrapper; strict JSON schema handling
│   ├── news.py            # summaries_between() helpers
│   ├── utils.py           # hour_passed(), hourly_ohlc_dict(), helpers
│   └── writer.py          # to_excel_safely()
├── scripts/
│   └── run_backtest.py    # CLI entrypoint
├── requirements.txt
└── README.md
```
Note: keep `app/` a proper package (it must include `__init__.py`). Always run from the project root so imports like `from app.engine import Engine` work.
Data inputs & expected columns
Your loader (app/dataio.py) should read and normalize sources. The engine expects these canonical frames and columns:
1. BankNifty hourly (`df_bn_hourly`)
   - Columns: `datetime`, `open`, `high`, `low`, `close`, `RSI`, `MACD_Line`, `Signal_Line`
   - Granularity: hourly (09:15, 10:15, …, 15:15, 15:30)
   - Used for: 09:15 previous indicators, hourly OHLC dicts, close price lookup.
2. BankNifty 1-minute (`df_bn_1m`)
   - Columns: `datetime`, `open`, `high`, `low`, `close`
   - Granularity: 1 min
   - Used by: `simulate_trade_from_signal` execution windows.
3. Nifty daily or daily-like context (`df_nifty_daily`)
   - Columns: `datetime`, `open`, `high`, `low`, `close`, `RSI`, `MACD_Line`, `Signal_Line`
   - Used for: contextual morning prompt.
4. Sentiment predictions (`df_sentiment`)
   - Columns: `predicted_for` (datetime), `proposed_sentiment`, `reasoning`
5. Expert transcript (`df_transcript`)
   - Columns: `prediction_for` (datetime), `Transcript`
   - (If your raw file has `Prediction_for_date`, normalize it to `prediction_for`.)
6. News with summaries (`df_news`)
   - Columns: `datetime_ist` (datetime), `Article_summary` (string)
Ensure all datetime columns are timezone-normalized (naive or same tz) and parsed.
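As one possible shape for that normalization step (a sketch only; the real `load_data()` in `app/dataio.py` may differ, and `normalize_datetimes` is an illustrative helper), parsing and stripping timezones before frames are compared could look like:

```python
import pandas as pd

def normalize_datetimes(df: pd.DataFrame, col: str = "datetime") -> pd.DataFrame:
    """Parse a datetime column; if tz-aware, convert to IST and drop the tz
    so every frame compares naively on the same wall-clock times."""
    out = df.copy()
    out[col] = pd.to_datetime(out[col])
    if out[col].dt.tz is not None:
        out[col] = out[col].dt.tz_convert("Asia/Kolkata").dt.tz_localize(None)
    return out.sort_values(col).reset_index(drop=True)
```

Applying this to every loaded frame keeps joins between hourly, 1-minute, sentiment, and news data consistent.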
Installation
Python 3.9+ recommended.
```shell
pip install -r requirements.txt
```
Configuration (.env + config classes)
Create a `.env` in the project root:

```
OPENAI_API_KEY=EMPTY
OPENAI_BASE_URL=http://localhost:8000/v1
OPENAI_MODEL=Qwen/Qwen3-4B
```
Edit `temperature` and `top_p` in `app/config.py`.
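A minimal sketch of how those three variables could be picked up (the actual `LLMConfig` in `app/config.py` may differ; the defaults here simply mirror the `.env` above, and loading the `.env` file itself, e.g. via python-dotenv, is omitted):

```python
import os
from dataclasses import dataclass

@dataclass
class LLMConfig:
    api_key: str
    base_url: str
    model: str
    temperature: float = 0.2  # illustrative default; tune in app/config.py
    top_p: float = 0.9        # illustrative default

def load_llm_config() -> LLMConfig:
    """Read the three environment variables documented above."""
    return LLMConfig(
        api_key=os.environ.get("OPENAI_API_KEY", "EMPTY"),
        base_url=os.environ.get("OPENAI_BASE_URL", "http://localhost:8000/v1"),
        model=os.environ.get("OPENAI_MODEL", "Qwen/Qwen3-4B"),
    )
```

Pointing `OPENAI_BASE_URL` at a local OpenAI-compatible server (as in the `.env` above) is what lets the engine run against a self-hosted Qwen model.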
Running
Use the provided script. It accepts `--ckpt-dir` (defaults to `<out-dir>/checkpoint`).
```shell
# Always run from project root so the "app" package is on sys.path
python -m scripts.run_backtest --data-dir ./data --out-dir ./result --start "2023-12-29 15:15" --end "2024-05-01 09:15"
```
Resumable checkpoints
The engine persists state and logs in Parquet inside `--ckpt-dir`:

```
<ckpt-dir>/
├── checkpoint.json      # last_timestamp_processed, state, plans, memory_str
├── trade_log.parquet
├── stats_log.parquet
├── expert_log.parquet
└── summary_log.parquet
```
You can kill the process and re-run with the same `--ckpt-dir`. The engine:
- Reads `checkpoint.json`
- Skips timestamps already processed
- Continues from the next tick

Excel mirrors (`*.xlsx`) are written to `--out-dir` for human inspection.
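The resume logic above can be sketched as follows (illustrative only; the real `CheckpointManager` may store and compare timestamps differently):

```python
import json
from pathlib import Path
from typing import List, Optional

def load_last_processed(ckpt_dir: Path) -> Optional[str]:
    """Return last_timestamp_processed from checkpoint.json, or None on a first run."""
    ckpt = ckpt_dir / "checkpoint.json"
    if not ckpt.exists():
        return None
    return json.loads(ckpt.read_text()).get("last_timestamp_processed")

def pending_timestamps(all_ts: List[str], ckpt_dir: Path) -> List[str]:
    """Keep only ticks after the last processed one.

    Assumes ISO-formatted timestamp strings, which sort lexicographically
    in chronological order."""
    last = load_last_processed(ckpt_dir)
    return all_ts if last is None else [ts for ts in all_ts if ts > last]
```

Because the checkpoint is saved after every timestamp, killing the process mid-day loses at most the tick currently in flight.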
How the engine works (timeline)
At each timestamp `ts` in your hourly series:

09:15 – Morning
- Gathers Nifty/BankNifty previous OHLC + indicators.
- Pulls sentiment + expert transcript.
- Calls LLM (schema `SummaryMorning`) → morning summary.
- Calls LLM (schema `TradePlan`) → first plan of the day (the actual open/close is derived dynamically from previous state; the first day has no memory).
10:15 – First intrahour
- Simulate the 1-minute window from the last slice start → 10:15 using the current plan.
- Log a state change only if it changed by value (not identity).
- If exited naturally (stop/target), update `memory_str`, reset state, move `last_slice_start`.
- Pull last-hour news, OHLC dict, current indicators.
- Call LLM (`DecisionOutput` → `{summary_banknifty, trade}`).
- Flip/No-trade exit enforcement: if holding and the LLM flips side or says "No trade" → force flatten at market price (hourly close).
- Update logs.
11:15 – 15:15 – Subsequent intrahours
- Same as 10:15 loop.
15:30 – Close
- Simulate the last 15 minutes (15:15 → 15:30) on 1-minute data.
- Log the state change once; if exited → update memory + reset.
- LLM close plan (schema `TradePlan`).
- If holding and the plan flips or says "No trade" → force flatten at close.
- Otherwise carry overnight (state remains open).
- Save checkpoint after each timestamp.
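The flip/No-trade flatten rule that recurs at every tick can be sketched as a small predicate (a sketch; `open_side` and the function name are illustrative, not the engine's actual variables):

```python
from typing import Optional

def should_force_flatten(open_side: Optional[str],
                         plan_status: str,
                         plan_type: str) -> bool:
    """True when an open position must be flattened at market price:
    the new plan either says "No trade" or no longer matches the held side."""
    if open_side is None:           # flat: nothing to flatten
        return False
    if plan_status == "No trade":   # plan abandons the position
        return True
    return plan_type != open_side   # plan flips long<->short (or says "none")
```

For example, holding a long while the new plan says short (or "No trade") forces an exit; a plan that keeps the same side leaves the position untouched.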
Trade simulation rules
`simulate_trade_from_signal(df, trade, dt_col, state, lookback_minutes)`:
- Trade schema (`TradePlan`):
  - `status`: `"Trade"` or `"No trade"`
  - `type`: `"long" | "short" | "none"`
  - `entry_at`, `target`, `stoploss`: numbers (positive; 0 if `No trade`)
- Entry is limit-style: if `entry_at` is within `[low, high]` of a 1-minute bar → the entry fills at `entry_at`.
- Exit resolution when both target and stoploss could be hit in the same bar: apply a tiebreaker (the engine uses "stoploss_first").
- P&L (`pnl_pct`): signed percentage vs entry (`long` positive if `exit > entry`; `short` inverted).
- Flip/No-trade handling in the engine: if a position is open and the LLM plan flips or says `No trade` at the tick → force flatten at the minute close (market price) and log memory.
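The bar-level rules above (limit entry, stoploss-first tiebreak, signed P&L) can be sketched roughly like this; `resolve_bar` and `pnl_pct` are illustrative helpers, not the actual `simulate_trade_from_signal` internals:

```python
from typing import List, Tuple

def resolve_bar(entry_at: float, target: float, stoploss: float,
                bar_low: float, bar_high: float,
                in_position: bool) -> Tuple[List[Tuple[str, float]], bool]:
    """Process one 1-minute bar: a limit-style entry fills if entry_at lies
    inside [low, high]; if both exit levels are touched in the same bar,
    the stoploss wins ('stoploss_first')."""
    events: List[Tuple[str, float]] = []
    if not in_position:
        if bar_low <= entry_at <= bar_high:
            events.append(("entry", entry_at))
            in_position = True
        else:
            return events, in_position      # no fill this bar
    if bar_low <= stoploss <= bar_high:      # stoploss checked first
        events.append(("exit", stoploss))
        in_position = False
    elif bar_low <= target <= bar_high:
        events.append(("exit", target))
        in_position = False
    return events, in_position

def pnl_pct(side: str, entry: float, exit_px: float) -> float:
    """Signed percentage P&L vs entry; short positions invert the sign."""
    raw = (exit_px - entry) / entry * 100.0
    return raw if side == "long" else -raw
```

If a single bar spans both levels (say, low 97 and high 106 with stoploss 98 and target 105), the stoploss-first tiebreak books the exit at 98, the conservative outcome.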
LLM JSON schemas
All Pydantic models enforce `extra="forbid"` so the model can't invent fields.
The client (`app/llm.py`) sanitizes the schema name and forces `additionalProperties: false` at the root and nested objects, satisfying strict servers.
SummaryMorning (example)
```python
class SummaryMorning(BaseModel):
    major_concern_nifty50: str
    trade_reasoning_nifty50: str
    trade_strategy_nifty50: str
    major_concern_banknifty: str
    trade_reasoning_banknifty: str
    trade_strategy_banknifty: str

    model_config = {"extra": "forbid"}
```
TradePlan
```python
class TradePlan(BaseModel):
    status: Literal["No trade", "Trade"]
    brief_reason: str
    type: Literal["long", "short", "none"]
    entry_at: float
    target: float
    stoploss: float

    model_config = {"extra": "forbid"}
```
DecisionOutput
```python
class SummaryBankNifty(BaseModel):
    major_concern: str
    sentiment: Literal["bullish", "bearish"]
    reasoning: str
    trade_strategy: str
    news_summary: str

    model_config = {"extra": "forbid"}

class DecisionOutput(BaseModel):
    summary_banknifty: SummaryBankNifty
    trade: TradePlan

    model_config = {"extra": "forbid"}
```
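A quick, self-contained sketch of how these models gate LLM output (the actual `app/llm.py` wrapper is not shown here): `model_validate_json` rejects malformed payloads and unknown fields, and `extra="forbid"` makes Pydantic v2 emit `additionalProperties: false` in the generated JSON schema.

```python
from typing import Literal
from pydantic import BaseModel, ValidationError

class TradePlan(BaseModel):
    status: Literal["No trade", "Trade"]
    brief_reason: str
    type: Literal["long", "short", "none"]
    entry_at: float
    target: float
    stoploss: float
    model_config = {"extra": "forbid"}

raw = ('{"status": "No trade", "brief_reason": "choppy range", '
       '"type": "none", "entry_at": 0, "target": 0, "stoploss": 0}')
plan = TradePlan.model_validate_json(raw)  # strict, typed parse

# extra="forbid" surfaces in the schema sent to strict servers
assert TradePlan.model_json_schema()["additionalProperties"] is False

try:
    TradePlan.model_validate_json(raw[:-1] + ', "surprise": 1}')
except ValidationError:
    pass  # extra field rejected because extra="forbid"
```

Passing the generated schema to the server-side structured-output mode is what keeps the engine's prompts and parsing in lockstep.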
Outputs
Out dir (`--out-dir`):
- `stats_log.xlsx` – time series of state snapshots/closing stats (one row when state changes; final close row).
- `trade_log.xlsx` – model trade plans over time.
- `expert_log.xlsx` – morning summaries (one per day).
- `summary_log.xlsx` – per-hour summaries.
Checkpoint dir (`--ckpt-dir` or `<out-dir>/checkpoint`):
- Parquet files for each log, plus `checkpoint.json` (state & last timestamp).