
BankNifty Strategy Engine — README

A resumable, LLM-driven intraday engine that digests sentiment, expert transcripts, technicals (RSI/MACD), and news to produce trade plans for BankNifty (and Nifty for context). The engine simulates executions on 1-minute data, evaluates P&L, and keeps airtight checkpoints so you can resume exactly where you left off after interruptions.



Features

  • 🧠 LLM-assisted trade plans using structured JSON outputs (strict schema).
  • 📰 News-aware decisions (hourly windows, plus the last 15 minutes at close).
  • 📈 Technicals: RSI + MACD on hourly/daily series.
  • 🧪 1-minute backtest execution with deterministic tiebreak rules.
  • 🔁 Resumable runs via checkpoint (safe to kill & rerun).
  • ✅ Flip/No-trade exit enforcement: if the plan flips side or says "No trade" while holding, the engine exits at market price.
  • 🧠 Memory string summarizing the last completed trade, fed back into subsequent prompts.
  • 📦 Excel & Parquet logs for analysis.

Project structure

banknifty_strategy/
├─ app/
│  ├─ __init__.py
│  ├─ engine.py              # Core loop (09:15 → … → 15:30)
│  ├─ models.py              # Pydantic v2 models
│  ├─ prompts.py             # Prompt templates: morning / intrahour / closing
│  ├─ simulator.py           # simulate_trade_from_signal, slice_intraday
│  ├─ dataio.py              # load_data() – reads/normalizes data frames
│  ├─ checkpoint.py          # CheckpointManager – resume & append parquet logs
│  ├─ logging_setup.py       # Rotating file logger
│  ├─ config.py              # AppConfig, Paths, LLMConfig
│  ├─ llm.py                 # OpenAI client wrapper; strict JSON schema handling
│  ├─ news.py                # summaries_between() helpers
│  ├─ utils.py               # hour_passed(), hourly_ohlc_dict(), helpers
│  └─ writer.py              # to_excel_safely()
├─ scripts/
│  └─ run_backtest.py        # CLI entrypoint
├─ requirements.txt
└─ README.md

Note: Keep app/ a proper package (it must include __init__.py). Always run from the project root so imports like from app.engine import Engine work.


Data inputs & expected columns

Your loader (app/dataio.py) should read and normalize sources. The engine expects these canonical frames and columns:

1. BankNifty hourly (df_bn_hourly)

  • Columns: datetime, open, high, low, close, RSI, MACD_Line, Signal_Line
  • Granularity: hourly (09:15, 10:15, …, 15:15, 15:30)
  • Used for: 09:15 previous indicators, hourly OHLC dicts, close price lookup.

2. BankNifty 1-minute (df_bn_1m)

  • Columns: datetime, open, high, low, close
  • Granularity: 1 min
  • Used by: simulate_trade_from_signal execution windows.

3. Nifty daily or daily-like context (df_nifty_daily)

  • Columns: datetime, open, high, low, close, RSI, MACD_Line, Signal_Line
  • Used for: contextual morning prompt.

4. Sentiment predictions (df_sentiment)

  • Columns: predicted_for (datetime), proposed_sentiment, reasoning

5. Expert transcript (df_transcript)

  • Columns: prediction_for (datetime), Transcript
    • (If your raw file has Prediction_for_date, normalize to prediction_for.)

6. News with summaries (df_news)

  • Columns: datetime_ist (datetime), Article_summary (string)

Ensure all datetime columns are parsed as datetimes and timezone-normalized (either all naive or all in the same timezone) before the engine touches them.
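A small normalization helper for the loader might look like this (a sketch, assuming pandas and naive wall-clock IST as the common axis; the real load_data() may differ):

```python
import pandas as pd

def normalize_datetimes(df: pd.DataFrame, cols) -> pd.DataFrame:
    """Parse the given columns as datetimes and strip any timezone info,
    so every frame joins on the same naive wall-clock axis."""
    out = df.copy()
    for col in cols:
        parsed = pd.to_datetime(out[col], errors="raise")
        if parsed.dt.tz is not None:        # tz-aware -> drop tz, keep wall time
            parsed = parsed.dt.tz_localize(None)
        out[col] = parsed
    return out

# example: the news frame arrives with explicit +05:30 offsets
df_news = pd.DataFrame(
    {"datetime_ist": ["2024-01-02 09:15:00+05:30", "2024-01-02 10:15:00+05:30"]})
df_news = normalize_datetimes(df_news, ["datetime_ist"])
```

Note that `tz_localize(None)` keeps the local wall-clock time (09:15 stays 09:15), which is what you want when all sources are already in IST.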


Installation

Python 3.9+ recommended.

pip install -r requirements.txt

Configuration (.env + config classes)

Create a .env in project root:

OPENAI_API_KEY=EMPTY
OPENAI_BASE_URL=http://localhost:8000/v1
OPENAI_MODEL=Qwen/Qwen3-4B

Adjust temperature and top_p in:

  • app/config.py
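A minimal LLMConfig reading those environment variables could look like the following (field names and defaults are illustrative; the real app/config.py may differ):

```python
import os
from dataclasses import dataclass, field

@dataclass(frozen=True)
class LLMConfig:
    # Defaults mirror the .env shown above; field names are illustrative.
    api_key: str = field(default_factory=lambda: os.getenv("OPENAI_API_KEY", "EMPTY"))
    base_url: str = field(default_factory=lambda: os.getenv("OPENAI_BASE_URL", "http://localhost:8000/v1"))
    model: str = field(default_factory=lambda: os.getenv("OPENAI_MODEL", "Qwen/Qwen3-4B"))
    temperature: float = 0.2   # sampling knobs live here, not in the prompts
    top_p: float = 0.9

cfg = LLMConfig()
```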

Running

Use the provided script. It accepts an optional --ckpt-dir (defaults to <out-dir>/checkpoint).

# Always run from project root so "app" package is on sys.path
python -m scripts.run_backtest \
  --data-dir ./data \
  --out-dir ./result \
  --start "2023-12-29 15:15" \
  --end "2024-05-01 09:15"

Resumable checkpoints

The engine persists state and logs in Parquet inside --ckpt-dir:

<ckpt-dir>/
├─ checkpoint.json           # last_timestamp_processed, state, plans, memory_str
├─ trade_log.parquet
├─ stats_log.parquet
├─ expert_log.parquet
└─ summary_log.parquet

You can kill the process and re-run with the same --ckpt-dir. The engine:

  • Reads checkpoint.json
  • Skips timestamps already processed
  • Continues from the next tick

Excel mirrors (*.xlsx) are written to --out-dir for human inspection.
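The skip-ahead logic can be sketched like this (key names mirror the README; the real CheckpointManager may differ):

```python
import json
import tempfile
from pathlib import Path

def load_last_processed(ckpt_dir):
    """Return last_timestamp_processed from checkpoint.json, or None on a fresh run."""
    path = Path(ckpt_dir) / "checkpoint.json"
    if not path.exists():
        return None
    return json.loads(path.read_text()).get("last_timestamp_processed")

def pending_timestamps(all_ts, last_done):
    """Skip everything up to and including the checkpointed timestamp."""
    if last_done is None:
        return list(all_ts)
    return [ts for ts in all_ts if ts > last_done]

# demo against a throwaway checkpoint directory
ckpt_dir = tempfile.mkdtemp()
Path(ckpt_dir, "checkpoint.json").write_text(
    json.dumps({"last_timestamp_processed": "2024-01-02 10:15"}))
todo = pending_timestamps(
    ["2024-01-02 09:15", "2024-01-02 10:15", "2024-01-02 11:15"],
    load_last_processed(ckpt_dir))
```

Because the timestamps are ISO-ordered strings, a plain `>` comparison is enough to resume from the next tick.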


How the engine works (timeline)

At each timestamp ts in your hourly series:

  1. 09:15 — Morning

    • Gathers Nifty/BankNifty previous OHLC + indicators.
    • Pulls sentiment + expert transcript.
    • Calls LLM (schema SummaryMorning) → morning summary.
    • Calls LLM (schema TradePlan) → first plan of the day (the actual open/close is derived dynamically from previous state; the first day has no memory).
  2. 10:15 — First intrahour

    • Simulates the 1-minute window from the last slice start → 10:15 using the current plan.
    • Logs a state change only if it changed by value (not identity).
    • If exited naturally (stop/target), updates memory_str, resets state, moves last_slice_start.
    • Pulls last-hour news, the OHLC dict, and current indicators.
    • Calls LLM (DecisionOutput → {summary_banknifty, trade}).
    • Flip/No-trade exit enforcement: if holding and the LLM flips side or says "No trade" → forces flatten at market price (hourly close).
    • Updates logs.
  3. 11:15 → 15:15 — Subsequent intrahours

    • Same as the 10:15 loop.
  4. 15:30 — Close

    • Simulates the last 15 minutes (15:15 → 15:30) on 1-minute data.
    • Logs the state change once; if exited → updates memory + resets.
    • Calls LLM for the close plan (schema TradePlan).
    • If holding and the plan flips or says No trade → forces flatten at close.
    • Otherwise carries the position overnight (state remains open).
    • Saves the checkpoint after each timestamp.
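The per-timestamp dispatch above reduces to a phase check; a sketch (the helper name is hypothetical, not the actual engine API):

```python
from datetime import datetime

def phase_for(ts: datetime) -> str:
    """Map an hourly-grid timestamp to the engine phase described above."""
    hm = (ts.hour, ts.minute)
    if hm == (9, 15):
        return "morning"      # morning summary + first plan of the day
    if hm == (15, 30):
        return "close"        # last 15-min simulation + close plan
    return "intrahour"        # the 10:15 ... 15:15 loop
```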

Trade simulation rules

simulate_trade_from_signal(df, trade, dt_col, state, lookback_minutes):

  • Trade schema (TradePlan):
    • status: "Trade" or "No trade"
    • type: "long" | "short" | "none"
    • entry_at, target, stoploss: numbers (positive; 0 if No trade)
  • Entry is limit-style: if entry_at falls within [low, high] of a 1-min bar → the entry fills at entry_at.
  • Exit resolution when both target & stoploss could be hit in the same bar: use the tiebreaker (the engine uses "stoploss_first").
  • P&L (pnl_pct): signed percentage vs entry (positive for a long if exit > entry; inverted for a short).
  • Flip/No-trade handling in the engine:
    • If open_position and the LLM plan flips or says No trade at the tick → force flatten at the minute close (market price) and log memory.
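Under those rules, the per-bar fill/exit logic might be sketched as follows (function names are illustrative, not the actual simulator API):

```python
def fills_entry(bar_low, bar_high, entry_at):
    """Limit-style fill: the order fills at entry_at only if the 1-min bar traded through it."""
    return bar_low <= entry_at <= bar_high

def resolve_exit(bar_low, bar_high, side, target, stoploss, tiebreak="stoploss_first"):
    """Exit price for one 1-min bar, or None if neither level was touched.
    If both levels fall inside the bar, 'stoploss_first' assumes the worse outcome."""
    hit_target = bar_high >= target if side == "long" else bar_low <= target
    hit_stop = bar_low <= stoploss if side == "long" else bar_high >= stoploss
    if hit_target and hit_stop:
        return stoploss if tiebreak == "stoploss_first" else target
    return stoploss if hit_stop else (target if hit_target else None)

def pnl_pct(side, entry, exit_price):
    """Signed percentage vs entry; shorts are inverted."""
    raw = (exit_price - entry) / entry * 100.0
    return raw if side == "long" else -raw
```

The "stoploss_first" tiebreak is deliberately pessimistic: 1-minute bars don't reveal intra-bar ordering, so assuming the stop was hit first keeps the backtest from overstating P&L.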

LLM JSON schemas

All Pydantic models enforce extra="forbid" so the model can't invent fields. The client (app/llm.py) sanitizes the schema name and forces additionalProperties: false at the root and in nested objects, satisfying strict servers.
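A belt-and-braces pass of that kind can be sketched as a recursive walk over the JSON schema (Pydantic v2's extra="forbid" already emits the flag for its own objects; this stamps it everywhere, e.g. on definitions a strict server re-checks):

```python
def force_no_additional_props(node):
    """Recursively set additionalProperties: false on every object node of a
    JSON schema, as strict structured-output servers require."""
    if isinstance(node, dict):
        if node.get("type") == "object":
            node["additionalProperties"] = False
        for value in node.values():
            force_no_additional_props(value)
    elif isinstance(node, list):
        for item in node:
            force_no_additional_props(item)
    return node

# works on any schema dict, e.g. one from SomeModel.model_json_schema()
schema = force_no_additional_props({
    "type": "object",
    "properties": {
        "trade": {"type": "object",
                  "properties": {"entry_at": {"type": "number"}}},
    },
})
```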

SummaryMorning (example)

class SummaryMorning(BaseModel):
    major_concern_nifty50: str
    trade_reasoning_nifty50: str
    trade_strategy_nifty50: str
    major_concern_banknifty: str
    trade_reasoning_banknifty: str
    trade_strategy_banknifty: str
    model_config = {"extra": "forbid"}

TradePlan

class TradePlan(BaseModel):
    status: Literal["No trade", "Trade"]
    brief_reason: str
    type: Literal["long", "short", "none"]
    entry_at: float
    target: float
    stoploss: float
    model_config = {"extra": "forbid"}

DecisionOutput

class SummaryBankNifty(BaseModel):
    major_concern: str
    sentiment: Literal["bullish", "bearish"]
    reasoning: str
    trade_strategy: str
    news_summary: str
    model_config = {"extra": "forbid"}

class DecisionOutput(BaseModel):
    summary_banknifty: SummaryBankNifty
    trade: TradePlan
    model_config = {"extra": "forbid"}
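Parsing a raw model reply back through a schema is then a single Pydantic v2 call (the JSON values below are illustrative, not real output):

```python
from typing import Literal
from pydantic import BaseModel

class TradePlan(BaseModel):
    status: Literal["No trade", "Trade"]
    brief_reason: str
    type: Literal["long", "short", "none"]
    entry_at: float
    target: float
    stoploss: float
    model_config = {"extra": "forbid"}

raw = ('{"status": "Trade", "brief_reason": "momentum continuation", '
       '"type": "long", "entry_at": 47500, "target": 47800, "stoploss": 47350}')
plan = TradePlan.model_validate_json(raw)   # raises ValidationError on bad output
```

Because extra="forbid" is set, any hallucinated field in the reply fails validation loudly instead of being silently dropped.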

Outputs

Out dir (--out-dir):

  • stats_log.xlsx – time series of state snapshots/closing stats (one row when state changes; final close row).
  • trade_log.xlsx – model trade plans over time.
  • expert_log.xlsx – morning summaries (one per day).
  • summary_log.xlsx – per-hour summaries.

Checkpoint dir (--ckpt-dir or <out-dir>/checkpoint):

  • Parquets for each log, plus checkpoint.json (state & last timestamp).