
CSI-BERT2

This model card description was generated by Grok 3.

Model Details

Model Description

CSI-BERT2 is an upgraded, BERT-inspired transformer model for Channel State Information (CSI) prediction and classification in wireless communication and sensing. It improves upon CSI-BERT with an optimized model and code structure, supporting tasks such as CSI recovery, CSI prediction, gesture recognition, fall detection, people identification, and people-number estimation. The model processes CSI amplitude data and supports adversarial training with a GAN-based discriminator.

  • Architecture: BERT-based transformer with optional GAN discriminator
  • Input Format: CSI amplitude (batch_size, length, receiver_num * carrier_dim), attention mask (batch_size, length), optional timestamp (batch_size, length)
  • Output Format: Hidden states of shape (batch_size, length, hidden_dim)
  • Hidden Size: 128
  • Training Objective: MLM pre-training with GAN (optional) and task-specific fine-tuning
  • Tasks Supported: CSI recovery, CSI prediction, CSI classification

Training Data

The model was trained on the following datasets:

  • Public Datasets:
    • WiGesture: Gesture recognition, people identification
    • WiFall: Action recognition, fall detection, people identification
  • Proposed Dataset:
  • Data Structure:
    • Amplitude: (batch_size, length, receiver_num * carrier_dim)
    • Timestamp: (batch_size, length) (optional)
    • Label: (batch_size)
  • Note: Refer to CSI-BERT for data preparation details. Custom dataloaders may be needed for specific tasks.
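The data structure above maps directly onto a PyTorch `Dataset`. The sketch below is a hypothetical wrapper (the class name `CSIDataset` and the dummy shapes are assumptions, not part of the released code); for real experiments, follow the CSI-BERT data-preparation instructions.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class CSIDataset(Dataset):
    """Hypothetical wrapper for pre-extracted CSI arrays.

    amplitude: (num_samples, length, receiver_num * carrier_dim)
    timestamp: (num_samples, length)  -- optional in the original format
    label:     (num_samples,)
    """
    def __init__(self, amplitude, timestamp, label):
        self.amplitude = torch.as_tensor(amplitude, dtype=torch.float32)
        self.timestamp = torch.as_tensor(timestamp, dtype=torch.float32)
        self.label = torch.as_tensor(label, dtype=torch.long)

    def __len__(self):
        return len(self.label)

    def __getitem__(self, idx):
        return self.amplitude[idx], self.timestamp[idx], self.label[idx]

# Dummy data: 8 samples, sequence length 100, 52 subcarriers (assumed sizes)
dataset = CSIDataset(torch.rand(8, 100, 52), torch.rand(8, 100), torch.zeros(8))
loader = DataLoader(dataset, batch_size=4, shuffle=True)
csi, ts, y = next(iter(loader))
print(csi.shape)  # torch.Size([4, 100, 52])
```

A custom `__getitem__` like this is also the natural place to hook in task-specific masking or augmentation when building the dataloaders mentioned above.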

Usage

Installation

```shell
git clone https://huggingface.co/RS2002/CSI-BERT2
```

Example Code

```python
import torch
from model import CSI_BERT2

model = CSI_BERT2.from_pretrained("RS2002/CSI-BERT2")

csi = torch.rand((2, 100, 52))          # amplitude: (batch_size, length, receiver_num * carrier_dim)
time_stamp = torch.rand((2, 100))       # timestamps: (batch_size, length)
attention_mask = torch.zeros((2, 100))  # attention mask: (batch_size, length)

y = model(csi, time_stamp, attention_mask)
print(y.shape)  # torch.Size([2, 100, 128]) -> (batch_size, length, hidden_dim)
```
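For the classification tasks listed above, the hidden states can be pooled over the time axis and passed through a small task head. This is only a minimal sketch: the mean-pooling choice, the linear head, and the six-class setup are assumptions for illustration, and `hidden` stands in for the model output rather than a real forward pass.

```python
import torch
import torch.nn as nn

# Stand-in for CSI-BERT2 output: (batch_size, length, hidden_dim) = (2, 100, 128)
hidden = torch.rand(2, 100, 128)

num_classes = 6  # e.g. six gesture classes (assumed)
head = nn.Linear(128, num_classes)

pooled = hidden.mean(dim=1)   # mean-pool over the time axis -> (batch_size, 128)
logits = head(pooled)         # (batch_size, num_classes)
pred = logits.argmax(dim=-1)  # predicted class per sample
print(logits.shape)  # torch.Size([2, 6])
```

In practice, the head would be fine-tuned jointly with the pre-trained backbone on the labeled dataset for the task at hand.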