# Skip-BART
*This model card description was generated by Grok 3.*
## Model Details

- Model Name: Skip-BART
- Model Type: Transformer-based model (BART architecture) for automatic stage lighting control
- Version: 1.0
- Release Date: August 2025
- Developers: Zijian Zhao, Dian Jin
- Organization: HKUST, PolyU
- License: Apache License 2.0
- Paper: [Automatic Stage Lighting Control: Is it a Rule-Driven Process or Generative Task?](https://arxiv.org/abs/2506.01482)
- Citation:

```bibtex
@article{zhao2025automatic,
  title={Automatic Stage Lighting Control: Is it a Rule-Driven Process or Generative Task?},
  author={Zhao, Zijian and Jin, Dian and Zhou, Zijing and Zhang, Xiaoyu},
  journal={arXiv preprint arXiv:2506.01482},
  year={2025}
}
```

- Contact: [email protected]
- Repository: https://github.com/RS2002/Skip-BART
## Model Description

Skip-BART is a transformer-based model built on the Bidirectional and Auto-Regressive Transformers (BART) architecture and designed for automatic stage lighting control. It treats stage lighting design as a generative task rather than a rule-driven process: given a music input, it generates a synchronized lighting sequence. The model processes music data in an octuple format and outputs lighting control parameters, leveraging a skip-connection-enhanced BART structure for improved performance.
- Architecture: BART with skip connections
- Input Format: Encoder input (batch_size, length, 512), decoder input (batch_size, length, 2), attention masks (batch_size, length)
- Output Format: Hidden states of shape (batch_size, length, 1024)
- Hidden Size: 1024
- Training Objective: Pre-training on music data, followed by fine-tuning for lighting sequence generation
- Tasks Supported: Stage lighting sequence generation
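
The defining change from vanilla BART is the skip connection. Below is a minimal PyTorch sketch of one plausible wiring, where each decoder layer additionally receives the output of a corresponding encoder layer; the layer structure, head count, and additive fusion here are illustrative assumptions, not the authors' exact design (see the paper for details).

```python
import torch
import torch.nn as nn

class SkipDecoderLayer(nn.Module):
    """Hypothetical decoder layer with an encoder-to-decoder skip connection.

    Illustrative only: the real Skip-BART layer may fuse the encoder state
    differently (e.g., concatenation or gating instead of addition).
    """

    def __init__(self, hidden_size: int = 1024, num_heads: int = 8):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.cross_attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(hidden_size, 4 * hidden_size),
            nn.GELU(),
            nn.Linear(4 * hidden_size, hidden_size),
        )
        self.norm = nn.LayerNorm(hidden_size)

    def forward(self, x: torch.Tensor, encoder_skip: torch.Tensor) -> torch.Tensor:
        # Skip connection: add the matching encoder layer's output to the
        # decoder stream before the usual attention / FFN sublayers.
        x = self.norm(x + encoder_skip)
        x = x + self.self_attn(x, x, x)[0]
        x = x + self.cross_attn(x, encoder_skip, encoder_skip)[0]
        return x + self.ffn(x)

layer = SkipDecoderLayer()
y = layer(torch.rand(2, 1024, 1024), torch.rand(2, 1024, 1024))
print(y.shape)  # torch.Size([2, 1024, 1024])
```
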
## Training Data
The model was trained on the RPMC-L2 dataset:
- Dataset Source: RPMC-L2
- Description: Contains music and corresponding stage lighting data in a format suitable for training Skip-BART.
- Details: Refer to the paper for dataset specifics.
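
Concretely, a training example pairs a frame-aligned music feature sequence with a lighting sequence. The sketch below shows what such a pair might look like given the model's I/O spec above; the field names and the interpretation of the two integer lighting channels (e.g., discretized hue and brightness) are assumptions, not a documented dataset schema.

```python
import torch

# Hypothetical shape of one RPMC-L2-style training pair, inferred from
# the model's input/output spec (value ranges are placeholders).
example = {
    # Frame-level music features: one 512-dim vector per time step.
    "music": torch.rand(1024, 512),
    # Two integer lighting channels per time step (channel meaning is an
    # assumption -- the card does not document the encoding).
    "lighting": torch.randint(0, 10, (1024, 2)),
}
print(example["music"].shape, example["lighting"].shape)
```
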
## Usage

### Installation

```bash
git clone https://huggingface.co/RS2002/Skip-BART
```
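
Note: the example below imports `Skip_BART` from a local `model` module. If the Hugging Face repository does not ship `model.py`, the class definition should be available from the GitHub repository listed above:

```bash
git clone https://github.com/RS2002/Skip-BART
```
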
### Example Code
```python
import torch

from model import Skip_BART

# Load the pre-trained model from the Hugging Face Hub
model = Skip_BART.from_pretrained("RS2002/Skip-BART")

# Example inputs: a batch of 2 sequences of length 1024
x_encoder = torch.rand((2, 1024, 512))          # music features
x_decoder = torch.randint(0, 10, (2, 1024, 2))  # lighting tokens (2 channels)
encoder_attention_mask = torch.zeros((2, 1024))
decoder_attention_mask = torch.zeros((2, 1024))

# Forward pass returns the decoder's last hidden states
output = model(x_encoder, x_decoder, encoder_attention_mask, decoder_attention_mask)
print(output.size())  # torch.Size([2, 1024, 1024])
```
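
The forward pass returns hidden states, not lighting values directly. A hypothetical projection head that maps them to the two lighting channels might look like the following; the head, the number of bins, and the greedy decoding step are illustrative assumptions, as the card does not document the generation interface.

```python
import torch
import torch.nn as nn

# Hypothetical decoding head: project each 1024-dim hidden state to
# logits over discretized values for the two lighting channels.
# NUM_BINS is a placeholder, not a documented vocabulary size.
NUM_BINS = 256
head = nn.Linear(1024, 2 * NUM_BINS)

logits = head(output)                       # [2, 1024, 512]
logits = logits.view(2, 1024, 2, NUM_BINS)  # split into the 2 channels
lighting = logits.argmax(dim=-1)            # [2, 1024, 2], greedy bins
print(lighting.shape)
```
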