This model uses the Mamba architecture and was trained on a dataset of research abstracts.

  • Optimizer: AdamW
  • Learning rate: 0.001

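The full training loop is not published in this card; the following is a minimal sketch of how an AdamW optimizer with the stated learning rate could be set up, loading the model via `from_pretrained` as shown further below and assuming `Mamba` behaves like a standard `torch.nn.Module`:

```python
import torch

from model import Mamba  # from this repo's code folder

# Minimal sketch (not the published training script): AdamW with lr = 0.001,
# assuming Mamba exposes .parameters() like a regular torch.nn.Module.
model = Mamba.from_pretrained("pt-sk/mamba").to("cuda")
optimizer = torch.optim.AdamW(model.parameters(), lr=0.001)
```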
Import the `Mamba` and `ModelArgs` classes from the scripts in the code folder:

```python
from model import Mamba, ModelArgs
```

Loading Model

```python
mamba_model = Mamba.from_pretrained("pt-sk/mamba").to("cuda")
```

Loading Tokenizer

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('pt-sk/mamba')
```

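With the model and tokenizer loaded, text can be generated from a prompt. The sketch below assumes that calling the model on a batch of token IDs returns next-token logits of shape `(batch, seq_len, vocab_size)`; check `model.py` in the code folder for the actual forward interface.

```python
import torch
from transformers import AutoTokenizer

from model import Mamba

tokenizer = AutoTokenizer.from_pretrained("pt-sk/mamba")
mamba_model = Mamba.from_pretrained("pt-sk/mamba").to("cuda")
mamba_model.eval()

prompt = "We propose a novel approach to"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to("cuda")

# Greedy decoding, one token at a time (assumes the model returns raw logits).
for _ in range(50):
    with torch.no_grad():
        logits = mamba_model(input_ids)
    next_id = logits[:, -1, :].argmax(dim=-1, keepdim=True)
    input_ids = torch.cat([input_ids, next_id], dim=1)

print(tokenizer.decode(input_ids[0], skip_special_tokens=True))
```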
The mamba_reserach file contains the state dicts of the model and the optimizer.

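To resume training from that checkpoint, something like the following should work. The key names `"model"` and `"optimizer"` are assumptions about how the saved dict is laid out; inspect `checkpoint.keys()` to confirm the actual structure.

```python
import torch

from model import Mamba

# Minimal resume sketch; the "model"/"optimizer" keys are assumed, not documented.
checkpoint = torch.load("mamba_reserach", map_location="cuda")

mamba_model = Mamba.from_pretrained("pt-sk/mamba").to("cuda")
mamba_model.load_state_dict(checkpoint["model"])

optimizer = torch.optim.AdamW(mamba_model.parameters(), lr=0.001)
optimizer.load_state_dict(checkpoint["optimizer"])
```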