Apply for community grant: Personal project (gpu)
Project: Interactive Demo for SARM-4B Interpretable Reward Model
This is a request for a GPU grant for our Hugging Face Space, which hosts the official interactive demo for our SARM-4B model. SARM (Sparse Autoencoder-enhanced Reward Model) is a novel architecture introduced in our research paper, "Interpretable Reward Model via Sparse Autoencoder" (arXiv:2508.08746), aimed at improving the transparency and interpretability of reward models used in LLM alignment.
Our work makes reward models less of a "black box" by allowing for feature-level analysis of AI preferences. Importantly, our SARM-4B model achieves state-of-the-art performance on the RewardBench 2 benchmark, demonstrating that interpretability can be achieved without sacrificing performance.
The SARM-4B model is based on Llama-3.1-8B and is over 9GB in size. Running inference on a CPU is extremely slow, leading to a poor user experience and making it difficult for the community to explore our research. A ZeroGPU grant would allow us to provide a fast, responsive demo, making our open-source contribution to AI alignment much more accessible to researchers and developers.
Thank you for your consideration and support for the open-source AI community.
Hi @Schrieffer, we've assigned ZeroGPU to this Space. Please check the compatibility and usage sections of this page so your Space can run on ZeroGPU.
If you can, we ask that you upgrade to Pro ($9/month) to enjoy higher ZeroGPU quota and other features like Dev Mode, Private Storage, and more: hf.co/pro
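For context, ZeroGPU compatibility mainly comes down to decorating the GPU-dependent functions with `spaces.GPU` so the hardware is attached only while they run. Below is a minimal, hedged sketch (the `score` function and its body are illustrative placeholders, not the Space's actual code; the fallback decorator is only for running outside a Space):

```python
# Minimal sketch of a ZeroGPU-compatible entry point.
# The `spaces` package exists only inside Hugging Face Spaces;
# we fall back to a no-op decorator for local testing.
try:
    import spaces
    gpu = spaces.GPU
except ImportError:
    def gpu(fn=None, **kwargs):
        # Support both @gpu and @gpu(duration=...) usage.
        if fn is None:
            return lambda f: f
        return fn

@gpu(duration=120)  # request the GPU only for the duration of this call
def score(prompt: str, response: str) -> float:
    # Placeholder: load the reward model lazily and run inference on CUDA here.
    return 0.0
```

The key point is that model loading and inference happen inside the decorated function, so the Space holds no GPU while idle.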
@hysts thank you so much! I'm excited to use ZeroGPU. I'll check the compatibility guide right away. Also, thanks for the suggestion about upgrading to Pro.