Upload modular_isaac.py

#4 · opened by merve (HF Staff)

Hello, and congrats on the release!
This PR makes the model loadable with no additional dependencies beyond transformers, which is very convenient for users:

from transformers import AutoTokenizer, AutoConfig, AutoModelForCausalLM, AutoProcessor

# Load everything through transformers; trust_remote_code pulls in the
# custom code (modular_isaac.py) shipped with the repository.
tokenizer = AutoTokenizer.from_pretrained("Perceptron/Isaac-0.1", trust_remote_code=True, use_fast=False)
config = AutoConfig.from_pretrained("Perceptron/Isaac-0.1", trust_remote_code=True)
processor = AutoProcessor.from_pretrained("Perceptron/Isaac-0.1", tokenizer=tokenizer, config=config)
model = AutoModelForCausalLM.from_pretrained("Perceptron/Isaac-0.1", trust_remote_code=True)

You can also add a small inference notebook I made (after replacing the username with Perceptron): https://colab.research.google.com/drive/1BHl2ZT8cYZ0HlP_q4HllFuCXWIBX_R_2?usp=sharing

If you add it to the repository as "notebook.ipynb", it gets a one-click open button, making it easier for people to try out your model as well!

Perceptron AI org

What is your recommendation around this for the core transformers repo: https://github.com/huggingface/transformers/pull/40962

TensorStream is a core abstraction for us that we will continue to optimize and improve; our intention in keeping it in the perceptron package was to make it easier to centralize improvements across open code bases.

