Add transformers tag and example code snippet
Congrats on the release and on making it to top 100 trending models!
Following my colleague Aritra's suggestion [here](https://huggingface.co/PerceptronAI/Isaac-0.1/discussions/1), I’ve added some tags and a code snippet 😊
    	
README.md CHANGED

````diff
@@ -1,9 +1,13 @@
 ---
 license: cc-by-nc-4.0
 base_model:
 - Qwen/Qwen3-1.7B
 - google/siglip2-so400m-patch14-384
+library_name: transformers
+tags:
+- perceptron
+- issac-0.1
 ---
 
 # [Isaac-0.1 by Perceptron](https://www.perceptron.inc/blog/introducing-isaac-0-1)
 *Note this is the Post-trained model* [Try out the model on our playground](https://www.perceptron.inc/demo)
@@ -37,8 +41,26 @@ A new interaction pattern where language and vision stay in lockstep: every clai
 
 
 ## Example 
+
 ```bash
 pip install perceptron
 ```
 
-
+## Example using transformers
+
+Learn more: [Huggingface Example Repo](https://github.com/perceptron-ai-inc/perceptron/tree/main/huggingface)
+
+```bash
+!git clone https://github.com/perceptron-ai-inc/perceptron.git
+!cp -r perceptron/huggingface ./huggingface
+```
+
+```python
+from transformers import AutoTokenizer, AutoConfig, AutoModelForCausalLM
+from huggingface.modular_isaac import IsaacProcessor
+
+tokenizer = AutoTokenizer.from_pretrained("PerceptronAI/Isaac-0.1", trust_remote_code=True, use_fast=False)
+config = AutoConfig.from_pretrained("PerceptronAI/Isaac-0.1", trust_remote_code=True)
+processor = IsaacProcessor(tokenizer=tokenizer, config=config)
+model = AutoModelForCausalLM.from_pretrained("PerceptronAI/Isaac-0.1", trust_remote_code=True)
+```
````

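Also worth flagging for anyone copying the snippet: `from huggingface.modular_isaac import IsaacProcessor` only resolves because the `cp -r` step places a `huggingface/` package in the current working directory, which Python scripts and notebooks put on `sys.path`. A minimal sketch of that import mechanism, using a hypothetical `demo_pkg` package standing in for the copied folder (none of these names are part of the Perceptron repo):

```python
import os
import sys
import tempfile

# Simulate `cp -r perceptron/huggingface ./huggingface`: create a package
# directory with an __init__.py and a module, then import from it.
workdir = tempfile.mkdtemp()
pkg_dir = os.path.join(workdir, "demo_pkg")
os.makedirs(pkg_dir)
open(os.path.join(pkg_dir, "__init__.py"), "w").close()
with open(os.path.join(pkg_dir, "modular_demo.py"), "w") as f:
    f.write(
        "class DemoProcessor:\n"
        "    def __init__(self, tokenizer=None, config=None):\n"
        "        self.tokenizer = tokenizer\n"
        "        self.config = config\n"
    )

# Scripts and notebooks add the working directory to sys.path automatically;
# here we add the temp dir explicitly to stand in for that behavior.
sys.path.insert(0, workdir)
from demo_pkg.modular_demo import DemoProcessor

processor = DemoProcessor(tokenizer="tok")
print(type(processor).__name__)  # → DemoProcessor
```

This is why the clone/copy must happen before the Python block runs: without the package on the path, the import fails with `ModuleNotFoundError`.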
