---
language:
  - en
license: apache-2.0
tags:
  - text-generation-inference
  - transformers
  - unsloth
  - llama
  - trl
  - sft
base_model: 01-ai/Yi-9B
datasets: jondurbin/airoboros-3.2
---

# airoboros-9b-3.2

## Prompt Template: ChatML
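
For reference, ChatML wraps each message in `<|im_start|>` / `<|im_end|>` markers, so a prompt built from a system and a user message looks roughly like this (a sketch of the format, not the exact template string applied by the server):

```
<|im_start|>system
{system}<|im_end|>
<|im_start|>user
{user}<|im_end|>
<|im_start|>assistant
```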

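The snippet below assumes a vLLM OpenAI-compatible server is already serving the model locally, e.g. started with something like `python -m vllm.entrypoints.openai.api_server --model macadeliccc/airoboros-9b-3.2` (the exact invocation depends on your vLLM version).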
```python
from openai import OpenAI

# Set OpenAI's API key and API base to use vLLM's API server.
openai_api_key = "EMPTY"
openai_api_base = "http://localhost:8000/v1"

client = OpenAI(
    api_key=openai_api_key,
    base_url=openai_api_base,
)

system = """
BEGININPUT
BEGINCONTEXT
date: 2021-01-01
url: https://web.site/123
ENDCONTEXT
In a shocking turn of events, blueberries are now green, but will be sticking with the same name.
ENDINPUT
BEGININSTRUCTION
{user}
ENDINSTRUCTION
"""

user = "What is the new color of blueberries?"
chat_response = client.chat.completions.create(
    model="macadeliccc/airoboros-9b-3.2",
    messages=[
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]
)
print("Chat response:", chat_response)
```

## Expected Output

```
Chat response: ChatCompletion(id='cmpl-6bce7c051ffd41878624683faea90719', choices=[Choice(finish_reason='stop', index=0, logprobs=None, message=ChatCompletionMessage(content='Blueberries are now green.', role='assistant', function_call=None, tool_calls=None))], created=273292, model='macadeliccc/airoboros-9b-3.2', object='chat.completion', system_fingerprint=None, usage=CompletionUsage(completion_tokens=7, prompt_tokens=119, total_tokens=126))
```
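The `BEGININPUT` / `BEGINCONTEXT` blocks in the system prompt follow the airoboros contextual question-answering style, which steers the model to answer from the supplied context (hence the green blueberries) rather than from its own knowledge.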

- Developed by: macadeliccc
- License: apache-2.0
- Finetuned from model: 01-ai/Yi-9B

This Llama-architecture model was trained 2x faster with Unsloth and Hugging Face's TRL library.
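
For local inference without a vLLM server, a minimal transformers sketch is shown below; the dtype, generation settings, and the generic system prompt are illustrative assumptions, and it relies on the repository's tokenizer shipping a ChatML chat template (otherwise, build the prompt manually as shown above).

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "macadeliccc/airoboros-9b-3.2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # illustrative; adjust for your hardware
    device_map="auto",
)

messages = [
    # Generic system prompt for illustration; swap in the contextual
    # BEGININPUT/BEGININSTRUCTION prompt from the example above as needed.
    {"role": "system", "content": "You are a helpful, unbiased assistant."},
    {"role": "user", "content": "What is the new color of blueberries?"},
]

# Relies on the tokenizer's chat template to produce the ChatML formatting.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```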