# tinyllama-pretrained-custom
A custom pre-trained version of TinyLlama-1.1B for German language tasks.
## Features
- Fine-tuned on German text data
- Optimized for specific use cases
- Compact size (1.1B parameters)
## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("SpookyFab/tinyllama-pretrained-custom")
model = AutoModelForCausalLM.from_pretrained("SpookyFab/tinyllama-pretrained-custom")

# Example usage with a German prompt, since the model targets German text
inputs = tokenizer("Hallo, wie geht es dir?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
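For quick experiments, the same checkpoint can also be loaded through the `transformers` text-generation pipeline, which wraps tokenization, generation, and decoding in one call. A minimal sketch, assuming default loading settings (the sampling parameters below are illustrative, not tuned values):

```python
from transformers import pipeline

# Load model and tokenizer in a single step via the text-generation pipeline
generator = pipeline("text-generation", model="SpookyFab/tinyllama-pretrained-custom")

# Generate a short German continuation; sampling settings are illustrative
result = generator(
    "Hallo, wie geht es dir?",
    max_new_tokens=50,
    do_sample=True,
    temperature=0.7,
)
print(result[0]["generated_text"])
```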
## Base model

This model is fine-tuned from [TinyLlama/TinyLlama-1.1B-Chat-v1.0](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0).