---
library_name: transformers
tags: []
---

# Model Card for Ti*k*Zero+ (10b)

Ti*k*Zero+ (10b) is a multimodal language model that automatically synthesizes
scientific figures as editable, semantics-preserving
[Ti*k*Z](https://github.com/pgf-tikz/pgf) graphics programs conditioned on text
captions. It is based on [DeTi*k*Zify<sub>v2</sub>
(8b)](https://huggingface.co/nllg/detikzify-v2-8b) and [LLaMA<sub>3.2</sub>
(1b)](https://huggingface.co/meta-llama/Llama-3.2-1B), and integrates
[Ti*k*Zero](https://huggingface.co/nllg/tikzero-adapter) with additional
end-to-end fine-tuning. Check out the
[DeTi*k*Zify](https://github.com/potamides/DeTikZify) project for more
information and tips on how best to run the model.

## Usage

```python
from detikzify.model import load
from detikzify.infer import DetikzifyPipeline

caption = "A multi-layer perceptron with two hidden layers."

# load the model weights and wrap them in an inference pipeline
pipeline = DetikzifyPipeline(*load(
    model_name_or_path="nllg/tikzero-plus-10b",
    device_map="auto",
    torch_dtype="bfloat16",
))

# generate a single TikZ program
fig = pipeline.sample(text=caption)

# if it compiles, rasterize it and show it
if fig.is_rasterizable:
    fig.rasterize().show()
```
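
Beyond single-shot sampling, the
[DeTi*k*Zify](https://github.com/potamides/DeTikZify) project also documents
MCTS-based inference for iteratively refining generated programs, and
generated figures expose their underlying Ti*k*Z source. The following is a
minimal sketch along those lines, assuming the figure interface shown in the
DeTi*k*Zify README (`code`, `save()`) and that `simulate()` accepts the same
`text` keyword as `sample()`:

```python
# print the generated TikZ source and save it as an editable .tex file
# (assumes the figure interface documented in the DeTikZify project)
print(fig.code)
fig.save("fig.tex")

# MCTS-based inference: sample and refine programs until the timeout
# (in seconds) is reached, collecting each candidate with its score;
# passing text=caption to simulate() is an assumption carried over
# from sample()
figs = set()
for score, fig in pipeline.simulate(text=caption, timeout=600):
    figs.add((score, fig))

# keep the highest-scoring program and save it if it compiles
best_score, best = max(figs, key=lambda item: item[0])
if best.is_rasterizable:
    best.save("best-fig.tex")
```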

## Acknowledgments

This model was trained using computational resources provided by the
bwForCluster Helix, as part of the bwHPC-S5 project. The authors acknowledge
support from the state of Baden-Württemberg through the bwHPC initiative and
the German Research Foundation (DFG) under grant INST 35/1597-1 FUGG.