# Model Card for TikZero Adapters

TikZero adapters can be loaded into DeTikZify v2 (8B), a multimodal language model that converts sketches and scientific figures into editable, semantics-preserving TikZ graphics programs, to enable conditioning on text captions. The adapter itself comprises 414M parameters stored in F32. Check out the DeTikZify project for more information and tips on how best to run the model.

## Usage

The default adapter was trained with a cosine distance objective. An alternative variant trained with mean squared error (MSE) can be loaded by passing `adapter_kwargs=dict(revision="mse")` to the `load_adapter` function, as shown in the commented-out line below.

```python
from detikzify.model import load, load_adapter
from detikzify.infer import DetikzifyPipeline

caption = "A multi-layer perceptron with two hidden layers."

# load the base model, then inject the TikZero adapter into it
pipeline = DetikzifyPipeline(
    *load_adapter(
        *load(
            model_name_or_path="nllg/detikzify-v2-8b",
            device_map="auto",
            torch_dtype="bfloat16",
        ),
        adapter_name_or_path="nllg/tikzero-adapter",
        # adapter_kwargs=dict(revision="mse"),  # load variant trained with MSE
    )
)

# generate a single TikZ program conditioned on the caption
fig = pipeline.sample(text=caption)

# if it compiles, rasterize it and show it
if fig.is_rasterizable:
    fig.rasterize().show()
```
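
Because sampling is stochastic, a single draw may yield a program that does not compile. Below is a minimal sketch, using only the pipeline calls shown above, that retries generation up to a fixed budget until a rasterizable figure is produced:

```python
# keep sampling until a program compiles, up to a fixed retry budget
for _ in range(10):
    fig = pipeline.sample(text=caption)
    if fig.is_rasterizable:
        fig.rasterize().show()
        break
```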

## Acknowledgments

This model was trained using computational resources provided by the bwForCluster Helix, as part of the bwHPC-S5 project. The authors acknowledge support from the state of Baden-Württemberg through the bwHPC initiative and the German Research Foundation (DFG) under grant INST 35/1597-1 FUGG.
