---
library_name: transformers
tags: []
---

# Model Card for TikZero Adapters

TikZero adapters can be loaded into DeTikZify v2 (8B), a multimodal language model that converts sketches and scientific figures into editable, semantics-preserving TikZ graphics programs, to additionally enable conditioning on text captions. Check out the DeTikZify project for more information and tips on how best to run the model.

## Usage

The default adapter is trained with a cosine distance objective; an alternative variant trained with MSE can be loaded by passing `adapter_kwargs=dict(revision="mse")` to the `load_adapter` function.

```python
from detikzify.model import load, load_adapter
from detikzify.infer import DetikzifyPipeline

caption = "A multi-layer perceptron with two hidden layers."
pipeline = DetikzifyPipeline(
    *load_adapter(
        *load(
            model_name_or_path="nllg/detikzify-v2-8b",
            device_map="auto",
            torch_dtype="bfloat16",
        ),
        adapter_name_or_path="nllg/tikzero-adapter",
        # adapter_kwargs=dict(revision="mse"),  # load the variant trained with MSE
    )
)

# generate a single TikZ program
fig = pipeline.sample(text=caption)

# if it compiles, rasterize it and show it
if fig.is_rasterizable:
    fig.rasterize().show()
```
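Since sampling is stochastic, not every generated program compiles. A small retry helper, sketched below, can resample until a rasterizable figure is produced; it is not part of the DeTikZify API, only a convenience wrapper around the `pipeline.sample` and `fig.is_rasterizable` calls shown above.

```python
def sample_until_rasterizable(pipeline, caption, max_tries=5):
    """Resample until the TikZ program compiles, or return the last attempt.

    A convenience sketch (not part of the DeTikZify API); `pipeline.sample`
    and the `is_rasterizable` attribute are taken from the usage example above.
    """
    for _ in range(max_tries):
        fig = pipeline.sample(text=caption)
        if fig.is_rasterizable:
            break
    return fig
```

Note that each retry runs a full generation pass, so a small `max_tries` keeps worst-case latency bounded.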

## Acknowledgments

This model was trained using computational resources provided by the bwForCluster Helix, as part of the bwHPC-S5 project. The authors acknowledge support from the state of Baden-Württemberg through the bwHPC initiative and the German Research Foundation (DFG) under grant INST 35/1597-1 FUGG.