Argilla is a collaboration tool for AI engineers and domain experts who need to build high-quality datasets for their projects.
Argilla can be used for collecting human feedback for a wide variety of AI projects like traditional NLP (text classification, NER, etc.), LLMs (RAG, preference tuning, etc.), or multimodal models (text to image, etc.). Argilla’s programmatic approach lets you build workflows for continuous evaluation and model improvement. The goal of Argilla is to ensure your data work pays off by quickly iterating on the right data and models.
The community uses Argilla to create amazing open-source datasets and models.
Argilla has also contributed models and datasets to the open-source community.
AI teams from companies like the Red Cross, Loris.ai and Prolific use Argilla to improve the quality and efficiency of AI projects. They shared their experiences in our AI community meetup.
First, log in with your Hugging Face account:

```bash
huggingface-cli login
```
Make sure you have `argilla>=2.0.0` installed:

```bash
pip install -U argilla
```
Lastly, you will need to deploy the Argilla server and UI, which can be done easily on the Hugging Face Hub as a Space.
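If you prefer to do this from Python rather than from the Hub UI, the snippet below is a hedged sketch: it assumes the `rg.Argilla.deploy_on_spaces` helper that ships with recent argilla releases, and the parameter name shown is an assumption.

```python
import argilla as rg

# Assumption: deploy_on_spaces creates an Argilla Space under your Hugging Face
# account and returns a client authenticated against the new deployment.
client = rg.Argilla.deploy_on_spaces(api_key="<api_key>")
```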
This guide shows how to import and export your dataset to and from the Hugging Face Hub.
In Argilla, you can import/export two main components of a dataset:

- The dataset's settings, defined in `rg.Settings`. This is useful if you want to share your feedback task or restore it later in Argilla (see the settings sketch below).
- The dataset's records, including `Metadata`, `Vectors`, `Suggestions`, and `Responses`. This is useful if you want to use your dataset's records outside of Argilla.
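For reference, here is a minimal, illustrative sketch of what a dataset's settings can look like in the Argilla SDK; the guidelines, field, question, and dataset names below are placeholders rather than part of this guide's dataset.

```python
import argilla as rg

client = rg.Argilla(api_url="<api_url>", api_key="<api_key>")

# Illustrative settings: one text field and one label question
settings = rg.Settings(
    guidelines="Classify the sentiment of each text.",
    fields=[rg.TextField(name="text")],
    questions=[rg.LabelQuestion(name="sentiment", labels=["positive", "negative"])],
)

# Create a dataset in Argilla with these settings
dataset = rg.Dataset(name="my_dataset", settings=settings, client=client)
dataset.create()
```

These settings are what `to_hub` and `from_hub` transfer alongside the records.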
You can push a dataset from Argilla to the Hugging Face Hub. This is useful if you want to share your dataset with the community or version-control it. You can push the dataset to the Hub using the `rg.Dataset.to_hub` method.
```python
import argilla as rg

client = rg.Argilla(api_url="<api_url>", api_key="<api_key>")

dataset = client.datasets(name="my_dataset")
dataset.to_hub(repo_id="<repo_id>")
```
The example above will push the dataset's `Settings` and records to the Hub. If you only want to push the dataset's configuration, you can set the `with_records` parameter to `False`. This is useful if you're just interested in a specific dataset template or you want to make changes to the dataset settings and/or records.
```python
dataset.to_hub(repo_id="<repo_id>", with_records=False)
```
You can pull a dataset from the Hugging Face Hub into Argilla. This is useful if you want to restore a dataset and its configuration. You can pull the dataset from the Hugging Face Hub using the `rg.Dataset.from_hub` method.
```python
import argilla as rg

client = rg.Argilla(api_url="<api_url>", api_key="<api_key>")

dataset = rg.Dataset.from_hub(repo_id="<repo_id>")
```
The `rg.Dataset.from_hub` method loads the configuration and records from the dataset repo. If you only want to load records, you can pass a `datasets.Dataset` object to the `rg.Dataset.log` method. This enables you to configure your own dataset and reuse existing Hub datasets, as sketched below.
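Here is a hedged sketch of that records-only path. It assumes the target dataset already exists in Argilla with settings that match the Hub dataset's columns; the repo id, split, and dataset name are placeholders.

```python
import argilla as rg
from datasets import load_dataset

client = rg.Argilla(api_url="<api_url>", api_key="<api_key>")

# Load only the records from an existing Hub dataset
# (selecting a single split yields a datasets.Dataset rather than a DatasetDict)
hf_records = load_dataset("<repo_id>", split="train")

# Log them into a dataset that is already configured in Argilla
dataset = client.datasets(name="my_dataset")
dataset.log(hf_records)
```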
The `from_hub` example above pulls the dataset's `Settings` and records from the Hub. If you only want to pull the dataset's configuration, you can set the `with_records` parameter to `False`. This is useful if you're just interested in a specific dataset template or you want to make changes to the dataset settings and/or records.
```python
dataset = rg.Dataset.from_hub(repo_id="<repo_id>", with_records=False)
```
With the dataset's configuration, you could then make changes to the dataset. For example, you could adapt the dataset's settings for a different task:

```python
dataset.settings.questions = [rg.TextQuestion(name="answer")]
```
You could then load the dataset's records using the `load_dataset` function of the `datasets` package and pass them to the `rg.Dataset.log` method.
```python
from datasets import load_dataset

# Load a single split so that a datasets.Dataset (not a DatasetDict) is passed to log()
hf_dataset = load_dataset("<repo_id>", split="train")
dataset.log(hf_dataset)
```