---
viewer: false
tags:
- uv-script
---
# Atlas Export
Generate and deploy interactive embedding visualizations to HuggingFace Spaces with a single command using Apple's Embedding Atlas library!
## Quick Start
```bash
# Create a Space from any text dataset
uv run atlas-export.py stanfordnlp/imdb --space-name my-imdb-viz

# Your Space will be live at:
# https://huggingface.co/spaces/YOUR_USERNAME/my-imdb-viz
```
⚠️ **Note:** This approach is currently limited by the file storage capacity of Spaces, so for large datasets consider using the `--sample` option to limit the number of points visualized!
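For example, a capped run on a larger dataset might look like this (a minimal sketch; the Space name and sample size are just placeholders):

```bash
# Visualize only 20,000 sampled points to stay within Spaces storage limits
uv run atlas-export.py stanfordnlp/imdb \
  --space-name imdb-sample-viz \
  --sample 20000
```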
## Examples
### Image Datasets
```bash
# Visualize image datasets with CLIP
uv run atlas-export.py \
  beans \
  --space-name bean-disease-atlas \
  --image-column image \
  --model openai/clip-vit-base-patch32
```
### Custom Embeddings
```bash
# Use a specific embedding model
uv run atlas-export.py \
  wikipedia \
  --space-name wiki-viz \
  --model nomic-ai/nomic-embed-text-v1.5 \
  --text-column text \
  --sample 50000
```
### Pre-computed Embeddings
```bash
# If you already have embeddings in your dataset
uv run atlas-export.py \
  my-dataset-with-embeddings \
  --space-name my-viz \
  --no-compute-embeddings \
  --x-column umap_x \
  --y-column umap_y
```
## GPU Acceleration (HF Jobs)
```bash
# First, get your HF token (if not already set)
python -c "from huggingface_hub import get_token; print(get_token())"

# Run on HF Jobs with GPU using experimental UV support
hf jobs uv run --flavor t4-small \
  -s HF_TOKEN=your-token-here \
  https://huggingface.co/datasets/uv-scripts/build-atlas/raw/main/atlas-export.py \
  stanfordnlp/imdb \
  --space-name imdb-viz \
  --model sentence-transformers/all-mpnet-base-v2 \
  --sample 10000

# With larger GPU and custom batch size for faster processing
hf jobs uv run --flavor a10g-large \
  -s HF_TOKEN=your-token-here \
  https://huggingface.co/datasets/uv-scripts/build-atlas/raw/main/atlas-export.py \
  your-dataset \
  --space-name your-atlas \
  --batch-size 64 \
  --sample 50000 \
  --text-column output
```
**Note:** Replace `your-token-here` with your actual token. Available GPU flavors: `t4-small`, `t4-medium`, `l4x1`, `a10g-small`, `a10g-large`.
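One way to avoid pasting the token by hand is to inline the token lookup into the job submission with shell command substitution. This is just a convenience sketch combining the two commands above; any of the listed flavors will work:

```bash
# Pass the locally stored token straight to the job via command substitution
hf jobs uv run --flavor t4-small \
  -s HF_TOKEN="$(python -c 'from huggingface_hub import get_token; print(get_token())')" \
  https://huggingface.co/datasets/uv-scripts/build-atlas/raw/main/atlas-export.py \
  stanfordnlp/imdb \
  --space-name imdb-viz \
  --sample 10000
```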
## Key Options
| Option | Description | Default |
|---|---|---|
| `dataset_id` | HuggingFace dataset to visualize | Required |
| `--space-name` | Name for your Space | Required |
| `--model` | Embedding model to use | Auto-selected |
| `--text-column` | Column containing text | `"text"` |
| `--image-column` | Column containing images | None |
| `--sample` | Number of samples to visualize | All |
| `--batch-size` | Batch size for embedding generation | 32 (text), 16 (images) |
| `--split` | Dataset split to use | `"train"` |
| `--local-only` | Generate locally without deploying | False |
| `--output-dir` | Local output directory | Temp dir |
| `--hf-token` | HuggingFace API token | From env/CLI |
Run without arguments to see all options and more examples.
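For instance, to inspect the output before publishing, you might combine `--local-only` and `--output-dir` from the table above (a sketch; the output directory name is just a placeholder):

```bash
# Generate the visualization locally without creating a Space
uv run atlas-export.py stanfordnlp/imdb \
  --space-name imdb-viz \
  --local-only \
  --output-dir ./atlas-output \
  --sample 5000
```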
## How It Works
- Loads dataset from HuggingFace Hub
- Generates embeddings (or uses pre-computed)
- Creates static web app with embedded data
- Deploys to HF Space
The resulting visualization runs entirely in the browser using WebGPU acceleration.
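If you generated the app locally with `--local-only`, one way to preview it is to serve the output directory with any static file server (a sketch; `./atlas-output` is whatever you passed to `--output-dir`):

```bash
# Serve the static app locally, then open http://localhost:8000 in a WebGPU-capable browser
python -m http.server 8000 --directory ./atlas-output
```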
## Credits
Built on Embedding Atlas by Apple. See the documentation for more details about the underlying technology.
Part of the UV Scripts collection 🚀