# AToMiC Prebuilt Indexes

## Example Usage

### Reproduction
Toolkit: https://github.com/TREC-AToMiC/AToMiC/tree/main/examples/dense_retriever_baselines
```bash
# Skip the encode and index steps; search with the prebuilt indexes and topics directly.

# Text-to-image (t2i) retrieval
python search.py \
    --topics topics/openai.clip-vit-base-patch32.text.validation \
    --index indexes/openai.clip-vit-base-patch32.image.faiss.flat \
    --hits 1000 \
    --output runs/run.openai.clip-vit-base-patch32.validation.t2i.large.trec

# Image-to-text (i2t) retrieval
python search.py \
    --topics topics/openai.clip-vit-base-patch32.image.validation \
    --index indexes/openai.clip-vit-base-patch32.text.faiss.flat \
    --hits 1000 \
    --output runs/run.openai.clip-vit-base-patch32.validation.i2t.large.trec
```
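
The run files are written in standard TREC format, so they can be scored with any TREC evaluation tool once relevance judgments are available. Below is a minimal sketch using `pytrec_eval`, which is not part of the toolkit above; the qrels path is a placeholder for whichever judgments file you evaluate against.

```python
import pytrec_eval

QRELS_PATH = 'qrels/validation.t2i.qrels'  # placeholder: point this at your qrels file
RUN_PATH = 'runs/run.openai.clip-vit-base-patch32.validation.t2i.large.trec'

with open(QRELS_PATH) as f:
    qrels = pytrec_eval.parse_qrel(f)
with open(RUN_PATH) as f:
    run = pytrec_eval.parse_run(f)

# Score the run per query, then report the mean over all evaluated queries
evaluator = pytrec_eval.RelevanceEvaluator(qrels, {'map', 'recip_rank'})
per_query = evaluator.evaluate(run)
for measure in ('map', 'recip_rank'):
    print(measure, sum(q[measure] for q in per_query.values()) / len(per_query))
```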
### Explore AToMiC datasets
```python
import torch
from pathlib import Path
from datasets import load_dataset
from transformers import AutoModel, AutoProcessor

INDEX_DIR = 'indexes'
INDEX_NAME = 'openai.clip-vit-base-patch32.image.faiss.flat'
QUERY = 'Elizabeth II'

# Load the image collection and attach the prebuilt FAISS index
images = load_dataset('TREC-AToMiC/AToMiC-Images-v0.2', split='train')
images.load_faiss_index(index_name=INDEX_NAME, file=Path(INDEX_DIR, INDEX_NAME, 'index'))

model = AutoModel.from_pretrained('openai/clip-vit-base-patch32')
processor = AutoProcessor.from_pretrained('openai/clip-vit-base-patch32')

# The prebuilt indexes contain L2-normalized vectors, so normalize the query embedding to match
with torch.no_grad():
    q_embedding = model.get_text_features(**processor(text=QUERY, return_tensors="pt"))
    q_embedding = torch.nn.functional.normalize(q_embedding, dim=-1).detach().numpy()

scores, retrieved = images.get_nearest_examples(INDEX_NAME, q_embedding, k=10)
```