ret_demo

A demo for multilingual retrieval.

Download

Run the following in a bash script:

git clone https://hf-mirror.com/datasets/cfli/ret_demo
# the data fetched by git clone is incomplete; delete it all, then re-download
for lang in "ar" "bn" "de" "en" "es" "fa" "fi" "fr" "hi" "id" "ja" "ko" "ru" "sw" "te" "th" "yo" "zh"
do
    # remove the old files
    rm -f ./ret_demo/data/${lang}/corpus.jsonl
    rm -f ./ret_demo/data/${lang}/dev_qrels.jsonl
    rm -f ./ret_demo/data/${lang}/dev_queries.jsonl
    rm -f ./ret_demo/emb/${lang}/corpus.npy
done

for lang in "ar" "bn" "de" "en" "es" "fa" "fi" "fr" "hi" "id" "ja" "ko" "ru" "sw" "te" "th" "yo" "zh"
do
    # download each file and move it into place
    wget https://hf-mirror.com/datasets/cfli/ret_demo/resolve/main/emb/${lang}/corpus.npy
    mv corpus.npy ./ret_demo/emb/${lang}/

    wget https://hf-mirror.com/datasets/cfli/ret_demo/resolve/main/data/${lang}/corpus.jsonl
    mv corpus.jsonl ./ret_demo/data/${lang}/

    wget https://hf-mirror.com/datasets/cfli/ret_demo/resolve/main/data/${lang}/dev_qrels.jsonl
    mv dev_qrels.jsonl ./ret_demo/data/${lang}/
    
    wget https://hf-mirror.com/datasets/cfli/ret_demo/resolve/main/data/${lang}/dev_queries.jsonl
    mv dev_queries.jsonl ./ret_demo/data/${lang}/
done
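After the script finishes, each `data/<lang>/corpus.jsonl` should pair with a same-order embedding matrix `emb/<lang>/corpus.npy` (one row per corpus line). A minimal sanity check, sketched here on a synthetic two-document corpus since the real files must be downloaded first (the `id`/`text` fields in the synthetic records are illustrative, not the dataset's guaranteed schema):

```python
import json, os, tempfile
import numpy as np

def check_lang_dir(data_dir, emb_dir):
    """Count corpus lines and load the embedding matrix for one language."""
    with open(os.path.join(data_dir, "corpus.jsonl"), encoding="utf-8") as f:
        n_docs = sum(1 for line in f if line.strip())
    emb = np.load(os.path.join(emb_dir, "corpus.npy"))
    return n_docs, emb.shape

# Synthetic stand-in for ./ret_demo/data/<lang> and ./ret_demo/emb/<lang>
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "corpus.jsonl"), "w", encoding="utf-8") as f:
    for doc in [{"id": "0", "text": "hello"}, {"id": "1", "text": "world"}]:
        f.write(json.dumps(doc) + "\n")
np.save(os.path.join(tmp, "corpus.npy"), np.zeros((2, 8), dtype=np.float32))

n_docs, shape = check_lang_dir(tmp, tmp)
assert n_docs == shape[0], "corpus/embedding count mismatch"
print(n_docs, shape)  # 2 (2, 8)
```

Run the same check against `./ret_demo/data/${lang}` and `./ret_demo/emb/${lang}` to confirm a download completed for that language.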

Dependencies

pip install gradio
pip install -U FlagEmbedding
# faiss GPU wheel for Python 3.10 (cp310); pick the wheel matching your Python version
pip install https://github.com/kyamagu/faiss-wheels/releases/download/v1.7.3/faiss_gpu-1.7.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl

Usage

  1. If the model has not been downloaded before, set `export HF_ENDPOINT=https://hf-mirror.com` before running app.py.

  2. To download the model to a specific local path first, use the following code:

    import os
    os.environ['HF_ENDPOINT'] = 'https://hf-mirror.com'  # set the mirror before importing transformers
    
    save_path = './save_model'
    
    from transformers import AutoTokenizer, AutoModel
    tokenizer = AutoTokenizer.from_pretrained('BAAI/bge-multilingual-gemma2')
    model = AutoModel.from_pretrained('BAAI/bge-multilingual-gemma2')
    
    # save both to the local path for offline loading
    tokenizer.save_pretrained(save_path)
    model.save_pretrained(save_path)
    
  3. Change the model path on line 23 of utils.py: set `self_model_path` to the `save_path` above, or set `self_model_path = model_path`.

  4. To load the model on the CPU, modify lines 25-45 of utils.py as follows:

    def load_model_util(previous_model, model_path):
        self_model_path = ''  # use the save_path above, or set self_model_path = model_path
        if model_path == 'BAAI/bge-multilingual-gemma2':
            if previous_model is not None and previous_model.model_name_or_path == self_model_path:
                return previous_model
            model = FlagLLMModel(self_model_path,
                                query_instruction_for_retrieval="Given a question, retrieve Wikipedia passages that answer the question.",
                                query_instruction_format="<instruct>{}\n<query>{}",
                                use_fp16=False,
                                devices=['cpu'])
        else:
            if previous_model is not None and previous_model.model_name_or_path == model_path:
                return previous_model
            model = FlagAutoModel.from_finetuned(model_path,
                                                 use_fp16=False,
                                                 devices=['cpu'])
        if previous_model is not None:
            del previous_model
        return model
    
  5. Change the values of `data_dir` and `index_dir` on lines 11 and 12 of app.py to point to your local data and index paths.

  6. To save the faiss index on each run, modify line 81 of app.py to call `faiss.write_index(faiss_index, index_path)` (loading an index takes about as long as building one, so saving is not strictly necessary).

  7. Run app.py.