
Installation[[installation]]

Install ๐Ÿค— Transformers for whichever deep learning library you're working with, set up your cache, and optionally configure ๐Ÿค— Transformers to run offline.

๐Ÿค— Transformers is tested on Python 3.6+, PyTorch 1.1.0+, TensorFlow 2.0+, and Flax. To install the deep learning library you're using, follow its official installation instructions:

  • Install PyTorch
  • Install TensorFlow 2.0
  • Install Flax

Install with pip[[install-with-pip]]

We recommend installing ๐Ÿค— Transformers in a virtual environment. If you're unfamiliar with Python virtual environments, take a look at this guide. A virtual environment makes it easier to manage different projects and avoids compatibility issues between dependencies.

๋จผ์ € ํ”„๋กœ์ ํŠธ ๋””๋ ‰ํ† ๋ฆฌ์—์„œ ๊ฐ€์ƒ ํ™˜๊ฒฝ์„ ๋งŒ๋“ค์–ด ์ค๋‹ˆ๋‹ค.

python -m venv .env

๊ฐ€์ƒ ํ™˜๊ฒฝ์„ ํ™œ์„ฑํ™”ํ•ด์ฃผ์„ธ์š”. Linux๋‚˜ MacOS์˜ ๊ฒฝ์šฐ:

source .env/bin/activate

Windows์˜ ๊ฒฝ์šฐ:

.env/Scripts/activate

Now you're ready to install ๐Ÿค— Transformers with the following command:

pip install transformers

CPU๋งŒ ์จ๋„ ๋œ๋‹ค๋ฉด, ๐Ÿค— Transformers์™€ ๋”ฅ๋Ÿฌ๋‹ ๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ๋ฅผ ๋‹จ 1์ค„๋กœ ์„ค์น˜ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์˜ˆ๋ฅผ ๋“ค์–ด ๐Ÿค— Transformers์™€ PyTorch์˜ ๊ฒฝ์šฐ:

pip install transformers[torch]

๐Ÿค— Transformers and TensorFlow 2.0:

pip install transformers[tf-cpu]

๐Ÿค— Transformers and Flax:

pip install transformers[flax]

Finally, check whether ๐Ÿค— Transformers was installed properly by running the following command. It will download a pretrained model:

python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('we love you'))"

๋ผ๋ฒจ๊ณผ ์ ์ˆ˜๊ฐ€ ์ถœ๋ ฅ๋˜๋ฉด ์ž˜ ์„ค์น˜๋œ ๊ฒƒ์ž…๋‹ˆ๋‹ค.

[{'label': 'POSITIVE', 'score': 0.9998704791069031}]

์†Œ์Šค์—์„œ ์„ค์น˜ํ•˜๊ธฐ[[install-from-source]]

To install ๐Ÿค— Transformers from source, run the following command:

pip install git+https://github.com/huggingface/transformers

์œ„ ๋ช…๋ น์€ ์ตœ์‹ ์ด์ง€๋งŒ (์•ˆ์ •์ ์ธ) stable ๋ฒ„์ „์ด ์•„๋‹Œ ์‹คํ—˜์„ฑ์ด ์ง™์€ main ๋ฒ„์ „์„ ์„ค์น˜ํ•ฉ๋‹ˆ๋‹ค. main ๋ฒ„์ „์€ ๊ฐœ๋ฐœ ํ˜„ํ™ฉ๊ณผ ๋ฐœ๋งž์ถ”๋Š”๋ฐ ์œ ์šฉํ•ฉ๋‹ˆ๋‹ค. ์˜ˆ์‹œ๋กœ ๋งˆ์ง€๋ง‰ ๊ณต์‹ ๋ฆด๋ฆฌ์Šค ์ดํ›„ ๋ฐœ๊ฒฌ๋œ ๋ฒ„๊ทธ๊ฐ€ ํŒจ์น˜๋˜์—ˆ์ง€๋งŒ, ์ƒˆ ๋ฆด๋ฆฌ์Šค๋กœ ์•„์ง ๋กค์•„์›ƒ๋˜์ง€๋Š” ์•Š์€ ๊ฒฝ์šฐ๋ฅผ ๋“ค ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ๋ฐ”๊ฟ” ๋งํ•˜๋ฉด main ๋ฒ„์ „์ด ์•ˆ์ •์„ฑ๊ณผ๋Š” ๊ฑฐ๋ฆฌ๊ฐ€ ์žˆ๋‹ค๋Š” ๋œป์ด๊ธฐ๋„ ํ•ฉ๋‹ˆ๋‹ค. ์ €ํฌ๋Š” main ๋ฒ„์ „์„ ์‚ฌ์šฉํ•˜๋Š”๋ฐ ๋ฌธ์ œ๊ฐ€ ์—†๋„๋ก ๋…ธ๋ ฅํ•˜๊ณ  ์žˆ์œผ๋ฉฐ, ๋Œ€๋ถ€๋ถ„์˜ ๋ฌธ์ œ๋Š” ๋Œ€๊ฐœ ๋ช‡ ์‹œ๊ฐ„์ด๋‚˜ ํ•˜๋ฃจ ์•ˆ์— ํ•ด๊ฒฐ๋ฉ๋‹ˆ๋‹ค. ๋งŒ์•ฝ ๋ฌธ์ œ๊ฐ€ ๋ฐœ์ƒํ•˜๋ฉด ์ด์Šˆ๋ฅผ ์—ด์–ด์ฃผ์‹œ๋ฉด ๋” ๋นจ๋ฆฌ ํ•ด๊ฒฐํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค!

As before, check whether ๐Ÿค— Transformers was installed properly:

python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('I love you'))"

Editable install[[editable-install]]

You will need an editable install if you'd like to:

  • Use the main version of the source code.
  • Test changes in the code while contributing to ๐Ÿค— Transformers.

๋ฆฌํฌ์ง€ํ„ฐ๋ฆฌ๋ฅผ ๋ณต์ œํ•˜๊ณ  ๐Ÿค— Transformers๋ฅผ ์„ค์น˜ํ•˜๋ ค๋ฉด ๋‹ค์Œ ๋ช…๋ น์„ ์ž…๋ ฅํ•ด์ฃผ์„ธ์š”.

git clone https://github.com/huggingface/transformers.git
cd transformers
pip install -e .

์œ„ ๋ช…๋ น์€ ๋ฆฌํฌ์ง€ํ„ฐ๋ฆฌ๋ฅผ ๋ณต์ œํ•œ ์œ„์น˜์˜ ํด๋”์™€ Python ๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ์˜ ๊ฒฝ๋กœ๋ฅผ ์—ฐ๊ฒฐ์‹œํ‚ต๋‹ˆ๋‹ค. Python์ด ์ผ๋ฐ˜ ๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ ๊ฒฝ๋กœ ์™ธ์— ๋ณต์ œํ•œ ํด๋” ๋‚ด๋ถ€๋ฅผ ํ™•์ธํ•  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ์˜ˆ๋ฅผ ๋“ค์–ด Python ํŒจํ‚ค์ง€๊ฐ€ ์ผ๋ฐ˜์ ์œผ๋กœ ~/anaconda3/envs/main/lib/python3.7/site-packages/์— ์„ค์น˜๋˜์–ด ์žˆ๋Š”๋ฐ, ๋ช…๋ น์„ ๋ฐ›์€ Python์ด ์ด์ œ ๋ณต์ œํ•œ ํด๋”์ธ ~/transformers/๋„ ๊ฒ€์ƒ‰ํ•˜๊ฒŒ ๋ฉ๋‹ˆ๋‹ค.

๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ๋ฅผ ๊ณ„์† ์‚ฌ์šฉํ•˜๋ ค๋ฉด transformers ํด๋”๋ฅผ ๊ผญ ์œ ์ง€ํ•ด์•ผ ํ•ฉ๋‹ˆ๋‹ค.

๋ณต์ œ๋ณธ์€ ์ตœ์‹  ๋ฒ„์ „์˜ ๐Ÿค— Transformers๋กœ ์‰ฝ๊ฒŒ ์—…๋ฐ์ดํŠธํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

cd ~/transformers/
git pull

Python ํ™˜๊ฒฝ์„ ๋‹ค์‹œ ์‹คํ–‰ํ•˜๋ฉด ์—…๋ฐ์ดํŠธ๋œ ๐Ÿค— Transformers์˜ main ๋ฒ„์ „์„ ์ฐพ์•„๋‚ผ ๊ฒƒ์ž…๋‹ˆ๋‹ค.

Install with conda[[install-with-conda]]

Install from the conda channel huggingface:

conda install -c huggingface transformers
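Whether you installed with pip, conda, or from source, you can confirm which version ended up on your path with a short snippet:

```python
import transformers

# Print the installed ๐Ÿค— Transformers version string.
print(transformers.__version__)
```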

์บ์‹œ ๊ตฌ์„ฑํ•˜๊ธฐ[[cache-setup]]

์‚ฌ์ „ํ›ˆ๋ จ๋œ ๋ชจ๋ธ์€ ๋‹ค์šด๋กœ๋“œ๋œ ํ›„ ๋กœ์ปฌ ๊ฒฝ๋กœ ~/.cache/huggingface/hub์— ์บ์‹œ๋ฉ๋‹ˆ๋‹ค. ์…ธ ํ™˜๊ฒฝ ๋ณ€์ˆ˜ TRANSFORMERS_CACHE์˜ ๊ธฐ๋ณธ ๋””๋ ‰ํ„ฐ๋ฆฌ์ž…๋‹ˆ๋‹ค. Windows์˜ ๊ฒฝ์šฐ ๊ธฐ๋ณธ ๋””๋ ‰ํ„ฐ๋ฆฌ๋Š” C:\Users\username\.cache\huggingface\hub์ž…๋‹ˆ๋‹ค. ์•„๋ž˜์˜ ์…ธ ํ™˜๊ฒฝ ๋ณ€์ˆ˜๋ฅผ (์šฐ์„  ์ˆœ์œ„) ์ˆœ์„œ๋Œ€๋กœ ๋ณ€๊ฒฝํ•˜์—ฌ ๋‹ค๋ฅธ ์บ์‹œ ๋””๋ ‰ํ† ๋ฆฌ๋ฅผ ์ง€์ •ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

  1. Shell environment variable (default): HUGGINGFACE_HUB_CACHE or TRANSFORMERS_CACHE
  2. Shell environment variable: HF_HOME
  3. Shell environment variable: XDG_CACHE_HOME + /huggingface

If you are coming from an earlier iteration of ๐Ÿค— Transformers and have set the shell environment variables PYTORCH_TRANSFORMERS_CACHE or PYTORCH_PRETRAINED_BERT_CACHE, those will be used unless you specify the shell environment variable TRANSFORMERS_CACHE.
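For example, to relocate the cache you could set HF_HOME before ๐Ÿค— Transformers is imported (keeping in mind that TRANSFORMERS_CACHE, if also set, takes priority for model files). The directory used here is only an illustration:

```python
import os

# Example only: point the Hugging Face cache at a different directory.
# This must run before transformers is imported; any writable path works.
os.environ["HF_HOME"] = "/mnt/storage/huggingface"
```

The equivalent shell form is `export HF_HOME=/mnt/storage/huggingface` in your shell profile.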

์˜คํ”„๋ผ์ธ ๋ชจ๋“œ[[offline-mode]]

๐Ÿค— Transformers can run in a firewalled or offline environment by only using local files. Set the environment variable TRANSFORMERS_OFFLINE=1 to enable this behavior.

Add ๐Ÿค— Datasets to your offline training workflow by setting the environment variable HF_DATASETS_OFFLINE=1.

For example, on a normal network firewalled to external instances, you would typically run a program with the following command:

python examples/pytorch/translation/run_translation.py --model_name_or_path t5-small --dataset_name wmt16 --dataset_config ro-en ...

์˜คํ”„๋ผ์ธ ๊ธฐ๊ธฐ์—์„œ ๋™์ผํ•œ ํ”„๋กœ๊ทธ๋žจ์„ ๋‹ค์Œ๊ณผ ๊ฐ™์ด ์‹คํ–‰ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

HF_DATASETS_OFFLINE=1 TRANSFORMERS_OFFLINE=1 \
python examples/pytorch/translation/run_translation.py --model_name_or_path t5-small --dataset_name wmt16 --dataset_config ro-en ...

์ด์ œ ์Šคํฌ๋ฆฝํŠธ๋Š” ๋กœ์ปฌ ํŒŒ์ผ์— ํ•œํ•ด์„œ๋งŒ ๊ฒ€์ƒ‰ํ•  ๊ฒƒ์ด๋ฏ€๋กœ, ์Šคํฌ๋ฆฝํŠธ๊ฐ€ ์ค‘๋‹จ๋˜๊ฑฐ๋‚˜ ์‹œ๊ฐ„์ด ์ดˆ๊ณผ๋  ๋•Œ๊นŒ์ง€ ๋ฉˆ์ถฐ์žˆ์ง€ ์•Š๊ณ  ์ž˜ ์‹คํ–‰๋  ๊ฒƒ์ž…๋‹ˆ๋‹ค.

์˜คํ”„๋ผ์ธ์šฉ ๋ชจ๋ธ ๋ฐ ํ† ํฌ๋‚˜์ด์ € ๋งŒ๋“ค์–ด๋‘๊ธฐ[[fetch-models-and-tokenizers-to-use-offline]]

Another option for using ๐Ÿค— Transformers offline is to download the files ahead of time, and then point to their local path when you need to use them offline. There are three ways to do this:

  • Download a file through the user interface on the Model Hub by clicking on the ↓ icon.


  • Use the [PreTrainedModel.from_pretrained] and [PreTrainedModel.save_pretrained] workflow:

    1. Download your files ahead of time with [PreTrainedModel.from_pretrained]:
    >>> from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
    
    >>> tokenizer = AutoTokenizer.from_pretrained("bigscience/T0_3B")
    >>> model = AutoModelForSeq2SeqLM.from_pretrained("bigscience/T0_3B")
    
    2. Save your files to a specified directory with [PreTrainedModel.save_pretrained]:
    >>> tokenizer.save_pretrained("./your/path/bigscience_t0")
    >>> model.save_pretrained("./your/path/bigscience_t0")
    
    3. Now when you're offline, reload your files with [PreTrainedModel.from_pretrained] from the specified directory:
    >>> tokenizer = AutoTokenizer.from_pretrained("./your/path/bigscience_t0")
    >>> model = AutoModelForSeq2SeqLM.from_pretrained("./your/path/bigscience_t0")
    
  • Programmatically download files with the huggingface_hub library:

    1. ๊ฐ€์ƒํ™˜๊ฒฝ์— huggingface_hub ๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ๋ฅผ ์„ค์น˜ํ•˜์„ธ์š”.
    python -m pip install huggingface_hub
    
    2. Use the hf_hub_download function to download a file to a specific path. For example, the following command downloads the config.json file of the T0 model to your desired path:
    >>> from huggingface_hub import hf_hub_download
    
    >>> hf_hub_download(repo_id="bigscience/T0_3B", filename="config.json", cache_dir="./your/path/bigscience_t0")
    

ํŒŒ์ผ์„ ๋‹ค์šด๋กœ๋“œํ•˜๊ณ  ๋กœ์ปฌ์— ์บ์‹œ ํ•ด๋†“๊ณ  ๋‚˜๋ฉด, ๋‚˜์ค‘์— ๋ถˆ๋Ÿฌ์™€ ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ๋„๋ก ๋กœ์ปฌ ๊ฒฝ๋กœ๋ฅผ ์ง€์ •ํ•ด๋‘์„ธ์š”.

>>> from transformers import AutoConfig

>>> config = AutoConfig.from_pretrained("./your/path/bigscience_t0/config.json")

Hub์— ์ €์žฅ๋œ ํŒŒ์ผ์„ ๋‹ค์šด๋กœ๋“œํ•˜๋Š” ๋ฐฉ๋ฒ•์„ ๋” ์ž์„ธํžˆ ์•Œ์•„๋ณด๋ ค๋ฉด Hub์—์„œ ํŒŒ์ผ ๋‹ค์šด๋กœ๋“œํ•˜๊ธฐ ์„น์…˜์„ ์ฐธ๊ณ ํ•ด์ฃผ์„ธ์š”.