
# Export to ONNX[[export-to-onnx]]

If you need to deploy 🤗 Transformers models in production environments, we recommend exporting them to a serialized format that can be loaded and executed on specialized runtimes and hardware. In this guide, we'll show you how to export 🤗 Transformers models to ONNX (Open Neural Network eXchange).

ONNX is an open standard that defines a common set of operators and a common file format to represent deep learning models, and it is supported by a wide variety of frameworks, including PyTorch and TensorFlow. When a model is exported to the ONNX format, these operators are used to construct a computational graph (often called an _intermediate representation_, or IR) which represents the flow of data through the neural network, i.e. which operations are applied to which inputs.

By exposing a graph with standardized operators and data types, ONNX makes it easy to switch between frameworks. For example, a model trained in PyTorch can be exported to ONNX format and then imported in TensorFlow (and vice versa).

🤗 Transformers provides a `transformers.onnx` package that enables you to convert model checkpoints to an ONNX graph by leveraging configuration objects. These configuration objects come ready-made for a number of model architectures, and are designed to be easily extendable to other architectures.

You can also export 🤗 Transformers models with the `optimum.exporters.onnx` package from 🤗 Optimum.

๋ชจ๋ธ์„ ๋‚ด๋ณด๋‚ธ ํ›„ ๋‹ค์Œ๊ณผ ๊ฐ™์ด ์‚ฌ์šฉ๋  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค:

  • ์–‘์žํ™” ๋ฐ ๊ทธ๋ž˜ํ”„ ์ตœ์ ํ™”์™€ ๊ฐ™์€ ๊ธฐ์ˆ ์„ ํ†ตํ•ด ์ถ”๋ก ์— ์ตœ์ ํ™”ํ•ฉ๋‹ˆ๋‹ค.
  • ORTModelForXXX ํด๋ž˜์Šค๋ฅผ ํ†ตํ•ด ONNX ๋Ÿฐํƒ€์ž„์—์„œ ์‹คํ–‰ํ•ฉ๋‹ˆ๋‹ค. ์ด ํด๋ž˜์Šค๋“ค์€ ๐Ÿค— Transformers์—์„œ ์‚ฌ์šฉํ•˜๋Š” AutoModel API์™€ ๋™์ผํ•ฉ๋‹ˆ๋‹ค.
  • ์ตœ์ ํ™”๋œ ์ถ”๋ก  ํŒŒ์ดํ”„๋ผ์ธ ์œ„์— ์‹คํ–‰ํ•ฉ๋‹ˆ๋‹ค. ์ด ํŒŒ์ดํ”„๋ผ์ธ์€ ๐Ÿค— Transformers์˜ [pipeline] ํ•จ์ˆ˜์™€ ๋™์ผํ•œ API๋ฅผ ๊ฐ–์Šต๋‹ˆ๋‹ค.

์ด๋Ÿฌํ•œ ๊ธฐ๋Šฅ์„ ๋ชจ๋‘ ์‚ดํŽด๋ณด๋ ค๋ฉด ๐Ÿค— Optimum ๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ๋ฅผ ํ™•์ธํ•˜์„ธ์š”.

Ready-made configurations include the following architectures:

- ALBERT
- BART
- BEiT
- BERT
- BigBird
- BigBird-Pegasus
- Blenderbot
- BlenderbotSmall
- BLOOM
- CamemBERT
- Chinese-CLIP
- CLIP
- CodeGen
- Conditional DETR
- ConvBERT
- ConvNeXT
- Data2VecText
- Data2VecVision
- DeBERTa
- DeBERTa-v2
- DeiT
- DETR
- DistilBERT
- EfficientNet
- ELECTRA
- ERNIE
- FlauBERT
- GPT Neo
- GPT-J
- GPT-Sw3
- GroupViT
- I-BERT
- ImageGPT
- LayoutLM
- LayoutLMv3
- LeViT
- Longformer
- LongT5
- M2M100
- Marian
- mBART
- MEGA
- MobileBERT
- MobileNetV1
- MobileNetV2
- MobileViT
- MT5
- OpenAI GPT-2
- OWL-ViT
- Perceiver
- PLBart
- PoolFormer
- RemBERT
- ResNet
- RoBERTa
- RoBERTa-PreLayerNorm
- RoFormer
- SegFormer
- SqueezeBERT
- Swin Transformer
- T5
- Table Transformer
- Vision Encoder decoder
- ViT
- Whisper
- X-MOD
- XLM
- XLM-RoBERTa
- XLM-RoBERTa-XL
- YOLOS

In the next two sections, we'll show you how to:

- Export a supported model using the `transformers.onnx` package.
- Export a custom model for an unsupported architecture.

๋ชจ๋ธ์„ ONNX๋กœ ๋‚ด๋ณด๋‚ด๊ธฐ[[exporting-a-model-to-onnx]]

์ด์ œ ๋ชจ๋ธ์„ ๋‚ด๋ณด๋‚ผ ๋•Œ optimum.exporters.onnx๋ฅผ ์‚ฌ์šฉํ•˜๋„๋ก ๊ถŒ์žฅํ•ฉ๋‹ˆ๋‹ค. transformers.onnx์™€ ๋งค์šฐ ์œ ์‚ฌํ•˜๋‹ˆ ๊ฑฑ์ •ํ•˜์ง€ ๋งˆ์„ธ์š”!

๐Ÿค— Transformers ๋ชจ๋ธ์„ ONNX๋กœ ๋‚ด๋ณด๋‚ด๋ ค๋ฉด ๋จผ์ € ๋ช‡ ๊ฐ€์ง€ ์ถ”๊ฐ€ ์ข…์†์„ฑ์„ ์„ค์น˜ํ•ด์•ผํ•ฉ๋‹ˆ๋‹ค:

pip install transformers[onnx]

The `transformers.onnx` package can then be used as a Python module:

```bash
python -m transformers.onnx --help

usage: Hugging Face Transformers ONNX exporter [-h] -m MODEL [--feature {causal-lm, ...}] [--opset OPSET] [--atol ATOL] output

positional arguments:
  output                Path indicating where to store generated ONNX model.

optional arguments:
  -h, --help            show this help message and exit
  -m MODEL, --model MODEL
                        Model ID on huggingface.co or path on disk to load model from.
  --feature {causal-lm, ...}
                        The type of features to export the model with.
  --opset OPSET         ONNX opset version to export the model with.
  --atol ATOL           Absolute difference tolerance when validating the model.
```

๋‹ค์Œ๊ณผ ๊ฐ™์ด ๋ฏธ๋ฆฌ ์ œ์ž‘๋œ ๊ตฌ์„ฑ์„ ์‚ฌ์šฉํ•˜์—ฌ ์ฒดํฌํฌ์ธํŠธ๋ฅผ ๋‚ด๋ณด๋‚ผ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค:

python -m transformers.onnx --model=distilbert-base-uncased onnx/

๋‹ค์Œ๊ณผ ๊ฐ™์€ ๋กœ๊ทธ๊ฐ€ ํ‘œ์‹œ๋˜์–ด์•ผํ•ฉ๋‹ˆ๋‹ค:

Validating ONNX model...
        -[โœ“] ONNX model output names match reference model ({'last_hidden_state'})
        - Validating ONNX Model output "last_hidden_state":
                -[โœ“] (2, 8, 768) matches (2, 8, 768)
                -[โœ“] all values close (atol: 1e-05)
All good, model saved at: onnx/model.onnx

This exports an ONNX graph of the checkpoint defined by the `--model` argument. In this example it is `distilbert-base-uncased`, but it can be any checkpoint on the Hugging Face Hub or one that's stored locally.

The resulting `model.onnx` file can then be run on one of the many accelerators that support the ONNX standard. For example, we can load and run the model with ONNX Runtime as follows:

```python
>>> from transformers import AutoTokenizer
>>> from onnxruntime import InferenceSession

>>> tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
>>> session = InferenceSession("onnx/model.onnx")
>>> # ONNX Runtime expects NumPy arrays as input
>>> inputs = tokenizer("Using DistilBERT with ONNX Runtime!", return_tensors="np")
>>> outputs = session.run(output_names=["last_hidden_state"], input_feed=dict(inputs))
```

The required output names (i.e. `["last_hidden_state"]`) can be obtained by taking a look at the ONNX configuration of each model. For example, for DistilBERT we have:

```python
>>> from transformers.models.distilbert import DistilBertConfig, DistilBertOnnxConfig

>>> config = DistilBertConfig()
>>> onnx_config = DistilBertOnnxConfig(config)
>>> print(list(onnx_config.outputs.keys()))
["last_hidden_state"]
```

Hub์˜ TensorFlow ์ฒดํฌํฌ์ธํŠธ์˜ ๊ฒฝ์šฐ์—๋„ ๊ณผ์ •์€ ๋™์ผํ•ฉ๋‹ˆ๋‹ค. ์˜ˆ๋ฅผ ๋“ค์–ด, ๋‹ค์Œ๊ณผ ๊ฐ™์ด Keras organization์—์„œ TensorFlow ์ฒดํฌํฌ์ธํŠธ๋ฅผ ๋‚ด๋ณด๋‚ผ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค:

```bash
python -m transformers.onnx --model=keras-io/transformers-qa onnx/
```

To export a model that's stored locally, you'll need to have the model's weights and tokenizer files saved in a directory. For example, we can load and save a checkpoint as follows:

```python
>>> from transformers import AutoTokenizer, AutoModelForSequenceClassification

>>> # Load tokenizer and PyTorch weights from the Hub
>>> tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
>>> pt_model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
>>> # Save to disk
>>> tokenizer.save_pretrained("local-pt-checkpoint")
>>> pt_model.save_pretrained("local-pt-checkpoint")
```

์ฒดํฌํฌ์ธํŠธ๋ฅผ ์ €์žฅํ•œ ํ›„, transformers.onnx ํŒจํ‚ค์ง€์˜ --model ์ธ์ˆ˜๋ฅผ ์›ํ•˜๋Š” ๋””๋ ‰ํ† ๋ฆฌ๋กœ ์ง€์ •ํ•˜์—ฌ ONNX๋กœ ๋‚ด๋ณด๋‚ผ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค:

python -m transformers.onnx --model=local-pt-checkpoint onnx/

```python
>>> from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

>>> # Load tokenizer and TensorFlow weights from the Hub
>>> tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
>>> tf_model = TFAutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased")
>>> # Save to disk
>>> tokenizer.save_pretrained("local-tf-checkpoint")
>>> tf_model.save_pretrained("local-tf-checkpoint")
```

์ฒดํฌํฌ์ธํŠธ๋ฅผ ์ €์žฅํ•œ ํ›„, transformers.onnx ํŒจํ‚ค์ง€์˜ --model ์ธ์ˆ˜๋ฅผ ์›ํ•˜๋Š” ๋””๋ ‰ํ† ๋ฆฌ๋กœ ์ง€์ •ํ•˜์—ฌ ONNX๋กœ ๋‚ด๋ณด๋‚ผ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค:

python -m transformers.onnx --model=local-tf-checkpoint onnx/

๋‹ค๋ฅธ ๋ชจ๋ธ ์ž‘์—…์— ๋Œ€ํ•œ ๊ธฐ๋Šฅ ์„ ํƒ[[selecting-features-for-different-model-tasks]]

์ด์ œ ๋ชจ๋ธ์„ ๋‚ด๋ณด๋‚ผ ๋•Œ optimum.exporters.onnx๋ฅผ ์‚ฌ์šฉํ•˜๋„๋ก ๊ถŒ์žฅํ•ฉ๋‹ˆ๋‹ค. ์ž‘์—…์„ ์„ ํƒํ•˜๋Š” ๋ฐฉ๋ฒ•์„ ์•Œ์•„๋ณด๋ ค๋ฉด ๐Ÿค— Optimum ๋ฌธ์„œ๋ฅผ ํ™•์ธํ•˜์„ธ์š”.

๋‹ค๋ฅธ ์œ ํ˜•์˜ ํƒœ์Šคํฌ์— ๋งž์ถฐ์„œ ๋ชจ๋ธ์„ ๋‚ด๋ณด๋‚ผ ์ˆ˜ ์žˆ๋„๋ก ๋ฏธ๋ฆฌ ์ œ์ž‘๋œ ๊ตฌ์„ฑ๋งˆ๋‹ค ์ผ๋ จ์˜ _๊ธฐ๋Šฅ_์ด ํฌํ•จ๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค. ์•„๋ž˜ ํ‘œ์— ๋‚˜์™€ ์žˆ๋Š”๋Œ€๋กœ ๊ฐ ๊ธฐ๋Šฅ์€ ๋‹ค๋ฅธ AutoClass์™€ ์—ฐ๊ด€๋˜์–ด ์žˆ์Šต๋‹ˆ๋‹ค.

| Feature                              | Auto Class                           |
| ------------------------------------ | ------------------------------------ |
| `causal-lm`, `causal-lm-with-past`   | `AutoModelForCausalLM`               |
| `default`, `default-with-past`       | `AutoModel`                          |
| `masked-lm`                          | `AutoModelForMaskedLM`               |
| `question-answering`                 | `AutoModelForQuestionAnswering`      |
| `seq2seq-lm`, `seq2seq-lm-with-past` | `AutoModelForSeq2SeqLM`              |
| `sequence-classification`            | `AutoModelForSequenceClassification` |
| `token-classification`               | `AutoModelForTokenClassification`    |

๊ฐ ๊ตฌ์„ฑ์—์„œ [~transformers.onnx.FeaturesManager]๋ฅผ ํ†ตํ•ด ์ง€์›๋˜๋Š” ๊ธฐ๋Šฅ ๋ชฉ๋ก์„ ์ฐพ์„ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์˜ˆ๋ฅผ ๋“ค์–ด, DistilBERT์˜ ๊ฒฝ์šฐ ๋‹ค์Œ๊ณผ ๊ฐ™์Šต๋‹ˆ๋‹ค:

```python
>>> from transformers.onnx.features import FeaturesManager

>>> distilbert_features = list(FeaturesManager.get_supported_features_for_model_type("distilbert").keys())
>>> print(distilbert_features)
["default", "masked-lm", "causal-lm", "sequence-classification", "token-classification", "question-answering"]
```

You can then pass one of these features to the `--feature` argument in the `transformers.onnx` package. For example, to export a text-classification model, we can pick a fine-tuned model from the Hub and run:

```bash
python -m transformers.onnx --model=distilbert-base-uncased-finetuned-sst-2-english \
                            --feature=sequence-classification onnx/
```

๋‹ค์Œ๊ณผ ๊ฐ™์€ ๋กœ๊ทธ๊ฐ€ ํ‘œ์‹œ๋ฉ๋‹ˆ๋‹ค:

Validating ONNX model...
        -[โœ“] ONNX model output names match reference model ({'logits'})
        - Validating ONNX Model output "logits":
                -[โœ“] (2, 2) matches (2, 2)
                -[โœ“] all values close (atol: 1e-05)
All good, model saved at: onnx/model.onnx

Notice that in this case, the output names from the fine-tuned model are `logits` instead of the `last_hidden_state` we saw with the `distilbert-base-uncased` checkpoint earlier. This is expected, since the model was fine-tuned for sequence classification.

The features that have a `with-past` suffix (e.g. `causal-lm-with-past`) correspond to model classes with precomputed hidden states (key and value pairs in the attention blocks) that can be used for fast autoregressive decoding.
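
The speed-up from these `with-past` variants comes purely from caching, not from any change in the math: at each decoding step, only the newest token needs to be projected into keys and values, while the projections for earlier tokens are reused from the cache. The toy NumPy sketch below is not part of the exporter; the array sizes and names are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                                  # toy hidden size
W_k = rng.standard_normal((d, d))      # key projection
W_v = rng.standard_normal((d, d))      # value projection
tokens = rng.standard_normal((6, d))   # embeddings of 6 generated tokens

# Without past: every step re-projects the entire prefix.
def keys_values_no_past(prefix):
    return prefix @ W_k, prefix @ W_v

# With past: only the newest token is projected, then appended to the cache.
past_k = np.empty((0, d))
past_v = np.empty((0, d))
for t in range(tokens.shape[0]):
    new = tokens[t : t + 1]
    past_k = np.concatenate([past_k, new @ W_k])
    past_v = np.concatenate([past_v, new @ W_v])

# Both strategies yield the same keys and values; the cached version
# just avoids redoing O(sequence_length) work at every step.
full_k, full_v = keys_values_no_past(tokens)
assert np.allclose(full_k, past_k) and np.allclose(full_v, past_v)
```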

For `VisionEncoderDecoder`-type models, the encoder and decoder parts are exported separately as two ONNX files named `encoder_model.onnx` and `decoder_model.onnx` respectively.

์ง€์›๋˜์ง€ ์•Š๋Š” ์•„ํ‚คํ…์ฒ˜๋ฅผ ์œ„ํ•œ ๋ชจ๋ธ ๋‚ด๋ณด๋‚ด๊ธฐ[[exporting-a-model-for-an-unsupported-architecture]]

ํ˜„์žฌ ๋‚ด๋ณด๋‚ผ ์ˆ˜ ์—†๋Š” ๋ชจ๋ธ์„ ์ง€์›ํ•˜๋„๋ก ๊ธฐ์—ฌํ•˜๋ ค๋ฉด ๋จผ์ € optimum.exporters.onnx์—์„œ ์ง€์›๋˜๋Š”์ง€ ํ™•์ธํ•˜๊ณ  ์ง€์›๋˜์ง€ ์•Š๋Š” ๊ฒฝ์šฐ ๐Ÿค— Optimum์— ๊ธฐ์—ฌํ•˜์„ธ์š”.

๋ผ์ด๋ธŒ๋Ÿฌ๋ฆฌ์—์„œ ์ง์ ‘ ์ง€์›ํ•˜์ง€ ์•Š๋Š” ์•„ํ‚คํ…์ฒ˜์˜ ๋ชจ๋ธ์„ ๋‚ด๋ณด๋‚ด๋ ค๋ฉด ์„ธ ๊ฐ€์ง€ ์ฃผ์š” ๋‹จ๊ณ„๋ฅผ ๊ฑฐ์ณ์•ผ ํ•ฉ๋‹ˆ๋‹ค:

  1. ์‚ฌ์šฉ์ž ์ •์˜ ONNX ๊ตฌ์„ฑ์„ ๊ตฌํ˜„ํ•˜๊ธฐ
  2. ๋ชจ๋ธ์„ ONNX๋กœ ๋‚ด๋ณด๋‚ด๊ธฐ
  3. PyTorch ๋ฐ ๋‚ด๋ณด๋‚ธ ๋ชจ๋ธ์˜ ์ถœ๋ ฅ ๊ฒ€์ฆํ•˜๊ธฐ

์ด ์„น์…˜์—์„œ๋Š” DistilBERT๊ฐ€ ์–ด๋–ป๊ฒŒ ๊ตฌํ˜„๋˜์—ˆ๋Š”์ง€ ๊ฐ ๋‹จ๊ณ„๋งˆ๋‹ค ์ž์„ธํžˆ ์‚ดํŽด๋ณด๊ฒ ์Šต๋‹ˆ๋‹ค.

### Implementing a custom ONNX configuration[[implementing-a-custom-onnx-configuration]]

Let's start with the ONNX configuration object. We provide three abstract classes that you should inherit from, depending on the type of model architecture you wish to export:

- Encoder-based models inherit from [`~onnx.config.OnnxConfig`]
- Decoder-based models inherit from [`~onnx.config.OnnxConfigWithPast`]
- Encoder-decoder models inherit from [`~onnx.config.OnnxSeq2SeqConfigWithPast`]

A good way to implement a custom ONNX configuration is to check the existing implementation in the `configuration_<model_name>.py` file of a similar architecture.

Since DistilBERT is an encoder-based model, its configuration inherits from `OnnxConfig`:

```python
>>> from typing import Mapping, OrderedDict
>>> from transformers.onnx import OnnxConfig


>>> class DistilBertOnnxConfig(OnnxConfig):
...     @property
...     def inputs(self) -> Mapping[str, Mapping[int, str]]:
...         return OrderedDict(
...             [
...                 ("input_ids", {0: "batch", 1: "sequence"}),
...                 ("attention_mask", {0: "batch", 1: "sequence"}),
...             ]
...         )
```

๊ฐ ๊ตฌ์„ฑ ๊ฐ์ฒด๋Š” inputs ์†์„ฑ์„ ๊ตฌํ˜„ํ•˜๊ณ  ๋งคํ•‘์„ ๋ฐ˜ํ™˜ํ•ด์•ผ ํ•ฉ๋‹ˆ๋‹ค. ๋งคํ•‘์˜ ํ‚ค๋Š” ์˜ˆ์ƒ ์ž…๋ ฅ์— ํ•ด๋‹นํ•˜๊ณ  ๊ฐ’์€ ํ•ด๋‹น ์ž…๋ ฅ์˜ ์ถ•์„ ๋‚˜ํƒ€๋ƒ…๋‹ˆ๋‹ค. DistilBERT์˜ ๊ฒฝ์šฐ input_ids ๋ฐ attention_mask ๋‘ ๊ฐœ์˜ ์ž…๋ ฅ์ด ํ•„์š”ํ•œ๋ฐ์š”. ๋‘ ์ž…๋ ฅ ๋ชจ๋‘ (batch_size, sequence_length)์˜ ๋™์ผํ•œ ์ฐจ์›์ด๊ธฐ ๋•Œ๋ฌธ์— ๊ตฌ์„ฑ์—์„œ๋„ ๋˜‘๊ฐ™์€ ์ถ•์„ ์‚ฌ์šฉํ•ฉ๋‹ˆ๋‹ค.

DistilBertOnnxConfig์˜ inputs ์†์„ฑ์ด OrderedDict๋ผ๋Š” ๊ฒƒ์— ์œ ์˜ํ•˜์„ธ์š”. ์ด๋ ‡๊ฒŒ ํ•˜๋ฉด ์ž…๋ ฅ์ด ๊ทธ๋ž˜ํ”„๋ฅผ ๋”ฐ๋ผ ํ๋ฅผ ๋•Œ PreTrainedModel.forward() ๋ฉ”์†Œ๋“œ ์† ์•Œ๋งž์€ ์ƒ๋Œ€์ ์ธ ์œ„์น˜์— ์žˆ๋„๋ก ๋ณด์žฅํ•ฉ๋‹ˆ๋‹ค. ์‚ฌ์šฉ์ž ์ •์˜ ONNX ๊ตฌ์„ฑ์„ ๊ตฌํ˜„ํ•  ๋•Œ๋„ inputs ๋ฐ outputs ์†์„ฑ์œผ๋กœ OrderedDict๋ฅผ ์‚ฌ์šฉํ•˜๋Š” ๊ฒƒ์„ ๊ถŒ์žฅํ•ฉ๋‹ˆ๋‹ค.

Once you have implemented an ONNX configuration, you can instantiate it by providing the base model's configuration as follows:

```python
>>> from transformers import AutoConfig

>>> config = AutoConfig.from_pretrained("distilbert-base-uncased")
>>> onnx_config = DistilBertOnnxConfig(config)
```

The resulting object has several useful properties. For example, you can view the ONNX operator set that will be used during the export:

```python
>>> print(onnx_config.default_onnx_opset)
11
```

You can also view the outputs associated with the model as follows:

```python
>>> print(onnx_config.outputs)
OrderedDict([("last_hidden_state", {0: "batch", 1: "sequence"})])
```

Notice that the outputs property follows the same structure as the inputs; it returns an `OrderedDict` of named outputs and their shapes. The output structure is linked to the choice of feature that the configuration is initialized with. By default, the ONNX configuration is initialized with the `default` feature that corresponds to exporting a model loaded with the `AutoModel` class. If you want to export a model for another task, just provide a different feature to the `task` argument when you initialize the ONNX configuration. For example, if we wished to export DistilBERT with a sequence classification head, we could use:

```python
>>> from transformers import AutoConfig

>>> config = AutoConfig.from_pretrained("distilbert-base-uncased")
>>> onnx_config_for_seq_clf = DistilBertOnnxConfig(config, task="sequence-classification")
>>> print(onnx_config_for_seq_clf.outputs)
OrderedDict([('logits', {0: 'batch'})])
```

All of the base properties and methods associated with [`~onnx.config.OnnxConfig`] and the other configuration classes can be overridden if needed. Check out [`BartOnnxConfig`] for an advanced example.

๋ชจ๋ธ ๋‚ด๋ณด๋‚ด๊ธฐ[[exporting-the-model]]

ONNX ๊ตฌ์„ฑ์„ ๊ตฌํ˜„ํ–ˆ๋‹ค๋ฉด, ๋‹ค์Œ ๋‹จ๊ณ„๋Š” ๋ชจ๋ธ์„ ๋‚ด๋ณด๋‚ด๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค. ์ด์ œ transformers.onnx ํŒจํ‚ค์ง€์—์„œ ์ œ๊ณตํ•˜๋Š” export() ํ•จ์ˆ˜๋ฅผ ์‚ดํŽด๋ณด๊ฒ ์Šต๋‹ˆ๋‹ค. ์ด ํ•จ์ˆ˜๋Š” ONNX ๊ตฌ์„ฑ, ๊ธฐ๋ณธ ๋ชจ๋ธ, ํ† ํฌ๋‚˜์ด์ €, ๊ทธ๋ฆฌ๊ณ  ๋‚ด๋ณด๋‚ผ ํŒŒ์ผ์˜ ๊ฒฝ๋กœ๋ฅผ ์ž…๋ ฅ์œผ๋กœ ๋ฐ›์Šต๋‹ˆ๋‹ค:

```python
>>> from pathlib import Path
>>> from transformers.onnx import export
>>> from transformers import AutoTokenizer, AutoModel

>>> onnx_path = Path("model.onnx")
>>> model_ckpt = "distilbert-base-uncased"
>>> base_model = AutoModel.from_pretrained(model_ckpt)
>>> tokenizer = AutoTokenizer.from_pretrained(model_ckpt)

>>> onnx_inputs, onnx_outputs = export(tokenizer, base_model, onnx_config, onnx_config.default_onnx_opset, onnx_path)
```

The `onnx_inputs` and `onnx_outputs` returned by the `export()` function are lists of the keys defined in the `inputs` and `outputs` properties of the configuration. Once the model is exported, you can test that it is well formed as follows:

```python
>>> import onnx

>>> onnx_model = onnx.load("model.onnx")
>>> onnx.checker.check_model(onnx_model)
```

๋ชจ๋ธ ํฌ๊ธฐ๊ฐ€ 2GB๋ณด๋‹ค ํฐ ๊ฒฝ์šฐ ๋‚ด๋ณด๋‚ด๋Š” ์ค‘์— ์—ฌ๋Ÿฌ ์ถ”๊ฐ€ ํŒŒ์ผ๋“ค์ด ์ƒ์„ฑ๋˜๋Š” ๊ฒƒ์„ ๋ณผ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์‚ฌ์‹ค ONNX๋Š” ๋ชจ๋ธ์„ ์ €์žฅํ•˜๊ธฐ ์œ„ํ•ด Protocol Buffers๋ฅผ ์‚ฌ์šฉํ•˜๋Š”๋ฐ, ๋ฒ„ํผ๋Š” 2GB์˜ ํฌ๊ธฐ ์ œํ•œ์ด ์žˆ๊ธฐ ๋•Œ๋ฌธ์— ์ž์—ฐ์Šค๋Ÿฌ์šด ์ผ์ž…๋‹ˆ๋‹ค. ์™ธ๋ถ€ ๋ฐ์ดํ„ฐ๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ๋ชจ๋ธ์„ ๊ฐ€์ ธ์˜ค๋Š” ๋ฐฉ๋ฒ•์€ ONNX ๋ฌธ์„œ๋ฅผ ์ฐธ์กฐํ•˜์„ธ์š”.

๋ชจ๋ธ์˜ ์ถœ๋ ฅ ๊ฒ€์ฆํ•˜๊ธฐ[[validating-the-model-outputs]]

๋งˆ์ง€๋ง‰ ๋‹จ๊ณ„๋Š” ๊ธฐ์กด ๋ชจ๋ธ๊ณผ ๋‚ด๋ณด๋‚ธ ๋ชจ๋ธ์˜ ์ถœ๋ ฅ์ด ์ผ์ •ํ•œ ์˜ค์ฐจ ๋ฒ”์œ„ ๋‚ด์—์„œ ๋™์ผํ•˜๋‹ค๋Š” ๊ฒƒ์„ ๊ฒ€์ฆํ•˜๋Š” ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ทธ๋Ÿฌ๋ ค๋ฉด transformers.onnx ํŒจํ‚ค์ง€์—์„œ ์ œ๊ณตํ•˜๋Š” validate_model_outputs() ํ•จ์ˆ˜๋ฅผ ์‚ฌ์šฉํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค:

```python
>>> from transformers.onnx import validate_model_outputs

>>> validate_model_outputs(
...     onnx_config, tokenizer, base_model, onnx_path, onnx_outputs, onnx_config.atol_for_validation
... )
```

์ด ํ•จ์ˆ˜๋Š” [~transformers.onnx.OnnxConfig.generate_dummy_inputs] ๋ฉ”์†Œ๋“œ๋กœ ๊ธฐ์กด ๋ฐ ๋‚ด๋ณด๋‚ธ ๋ชจ๋ธ์˜ ์ž…๋ ฅ์„ ์ƒ์„ฑํ•˜๋ฉฐ, ๊ฒ€์ฆ์— ์‚ฌ์šฉ๋  ์˜ค์ฐจ ๋ฒ”์œ„๋Š” ๊ตฌ์„ฑ์—์„œ ์ •์˜ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์ผ๋ฐ˜์ ์œผ๋กœ๋Š” 1e-6์—์„œ 1e-4 ๋ฒ”์œ„ ๋‚ด์—์„œ ํ•ฉ์˜ํ•˜์ง€๋งŒ, 1e-3๋ณด๋‹ค ์ž‘๋‹ค๋ฉด ๋ฌธ์ œ ์—†์„ ๊ฐ€๋Šฅ์„ฑ์ด ๋†’์Šต๋‹ˆ๋‹ค.

## Contributing a new configuration to 🤗 Transformers[[contributing-a-new-configuration-to-transformers]]

We are looking to expand the number of ready-made configurations, and contributions from the community are welcome! If you would like to contribute your addition to the library, you will need to:

- Implement the ONNX configuration in the corresponding `configuration_<model_name>.py` file
- Include the model architecture and corresponding features in [`~onnx.features.FeatureManager`]
- Add your model architecture to the tests in `test_onnx_v2.py`

Check out how the configuration for IBERT was contributed to get an idea of what's involved.