KoEn-Translation-Transformer-v0.1

The full training and dataset-preprocessing code is available on my GitHub: https://github.com/Hyeongmin-Cho/Transformer-from-Scratch-in-Pytorch

Due to limited computing resources, the model did not fully converge during training, but its performance is satisfactory. If further improvement is required, continued training could be considered.

Translation Example

- Input (Korean): 유감스럽게도 2018년 1월 9일에 제가 받은 척추 수술과 관련된 병원비를 제가 납부할 능력이 없음을 알려 드리고자 이 편지를 씁니다.
- Reference translation: I am writing to inform you that unfortunately I am unable to pay the medical bill associated with the back surgery I received on January 9, 2018.
- Model translation: I'm writing this letter to inform you that I have no ability to pay for the hospital expenses related to my spinal surgery on January 9, 2018.
