---
license: llama2
language:
- ko
library_name: transformers
base_model: beomi/llama-2-ko-7b
pipeline_tag: text-generation
---

# **msy127/ft_240201_01**

## Our Team

| Research & Engineering | Product Management |
| :--------------------: | :----------------: |
|       David Sohn       |     David Sohn     |

## **Model Details**

### **Base Model**
[beomi/llama-2-ko-7b](https://huggingface.co/beomi/llama-2-ko-7b)

### **Trained On**
- **OS**: Ubuntu 22.04
- **GPU**: A100 40GB × 1
- **transformers**: v4.37

### **Instruction format**

It follows a **custom** format, e.g.:

```python
text = """\
<|user|>
건강한 식습관을 만들기 위해서는 어떻게 하는것이 좋을까요?
<|assistant|>
"""
```

## **Implementation Code**

This model ships with the instruction format above as its chat_template. You can use the code below; a minimal generation sketch follows at the end of this card.

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="msy127/ft_240201_01")

# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("msy127/ft_240201_01")
model = AutoModelForCausalLM.from_pretrained("msy127/ft_240201_01")
```

## **Introduction to our service platform**

- An AI companion service platform that talks with you while looking at your face.
- You can preview the future of the world's best character AI service, character.ai.
- https://livetalkingai.com
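
## **Generation Example**

The snippet below is a minimal, illustrative sketch of how the custom `<|user|>` / `<|assistant|>` format shown above can be combined with the model loaded in the Implementation Code section. The sampling parameters (`max_new_tokens`, `temperature`, `top_p`) are assumptions for demonstration, not values recommended by the model authors.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "msy127/ft_240201_01"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Build the prompt in the custom <|user|> / <|assistant|> format.
prompt = """\
<|user|>
건강한 식습관을 만들기 위해서는 어떻게 하는것이 좋을까요?
<|assistant|>
"""

inputs = tokenizer(prompt, return_tensors="pt")

# Sampling settings below are assumed for this example only.
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```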