Quantization made by Richard Erkhov.
[Github](https://github.com/RichardErkhov)
[Discord](https://discord.gg/pvy7H8DZMG)
[Request more models](https://github.com/RichardErkhov/quant_request)
# InstructProtein - AWQ
- Model creator: https://huggingface.co/hicai-zju/
- Original model: https://huggingface.co/hicai-zju/InstructProtein/
Original model description:
---
license: mit
---
# InstructProtein
InstructProtein is the first large generative language model to explore the feasibility of bidirectional generation between human language and protein language.
It is based on the OPT-1.3B architecture and trained in two steps: pre-training on protein and natural language corpora, followed by fine-tuning on a curated protein knowledge instruction dataset.
After instruction tuning, InstructProtein outperforms larger general-purpose foundation models on protein understanding and design tasks.
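As a rough sketch of how this quantized checkpoint might be queried with Hugging Face `transformers`: the prompt template and the helper names below are assumptions for illustration (the card does not specify InstructProtein's exact instruction format), so check the original repository before relying on them.

```python
# Hedged sketch: prompting an AWQ-quantized InstructProtein checkpoint.
# The instruction template is hypothetical; the model's training format may differ.

def build_instruction_prompt(instruction: str) -> str:
    """Wrap a task description in a plain instruction/output template
    (hypothetical template, not confirmed by the model card)."""
    return f"Instruction: {instruction}\nOutput:"

def generate(model_id: str, instruction: str, max_new_tokens: int = 64) -> str:
    """Load the quantized model and generate a completion for one instruction."""
    # Imports are local so the prompt helper above stays dependency-free.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # transformers dispatches AWQ checkpoints automatically when autoawq is installed.
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(
        build_instruction_prompt(instruction), return_tensors="pt"
    ).to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Pass the Hub id of this quantized repository as `model_id`; AWQ inference requires a CUDA-capable GPU and the `autoawq` package.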
## Limitations
The current model, instruction-tuned on the knowledge instruction dataset, is a preliminary example.
Despite its initial success in controlled settings, it lacks the robustness needed for complex, real-world, production-level tasks.
## Reference
For more information, please take a look at our [paper](https://arxiv.org/abs/2310.03269) and [repository](https://github.com/HICAI-ZJU/InstructProtein).