---
license: apache-2.0
library_name: "transformers.js"
base_model: Felladrin/Pythia-31M-Chat-v1
language:
- en
datasets:
- totally-not-an-llm/EverythingLM-data-V3
- databricks/databricks-dolly-15k
- THUDM/webglm-qa
- starfishmedical/webGPT_x_dolly
- Amod/mental_health_counseling_conversations
- sablo/oasst2_curated
- cognitivecomputations/wizard_vicuna_70k_unfiltered
- mlabonne/chatml_dpo_pairs
---

INT8 ONNX version of [Felladrin/Pythia-31M-Chat-v1](https://huggingface.co/Felladrin/Pythia-31M-Chat-v1) for use with [Transformers.js](https://huggingface.co/docs/transformers.js).
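
A minimal usage sketch, assuming Transformers.js v2 (the `@xenova/transformers` package); the repository id below is a placeholder and should be replaced with this repo's actual id. The base model is chat-tuned, so the prompt is built from its chat template:

```js
// Minimal sketch, assuming Transformers.js v2 (`@xenova/transformers`).
// "your-username/Pythia-31M-Chat-v1-ONNX" is a placeholder: substitute this repository's id.
import { pipeline } from '@xenova/transformers';

// Load the text-generation pipeline; `quantized: true` selects the quantized (INT8) ONNX weights.
const generator = await pipeline('text-generation', 'your-username/Pythia-31M-Chat-v1-ONNX', {
  quantized: true,
});

// Build a ChatML-style prompt from the model's chat template.
const messages = [
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'What is the capital of France?' },
];
const prompt = generator.tokenizer.apply_chat_template(messages, {
  tokenize: false,
  add_generation_prompt: true,
});

// Generate a short completion (the returned text includes the prompt).
const output = await generator(prompt, { max_new_tokens: 64, do_sample: false });
console.log(output[0].generated_text);
```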