---
library_name: transformers
license: mit
datasets:
- duohub-ai/facts-extraction
language:
- en
base_model:
- meta-llama/Llama-3.1-8B-Instruct
---

## Model Details

### Model Description

This model is a fact-extraction adapter for `meta-llama/Llama-3.1-8B-Instruct`, trained on the `duohub-ai/facts-extraction` dataset.

- **Developed by:** Oseh Mathias & duohub
- **Funded by:** duohub
- **Shared by:** duohub
- **Finetuned from model:** `meta-llama/Llama-3.1-8B-Instruct`

## Uses

This adapter lets you use `meta-llama/Llama-3.1-8B-Instruct` to extract facts from free-form content.

Pass in an entire page of text (up to the 8,192-token context window) and the model returns the extracted facts, one sentence per line.
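A minimal usage sketch follows. It assumes the weights are published as a PEFT/LoRA adapter (the card does not spell out the checkpoint format); the `ADAPTER_ID` value, the example page text, and the `parse_facts` helper are all illustrative placeholders, not part of this repository.

```python
# Sketch of one way to run the adapter, assuming a PEFT/LoRA checkpoint.
# Replace ADAPTER_ID with the actual repo id for this adapter.
ADAPTER_ID = "your-org/your-adapter"  # placeholder, not the real repo id
BASE_ID = "meta-llama/Llama-3.1-8B-Instruct"


def parse_facts(generated: str) -> list[str]:
    """Split raw model output into one fact per non-empty line."""
    return [line.strip() for line in generated.splitlines() if line.strip()]


def extract_facts(page_text: str) -> list[str]:
    import torch
    from peft import PeftModel  # pip install peft
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
    model = AutoModelForCausalLM.from_pretrained(
        BASE_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    model = PeftModel.from_pretrained(model, ADAPTER_ID)  # attach the adapter

    # Llama 3.1 Instruct expects the chat template; the page goes in as-is.
    messages = [{"role": "user", "content": page_text}]
    input_ids = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(input_ids, max_new_tokens=512)
    generated = tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )
    return parse_facts(generated)


if __name__ == "__main__":
    page = "Ada Lovelace was born in 1815. She worked with Charles Babbage."
    for fact in extract_facts(page):
        print(fact)
```

Because the model emits one fact per line, splitting on newlines (as `parse_facts` does) is enough to turn the raw generation into a list of fact strings.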