- Finance-Specific Training: Trained on the gbharti/finance-alpaca dataset, ensuring the model is fine-tuned for financial data analysis.
- Transformers Library Integration: Built on the popular `transformers` library, ensuring easy integration with existing ML pipelines and applications.

## How to use
```py
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("umuthopeyildirim/fin-rwkv-1b5")
model = AutoModelForCausalLM.from_pretrained("umuthopeyildirim/fin-rwkv-1b5")

prompt = "user: Is this headline positive or negative? Headline: Australian Tycoon Forrest Shuts Nickel Mines After Prices Crash\nbot:"

# Tokenize the input
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Generate a response
output = model.generate(input_ids, max_length=333, num_return_sequences=1)

# Decode the output
generated_text = tokenizer.decode(output[0], skip_special_tokens=True)

print(generated_text)
```
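
The snippet above returns the full completion only once generation finishes. For interactive use, the `transformers` library also provides `TextIteratorStreamer`, which yields text as tokens are produced. A minimal sketch of that pattern, using the same model ID and prompt (the threading setup here is illustrative, not part of the model's API):

```py
from threading import Thread

from transformers import AutoTokenizer, AutoModelForCausalLM, TextIteratorStreamer

tokenizer = AutoTokenizer.from_pretrained("umuthopeyildirim/fin-rwkv-1b5")
model = AutoModelForCausalLM.from_pretrained("umuthopeyildirim/fin-rwkv-1b5")

prompt = "user: Is this headline positive or negative? Headline: Australian Tycoon Forrest Shuts Nickel Mines After Prices Crash\nbot:"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# skip_prompt=True yields only newly generated text, not the echoed prompt
streamer = TextIteratorStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

# generate() blocks until done, so run it in a background thread
# and consume the streamer on the main thread as tokens arrive
thread = Thread(target=model.generate, kwargs={
    "input_ids": input_ids,
    "max_length": 333,
    "streamer": streamer,
})
thread.start()

chunks = []
for new_text in streamer:
    print(new_text, end="", flush=True)
    chunks.append(new_text)
thread.join()

generated = "".join(chunks)
```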

## Competing Against

| Name | Param Count | Cost | Inference Cost |
|---------------|-------------|------|----------------|