Update README.md
README.md CHANGED
@@ -21,13 +21,24 @@ This model is a fine-tuned version of [microsoft/wavlm-base](https://huggingface.co/microsoft/wavlm-base)
 It achieves the following results on the evaluation set:
 - Loss: 0.0593
 - Accuracy: 0.9896
+- FAR: 0.0080
+- FRR: 0.0144
+- EER: 0.0112

 ## Model description

+### Quick Use
+
+```python
+device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
+
+config = AutoConfig.from_pretrained("abhishtagatya/hubert-base-960h-itw-deepfake")
+feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained("abhishtagatya/hubert-base-960h-itw-deepfake")
+
+model = HubertForSequenceClassification.from_pretrained("abhishtagatya/hubert-base-960h-itw-deepfake", config=config).to(device)
+
+# Your Logic Here
+```

 ## Intended uses & limitations

@@ -54,7 +65,7 @@ The following hyperparameters were used during training:

 ### Training results

-| Training Loss | Epoch | Step | Validation Loss | Accuracy |
+| Training Loss | Epoch | Step | Validation Loss | Accuracy | FAR | FRR | EER |
 |:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:------:|:------:|
 | 0.3205 | 0.39 | 2500 | 0.1223 | 0.9699 | 0.0343 | 0.0229 | 0.0286 |
 | 0.0752 | 0.79 | 5000 | 0.0822 | 0.9843 | 0.0145 | 0.0178 | 0.0161 |