---
library_name: transformers
license: apache-2.0
base_model: c14kevincardenas/beit-large-patch16-384-limb
tags:
  - generated_from_trainer
model-index:
  - name: limbxy_pose_4heads_1layers_16embeddim
    results: []
---

# limbxy_pose_4heads_1layers_16embeddim

This model is a fine-tuned version of [c14kevincardenas/beit-large-patch16-384-limb](https://huggingface.co/c14kevincardenas/beit-large-patch16-384-limb) on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

- Loss: 0.1451
- RMSE: 0.3810
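
A minimal loading sketch, assuming the checkpoint is published as `c14kevincardenas/limbxy_pose_4heads_1layers_16embeddim` (repo id inferred from the model name above) and loads through the standard `transformers` Auto classes; the pose-regression head is custom, so the exact model class, loading path, and output fields may differ:

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModel

# Repo id inferred from the model name above (assumption).
repo_id = "c14kevincardenas/limbxy_pose_4heads_1layers_16embeddim"

# The image processor is inherited from the BEiT base model (384x384 inputs).
processor = AutoImageProcessor.from_pretrained(repo_id)
# trust_remote_code is a guess: the pose-regression head is not a stock class.
model = AutoModel.from_pretrained(repo_id, trust_remote_code=True)
model.eval()

image = Image.open("limb_crop.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Inspect the output object to locate the regressed limb (x, y) coordinates.
print(outputs)
```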

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 2014
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 250
- num_epochs: 20.0
- mixed_precision_training: Native AMP
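
An illustrative reconstruction of this configuration with the `transformers` `TrainingArguments` API; anything not listed above (e.g. `output_dir`, `eval_strategy`) is an assumption, and the actual training script may differ:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="limbxy_pose_4heads_1layers_16embeddim",  # assumed output path
    learning_rate=5e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=2014,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_steps=250,
    num_train_epochs=20.0,
    fp16=True,              # "Native AMP" mixed-precision training
    eval_strategy="epoch",  # assumed: validation metrics are reported per epoch
)
```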

### Training results

| Training Loss | Epoch | Step | Validation Loss | RMSE   |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.2128        | 1.0   | 89   | 0.1745          | 0.4177 |
| 0.1574        | 2.0   | 178  | 0.1486          | 0.3855 |
| 0.2045        | 3.0   | 267  | 0.1519          | 0.3897 |
| 0.1697        | 4.0   | 356  | 0.1632          | 0.4040 |
| 0.1818        | 5.0   | 445  | 0.1949          | 0.4414 |
| 0.1624        | 6.0   | 534  | 0.1475          | 0.3841 |
| 0.1645        | 7.0   | 623  | 0.1484          | 0.3852 |
| 0.1655        | 8.0   | 712  | 0.1471          | 0.3835 |
| 0.1594        | 9.0   | 801  | 0.1535          | 0.3918 |
| 0.1513        | 10.0  | 890  | 0.1449          | 0.3806 |
| 0.1488        | 11.0  | 979  | 0.1455          | 0.3814 |
| 0.1507        | 12.0  | 1068 | 0.1536          | 0.3919 |
| 0.1522        | 13.0  | 1157 | 0.1449          | 0.3807 |
| 0.1458        | 14.0  | 1246 | 0.1453          | 0.3811 |
| 0.1506        | 15.0  | 1335 | 0.1455          | 0.3815 |
| 0.1505        | 16.0  | 1424 | 0.1452          | 0.3810 |
| 0.1463        | 17.0  | 1513 | 0.1449          | 0.3807 |
| 0.1463        | 18.0  | 1602 | 0.1449          | 0.3806 |
| 0.1494        | 19.0  | 1691 | 0.1457          | 0.3816 |
| 0.1454        | 20.0  | 1780 | 0.1451          | 0.3810 |
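
The reported RMSE tracks the square root of the validation loss (e.g. sqrt(0.1451) ≈ 0.3809), which suggests an MSE regression objective; this is an inference from the numbers above, not something stated in the card:

```python
import math

# Final-epoch values from the table above.
val_loss = 0.1451  # assumed to be mean squared error
rmse = 0.3810

print(round(math.sqrt(val_loss), 4))                          # 0.3809
print(math.isclose(math.sqrt(val_loss), rmse, abs_tol=1e-3))  # True
```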

### Framework versions

- Transformers 4.45.2
- Pytorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1