# DeiT_S16_RF_Spectrogram
This model is a fine-tuned version of facebook/deit-small-patch16-224 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0669
- Accuracy: 0.9882
- Precision: 0.9926
- Recall: 0.9817
- F1: 0.9871
- Tp: 1608
- Tn: 1898
- Fp: 12
- Fn: 30
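The headline metrics follow directly from the confusion-matrix counts above. A quick sanity check in plain Python (nothing model-specific, just the standard metric definitions):

```python
# Confusion-matrix counts from the evaluation set above
tp, tn, fp, fn = 1608, 1898, 12, 30

accuracy = (tp + tn) / (tp + tn + fp + fn)          # fraction of all samples classified correctly
precision = tp / (tp + fp)                          # of predicted positives, how many were correct
recall = tp / (tp + fn)                             # of actual positives, how many were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of precision and recall

print(round(accuracy, 4), round(precision, 4), round(recall, 4), round(f1, 4))
# → 0.9882 0.9926 0.9817 0.9871
```

The printed values match the reported Loss-adjacent metrics to four decimal places, confirming the counts and rates are mutually consistent.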
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: adamw_torch_fused (fused AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1107
- num_epochs: 10
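With a linear scheduler and 1107 warmup steps, the learning rate ramps from 0 up to 5e-05 and then decays linearly back to 0 over the remaining steps. A minimal sketch of that shape (the total of 4440 optimizer steps is an assumption inferred from the ~444 steps per epoch in the results table times 10 epochs; it is not stated in the card):

```python
def linear_warmup_then_decay(step, base_lr=5e-05, warmup_steps=1107, total_steps=4440):
    """Approximate lr at a given optimizer step for a linear schedule with warmup.

    total_steps=4440 is an assumption (~444 steps/epoch x 10 epochs), not from the card.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps  # linear ramp up to the peak lr
    # linear decay from base_lr at the end of warmup down to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_then_decay(0))     # 0.0 (start of warmup)
print(linear_warmup_then_decay(1107))  # 5e-05 (peak, end of warmup)
print(linear_warmup_then_decay(4440))  # 0.0 (end of training)
```

This mirrors the behavior of `get_linear_schedule_with_warmup` in `transformers`, which the Trainer uses for `lr_scheduler_type: linear`.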
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | Tp | Tn | Fp | Fn |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.4745 | 0.2477 | 110 | 0.1945 | 0.9340 | 0.9875 | 0.8681 | 0.9240 | 1422 | 1892 | 18 | 216 |
| 0.1543 | 0.4955 | 220 | 0.1073 | 0.9684 | 1.0 | 0.9316 | 0.9646 | 1526 | 1910 | 0 | 112 |
| 0.0972 | 0.7432 | 330 | 0.0959 | 0.9791 | 0.9943 | 0.9603 | 0.9770 | 1573 | 1901 | 9 | 65 |
| 0.0967 | 0.9910 | 440 | 0.0601 | 0.9859 | 0.9969 | 0.9725 | 0.9845 | 1593 | 1905 | 5 | 45 |
| 0.0763 | 1.2387 | 550 | 0.0586 | 0.9856 | 0.9969 | 0.9719 | 0.9842 | 1592 | 1905 | 5 | 46 |
| 0.0634 | 1.4865 | 660 | 0.0935 | 0.9755 | 0.9630 | 0.9847 | 0.9737 | 1613 | 1848 | 62 | 25 |
| 0.0854 | 1.7342 | 770 | 0.0607 | 0.9848 | 0.9919 | 0.9750 | 0.9834 | 1597 | 1897 | 13 | 41 |
| 0.0834 | 1.9820 | 880 | 0.0684 | 0.9828 | 0.9852 | 0.9774 | 0.9813 | 1601 | 1886 | 24 | 37 |
| 0.0545 | 2.2297 | 990 | 0.0705 | 0.9831 | 0.9981 | 0.9652 | 0.9814 | 1581 | 1907 | 3 | 57 |
| 0.0728 | 2.4775 | 1100 | 0.0594 | 0.9845 | 0.9962 | 0.9701 | 0.9830 | 1589 | 1904 | 6 | 49 |
| 0.0781 | 2.7252 | 1210 | 0.0571 | 0.9868 | 0.9994 | 0.9719 | 0.9855 | 1592 | 1909 | 1 | 46 |
| 0.0686 | 2.9730 | 1320 | 0.0583 | 0.9868 | 0.9901 | 0.9811 | 0.9856 | 1607 | 1894 | 16 | 31 |
| 0.0683 | 3.2207 | 1430 | 0.0486 | 0.9890 | 0.9994 | 0.9768 | 0.9880 | 1600 | 1909 | 1 | 38 |
| 0.0505 | 3.4685 | 1540 | 0.0539 | 0.9865 | 0.9932 | 0.9774 | 0.9852 | 1601 | 1899 | 11 | 37 |
| 0.0610 | 3.7162 | 1650 | 0.0559 | 0.9882 | 0.9981 | 0.9762 | 0.9870 | 1599 | 1907 | 3 | 39 |
| 0.0629 | 3.9640 | 1760 | 0.0432 | 0.9901 | 0.9975 | 0.9811 | 0.9892 | 1607 | 1906 | 4 | 31 |
| 0.0525 | 4.2117 | 1870 | 0.0465 | 0.9896 | 0.9994 | 0.9780 | 0.9886 | 1602 | 1909 | 1 | 36 |
| 0.0464 | 4.4595 | 1980 | 0.0502 | 0.9887 | 0.9950 | 0.9805 | 0.9877 | 1606 | 1902 | 8 | 32 |
| 0.0543 | 4.7072 | 2090 | 0.0454 | 0.9896 | 0.9969 | 0.9805 | 0.9886 | 1606 | 1905 | 5 | 32 |
| 0.0495 | 4.9550 | 2200 | 0.0453 | 0.9904 | 0.9969 | 0.9823 | 0.9895 | 1609 | 1905 | 5 | 29 |
| 0.0400 | 5.2027 | 2310 | 0.0512 | 0.9890 | 0.9957 | 0.9805 | 0.9880 | 1606 | 1903 | 7 | 32 |
| 0.0339 | 5.4505 | 2420 | 0.0514 | 0.9893 | 0.9963 | 0.9805 | 0.9883 | 1606 | 1904 | 6 | 32 |
| 0.0356 | 5.6982 | 2530 | 0.0449 | 0.9899 | 0.9963 | 0.9817 | 0.9889 | 1608 | 1904 | 6 | 30 |
| 0.0363 | 5.9459 | 2640 | 0.0521 | 0.9859 | 0.9824 | 0.9872 | 0.9848 | 1617 | 1881 | 29 | 21 |
| 0.0273 | 6.1937 | 2750 | 0.0499 | 0.9893 | 0.9914 | 0.9853 | 0.9884 | 1614 | 1896 | 14 | 24 |
| 0.0236 | 6.4414 | 2860 | 0.0542 | 0.9884 | 0.9920 | 0.9829 | 0.9874 | 1610 | 1897 | 13 | 28 |
| 0.0313 | 6.6892 | 2970 | 0.0440 | 0.9910 | 0.9981 | 0.9823 | 0.9902 | 1609 | 1907 | 3 | 29 |
| 0.0298 | 6.9369 | 3080 | 0.0511 | 0.9913 | 0.9981 | 0.9829 | 0.9905 | 1610 | 1907 | 3 | 28 |
| 0.0193 | 7.1847 | 3190 | 0.0531 | 0.9887 | 0.9920 | 0.9835 | 0.9877 | 1611 | 1897 | 13 | 27 |
| 0.0245 | 7.4324 | 3300 | 0.0529 | 0.9893 | 0.9938 | 0.9829 | 0.9883 | 1610 | 1900 | 10 | 28 |
| 0.0234 | 7.6802 | 3410 | 0.0640 | 0.9882 | 0.9920 | 0.9823 | 0.9871 | 1609 | 1897 | 13 | 29 |
| 0.0172 | 7.9279 | 3520 | 0.0636 | 0.9896 | 0.9963 | 0.9811 | 0.9886 | 1607 | 1904 | 6 | 31 |
| 0.0231 | 8.1757 | 3630 | 0.0559 | 0.9890 | 0.9932 | 0.9829 | 0.9880 | 1610 | 1899 | 11 | 28 |
| 0.0148 | 8.4234 | 3740 | 0.0566 | 0.9893 | 0.9932 | 0.9835 | 0.9883 | 1611 | 1899 | 11 | 27 |
| 0.0145 | 8.6712 | 3850 | 0.0601 | 0.9890 | 0.9914 | 0.9847 | 0.9881 | 1613 | 1896 | 14 | 25 |
| 0.0193 | 8.9189 | 3960 | 0.0634 | 0.9899 | 0.9951 | 0.9829 | 0.9889 | 1610 | 1902 | 8 | 28 |
| 0.0139 | 9.1667 | 4070 | 0.0647 | 0.9896 | 0.9957 | 0.9817 | 0.9886 | 1608 | 1903 | 7 | 30 |
| 0.0089 | 9.4144 | 4180 | 0.0678 | 0.9893 | 0.9950 | 0.9817 | 0.9883 | 1608 | 1902 | 8 | 30 |
| 0.0087 | 9.6622 | 4290 | 0.0664 | 0.9893 | 0.9950 | 0.9817 | 0.9883 | 1608 | 1902 | 8 | 30 |
| 0.0104 | 9.9099 | 4400 | 0.0669 | 0.9882 | 0.9926 | 0.9817 | 0.9871 | 1608 | 1898 | 12 | 30 |
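Note that the final checkpoint (validation loss 0.0669) is not the best one by validation loss: step 1760 reaches 0.0432. A trivial sketch of selecting the best row (only a subset of the table's rows is shown here):

```python
# (step, validation_loss) pairs for a few rows of the table above
rows = [
    (440, 0.0601), (1760, 0.0432), (1870, 0.0465),
    (2970, 0.0440), (3080, 0.0511), (4400, 0.0669),
]
best_step, best_loss = min(rows, key=lambda r: r[1])
print(best_step, best_loss)  # → 1760 0.0432
```

If lower validation loss is what matters for your use case, loading the step-1760 checkpoint (if it was saved) may be preferable to the final one.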
### Framework versions
- Transformers 5.0.0
- Pytorch 2.10.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.2