Training in progress, epoch 25
README.md CHANGED
@@ -16,29 +16,29 @@ should probably proofread and complete it, then remove this comment. -->

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
-- Loss: 1.
-- Map: 0.
-- Map 50: 0.
-- Map 75: 0.
-- Map Small: 0.
-- Map Medium: 0.
-- Map Large: 0.
-- Mar 1: 0.
-- Mar 10: 0.
-- Mar 100: 0.
-- Mar Small: 0.
-- Mar Medium: 0.
-- Mar Large: 0.
-- Map Coverall: 0.
-- Mar 100 Coverall: 0.
-- Map Face Shield: 0.
-- Mar 100 Face Shield: 0.
-- Map Gloves: 0.
-- Mar 100 Gloves: 0.
-- Map Goggles: 0.
-- Mar 100 Goggles: 0.
-- Map Mask: 0.
-- Mar 100 Mask: 0.

## Model description
@@ -58,57 +58,47 @@ More information needed

The following hyperparameters were used during training:
- learning_rate: 0.0001
-- train_batch_size:
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
-- num_epochs:

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
-| No log | 1.0 |
-| No log | 2.0 |
-| No log | 3.0 |
-| No log | 4.0 |
-| 1.0318 | 31.0 | 3317 | 1.2156 | 0.2209 | 0.4441 | 0.1993 | 0.0791 | 0.1578 | 0.3472 | 0.2529 | 0.4194 | 0.4474 | 0.1979 | 0.3992 | 0.6193 | 0.5013 | 0.655 | 0.0989 | 0.4696 | 0.1429 | 0.3594 | 0.0961 | 0.3662 | 0.2652 | 0.3871 |
-| 1.0318 | 32.0 | 3424 | 1.2147 | 0.2211 | 0.4316 | 0.1987 | 0.0668 | 0.1504 | 0.3481 | 0.2537 | 0.4163 | 0.4448 | 0.205 | 0.382 | 0.6173 | 0.5032 | 0.6541 | 0.0977 | 0.4557 | 0.1396 | 0.358 | 0.0939 | 0.3646 | 0.2708 | 0.3916 |
-| 0.9599 | 33.0 | 3531 | 1.2336 | 0.2218 | 0.4479 | 0.1911 | 0.0803 | 0.1549 | 0.3424 | 0.2564 | 0.4105 | 0.4392 | 0.2128 | 0.3708 | 0.6056 | 0.4991 | 0.6464 | 0.1046 | 0.4646 | 0.1411 | 0.3585 | 0.1078 | 0.3415 | 0.2564 | 0.3849 |
-| 0.9599 | 34.0 | 3638 | 1.2274 | 0.2236 | 0.4459 | 0.1906 | 0.0666 | 0.1573 | 0.3478 | 0.2603 | 0.4118 | 0.4427 | 0.2067 | 0.3793 | 0.6053 | 0.4981 | 0.6446 | 0.1009 | 0.4722 | 0.1484 | 0.3634 | 0.1046 | 0.3446 | 0.2661 | 0.3889 |
-| 0.9599 | 35.0 | 3745 | 1.2208 | 0.2287 | 0.4525 | 0.2052 | 0.0667 | 0.1585 | 0.3532 | 0.2591 | 0.4148 | 0.4436 | 0.2129 | 0.3856 | 0.6051 | 0.5023 | 0.6527 | 0.1165 | 0.4734 | 0.1467 | 0.3576 | 0.105 | 0.3415 | 0.2728 | 0.3929 |
-| 0.9599 | 36.0 | 3852 | 1.2175 | 0.2275 | 0.451 | 0.199 | 0.0646 | 0.1595 | 0.3539 | 0.2602 | 0.4195 | 0.446 | 0.2084 | 0.3818 | 0.6175 | 0.5043 | 0.6572 | 0.1109 | 0.4759 | 0.1483 | 0.358 | 0.1018 | 0.3477 | 0.2721 | 0.3911 |
-| 0.9599 | 37.0 | 3959 | 1.2185 | 0.2292 | 0.4538 | 0.2058 | 0.0672 | 0.1612 | 0.3539 | 0.2595 | 0.416 | 0.4454 | 0.2179 | 0.3782 | 0.6129 | 0.5057 | 0.6527 | 0.112 | 0.4785 | 0.1493 | 0.3567 | 0.1044 | 0.3477 | 0.2744 | 0.3916 |
-| 0.9178 | 38.0 | 4066 | 1.2171 | 0.2309 | 0.4527 | 0.2059 | 0.0721 | 0.1613 | 0.3576 | 0.26 | 0.4205 | 0.448 | 0.2191 | 0.3836 | 0.6146 | 0.5073 | 0.6541 | 0.1139 | 0.4823 | 0.1503 | 0.358 | 0.1076 | 0.3523 | 0.2755 | 0.3933 |
-| 0.9178 | 39.0 | 4173 | 1.2159 | 0.2301 | 0.4528 | 0.2059 | 0.0702 | 0.1602 | 0.3559 | 0.2594 | 0.4186 | 0.4469 | 0.2224 | 0.382 | 0.6107 | 0.5073 | 0.6541 | 0.1112 | 0.4823 | 0.1503 | 0.3576 | 0.1075 | 0.3492 | 0.274 | 0.3911 |
-| 0.9178 | 40.0 | 4280 | 1.2162 | 0.2299 | 0.4521 | 0.2056 | 0.0702 | 0.1582 | 0.3564 | 0.2597 | 0.4191 | 0.447 | 0.2188 | 0.3822 | 0.6116 | 0.5071 | 0.6545 | 0.1107 | 0.4823 | 0.1505 | 0.358 | 0.1074 | 0.3492 | 0.2739 | 0.3911 |

### Framework versions
This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
+- Loss: 1.1898
+- Map: 0.2856
+- Map 50: 0.5806
+- Map 75: 0.245
+- Map Small: 0.0813
+- Map Medium: 0.224
+- Map Large: 0.4449
+- Mar 1: 0.2913
+- Mar 10: 0.425
+- Mar 100: 0.4382
+- Mar Small: 0.1883
+- Mar Medium: 0.3741
+- Mar Large: 0.6152
+- Map Coverall: 0.5355
+- Mar 100 Coverall: 0.6414
+- Map Face Shield: 0.2763
+- Mar 100 Face Shield: 0.4582
+- Map Gloves: 0.1854
+- Mar 100 Gloves: 0.3411
+- Map Goggles: 0.1443
+- Mar 100 Goggles: 0.3585
+- Map Mask: 0.2865
+- Mar 100 Mask: 0.3916
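As a sanity check, the overall Map and Mar 100 are the unweighted means of the five per-class values listed above (a sketch reproducing that arithmetic; the class order is Coverall, Face Shield, Gloves, Goggles, Mask):

```python
# Overall Map / Mar 100 equal the unweighted mean of the per-class
# values from the evaluation summary above (five PPE classes).
map_per_class = [0.5355, 0.2763, 0.1854, 0.1443, 0.2865]
mar100_per_class = [0.6414, 0.4582, 0.3411, 0.3585, 0.3916]

print(round(sum(map_per_class) / len(map_per_class), 4))      # 0.2856
print(round(sum(mar100_per_class) / len(mar100_per_class), 4))  # 0.4382
```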

## Model description
The following hyperparameters were used during training:
- learning_rate: 0.0001
+- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Use OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
- lr_scheduler_type: cosine
+- num_epochs: 30
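With `lr_scheduler_type: cosine` the learning rate decays from 0.0001 toward zero over the 1620 optimizer steps shown in the results table. A minimal sketch of that decay, assuming no warmup (warmup settings are not listed in the card):

```python
import math

# Cosine decay of the learning rate (sketch; assumes zero warmup and
# total_steps = 1620 as implied by the training-results table).
def cosine_lr(step, total_steps=1620, base_lr=1e-4):
    return 0.5 * base_lr * (1 + math.cos(math.pi * step / total_steps))

print(cosine_lr(0))     # 0.0001 at the first step
print(cosine_lr(810))   # 5e-05 at the midpoint
print(cosine_lr(1620))  # 0.0 at the final step
```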

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
+| No log | 1.0 | 54 | 1.2594 | 0.2476 | 0.5256 | 0.195 | 0.0552 | 0.1779 | 0.3989 | 0.2551 | 0.3893 | 0.4093 | 0.1584 | 0.3447 | 0.5932 | 0.5082 | 0.6396 | 0.2025 | 0.4228 | 0.1483 | 0.2982 | 0.1492 | 0.3462 | 0.2297 | 0.3396 |
+| No log | 2.0 | 108 | 1.2779 | 0.241 | 0.5088 | 0.202 | 0.0586 | 0.1811 | 0.4005 | 0.2566 | 0.3858 | 0.4063 | 0.1663 | 0.3439 | 0.5746 | 0.4765 | 0.618 | 0.2332 | 0.4114 | 0.1375 | 0.3018 | 0.1465 | 0.3508 | 0.2113 | 0.3498 |
+| No log | 3.0 | 162 | 1.2853 | 0.2403 | 0.5054 | 0.1907 | 0.0507 | 0.1788 | 0.4011 | 0.2512 | 0.3883 | 0.4091 | 0.1897 | 0.3331 | 0.5971 | 0.5027 | 0.618 | 0.2249 | 0.4127 | 0.1381 | 0.3103 | 0.1145 | 0.3492 | 0.2214 | 0.3551 |
+| No log | 4.0 | 216 | 1.2714 | 0.248 | 0.5293 | 0.2106 | 0.065 | 0.1783 | 0.4074 | 0.2547 | 0.3797 | 0.4001 | 0.1561 | 0.3308 | 0.5725 | 0.4959 | 0.6207 | 0.2072 | 0.3709 | 0.1526 | 0.3214 | 0.124 | 0.3277 | 0.2606 | 0.3596 |
+| No log | 5.0 | 270 | 1.2657 | 0.2388 | 0.502 | 0.1827 | 0.0794 | 0.1856 | 0.3643 | 0.2559 | 0.3861 | 0.4067 | 0.2057 | 0.3469 | 0.566 | 0.491 | 0.623 | 0.1976 | 0.4177 | 0.153 | 0.304 | 0.1065 | 0.3262 | 0.2456 | 0.3627 |
+| No log | 6.0 | 324 | 1.2752 | 0.2457 | 0.5155 | 0.1921 | 0.0619 | 0.1828 | 0.3811 | 0.252 | 0.3971 | 0.409 | 0.1393 | 0.349 | 0.5878 | 0.4952 | 0.6113 | 0.226 | 0.4367 | 0.1528 | 0.2987 | 0.1045 | 0.3354 | 0.25 | 0.3631 |
+| No log | 7.0 | 378 | 1.2740 | 0.2387 | 0.485 | 0.1904 | 0.0591 | 0.1765 | 0.3802 | 0.2493 | 0.3939 | 0.4117 | 0.1772 | 0.3444 | 0.5928 | 0.4957 | 0.6072 | 0.191 | 0.4291 | 0.1642 | 0.3179 | 0.0895 | 0.3308 | 0.2531 | 0.3733 |
+| No log | 8.0 | 432 | 1.2893 | 0.2451 | 0.5243 | 0.1957 | 0.0594 | 0.1775 | 0.4036 | 0.254 | 0.3924 | 0.4101 | 0.1689 | 0.3345 | 0.6044 | 0.5059 | 0.6297 | 0.216 | 0.4443 | 0.16 | 0.3071 | 0.1088 | 0.3231 | 0.2347 | 0.3462 |
+| No log | 9.0 | 486 | 1.2677 | 0.2524 | 0.5351 | 0.206 | 0.0819 | 0.1917 | 0.3979 | 0.2556 | 0.396 | 0.4154 | 0.1693 | 0.3597 | 0.5841 | 0.4969 | 0.6158 | 0.2219 | 0.4013 | 0.1633 | 0.3388 | 0.1241 | 0.34 | 0.2559 | 0.3813 |
+| 0.833 | 10.0 | 540 | 1.2681 | 0.2576 | 0.5323 | 0.2251 | 0.0695 | 0.1994 | 0.4131 | 0.2646 | 0.3956 | 0.4126 | 0.1896 | 0.3505 | 0.5769 | 0.5036 | 0.6207 | 0.2149 | 0.4025 | 0.1637 | 0.3205 | 0.1487 | 0.36 | 0.2571 | 0.3591 |
+| 0.833 | 11.0 | 594 | 1.2681 | 0.2623 | 0.5318 | 0.2279 | 0.076 | 0.2074 | 0.3998 | 0.2672 | 0.4044 | 0.42 | 0.1887 | 0.3456 | 0.5935 | 0.5011 | 0.6144 | 0.2525 | 0.4468 | 0.1579 | 0.3263 | 0.1362 | 0.3538 | 0.2636 | 0.3587 |
+| 0.833 | 12.0 | 648 | 1.2435 | 0.2612 | 0.5394 | 0.2215 | 0.0746 | 0.2079 | 0.4025 | 0.2698 | 0.4087 | 0.4249 | 0.1908 | 0.3628 | 0.6076 | 0.5073 | 0.6369 | 0.2444 | 0.457 | 0.1779 | 0.3281 | 0.1065 | 0.3292 | 0.2698 | 0.3733 |
+| 0.833 | 13.0 | 702 | 1.2347 | 0.2547 | 0.5166 | 0.2106 | 0.0596 | 0.1889 | 0.4118 | 0.2618 | 0.4061 | 0.4251 | 0.184 | 0.3569 | 0.6118 | 0.5167 | 0.6441 | 0.184 | 0.4316 | 0.1704 | 0.3205 | 0.1439 | 0.3523 | 0.2582 | 0.3769 |
+| 0.833 | 14.0 | 756 | 1.2483 | 0.2502 | 0.5253 | 0.2092 | 0.0552 | 0.1932 | 0.4009 | 0.2668 | 0.3988 | 0.4144 | 0.1771 | 0.35 | 0.5858 | 0.5073 | 0.632 | 0.2228 | 0.4329 | 0.1412 | 0.2893 | 0.1277 | 0.3492 | 0.252 | 0.3684 |
+| 0.833 | 15.0 | 810 | 1.2366 | 0.2681 | 0.5512 | 0.2307 | 0.083 | 0.2029 | 0.4108 | 0.2829 | 0.4078 | 0.4266 | 0.2061 | 0.3562 | 0.6082 | 0.5168 | 0.6329 | 0.2317 | 0.4418 | 0.17 | 0.3214 | 0.1544 | 0.3523 | 0.2677 | 0.3844 |
+| 0.833 | 16.0 | 864 | 1.2437 | 0.2666 | 0.5511 | 0.2216 | 0.0737 | 0.1973 | 0.4246 | 0.2775 | 0.407 | 0.4238 | 0.1515 | 0.3656 | 0.6009 | 0.515 | 0.6329 | 0.257 | 0.4519 | 0.1582 | 0.3187 | 0.1488 | 0.3415 | 0.2539 | 0.3738 |
+| 0.833 | 17.0 | 918 | 1.2170 | 0.2537 | 0.5332 | 0.2114 | 0.0666 | 0.1943 | 0.4116 | 0.2708 | 0.4018 | 0.4221 | 0.1883 | 0.3497 | 0.614 | 0.5208 | 0.6387 | 0.2244 | 0.4316 | 0.1513 | 0.3089 | 0.122 | 0.3477 | 0.25 | 0.3836 |
+| 0.833 | 18.0 | 972 | 1.2061 | 0.2674 | 0.5617 | 0.2288 | 0.0731 | 0.1976 | 0.4235 | 0.2716 | 0.4132 | 0.4301 | 0.1851 | 0.3657 | 0.6035 | 0.5362 | 0.6495 | 0.2421 | 0.4456 | 0.1669 | 0.3366 | 0.1295 | 0.34 | 0.2621 | 0.3787 |
+| 0.7443 | 19.0 | 1026 | 1.1953 | 0.2771 | 0.5592 | 0.2447 | 0.0761 | 0.218 | 0.4372 | 0.2833 | 0.4263 | 0.4401 | 0.1777 | 0.3742 | 0.6354 | 0.5445 | 0.6644 | 0.2516 | 0.4608 | 0.1731 | 0.3326 | 0.1349 | 0.3415 | 0.2812 | 0.4013 |
+| 0.7443 | 20.0 | 1080 | 1.1982 | 0.2769 | 0.5665 | 0.232 | 0.0856 | 0.2144 | 0.4313 | 0.2864 | 0.4251 | 0.4413 | 0.1965 | 0.3739 | 0.6222 | 0.532 | 0.655 | 0.254 | 0.4608 | 0.1735 | 0.342 | 0.1434 | 0.3554 | 0.2815 | 0.3933 |
+| 0.7443 | 21.0 | 1134 | 1.1917 | 0.2799 | 0.5707 | 0.2321 | 0.084 | 0.2222 | 0.4421 | 0.2853 | 0.4271 | 0.4431 | 0.1967 | 0.3825 | 0.6202 | 0.5329 | 0.6527 | 0.2616 | 0.4595 | 0.1746 | 0.3379 | 0.1469 | 0.3692 | 0.2836 | 0.396 |
+| 0.7443 | 22.0 | 1188 | 1.2034 | 0.2786 | 0.5654 | 0.2448 | 0.0875 | 0.218 | 0.4295 | 0.2856 | 0.4168 | 0.4357 | 0.1928 | 0.3706 | 0.6164 | 0.5331 | 0.6477 | 0.2691 | 0.4582 | 0.1778 | 0.3326 | 0.1363 | 0.3538 | 0.277 | 0.3862 |
+| 0.7443 | 23.0 | 1242 | 1.1921 | 0.2811 | 0.5663 | 0.2461 | 0.084 | 0.2254 | 0.4305 | 0.2891 | 0.4236 | 0.4401 | 0.197 | 0.3786 | 0.6131 | 0.5379 | 0.65 | 0.2628 | 0.4671 | 0.1765 | 0.3411 | 0.1347 | 0.3508 | 0.2937 | 0.3916 |
+| 0.7443 | 24.0 | 1296 | 1.1924 | 0.2831 | 0.5765 | 0.2477 | 0.0844 | 0.2258 | 0.4344 | 0.2918 | 0.4295 | 0.442 | 0.1963 | 0.3763 | 0.6205 | 0.5386 | 0.6468 | 0.2751 | 0.4671 | 0.1773 | 0.3451 | 0.1391 | 0.3585 | 0.2854 | 0.3924 |
+| 0.7443 | 25.0 | 1350 | 1.1894 | 0.2847 | 0.5789 | 0.2472 | 0.0792 | 0.2248 | 0.4416 | 0.2919 | 0.4282 | 0.4425 | 0.1865 | 0.3836 | 0.6192 | 0.5361 | 0.65 | 0.2829 | 0.4646 | 0.1797 | 0.3388 | 0.1424 | 0.3615 | 0.2824 | 0.3973 |
+| 0.7443 | 26.0 | 1404 | 1.1916 | 0.2863 | 0.5845 | 0.249 | 0.0814 | 0.2262 | 0.4424 | 0.2915 | 0.4247 | 0.4401 | 0.1878 | 0.3778 | 0.6205 | 0.5428 | 0.6464 | 0.2767 | 0.4519 | 0.1857 | 0.3491 | 0.1434 | 0.3631 | 0.2829 | 0.3902 |
+| 0.7443 | 27.0 | 1458 | 1.1918 | 0.2857 | 0.5838 | 0.2461 | 0.0794 | 0.2267 | 0.446 | 0.294 | 0.4247 | 0.4385 | 0.1827 | 0.3782 | 0.617 | 0.5387 | 0.6423 | 0.279 | 0.4557 | 0.1807 | 0.3451 | 0.1479 | 0.3615 | 0.2822 | 0.388 |
+| 0.6287 | 28.0 | 1512 | 1.1901 | 0.2861 | 0.5816 | 0.2422 | 0.0819 | 0.2247 | 0.444 | 0.2923 | 0.4242 | 0.4374 | 0.1907 | 0.3735 | 0.6147 | 0.535 | 0.6396 | 0.2776 | 0.4557 | 0.1855 | 0.3442 | 0.1457 | 0.3569 | 0.2866 | 0.3907 |
+| 0.6287 | 29.0 | 1566 | 1.1898 | 0.2859 | 0.5804 | 0.2449 | 0.0814 | 0.2249 | 0.4445 | 0.2918 | 0.4252 | 0.4379 | 0.1889 | 0.3738 | 0.6141 | 0.5354 | 0.6414 | 0.2768 | 0.4582 | 0.1868 | 0.3424 | 0.1449 | 0.3569 | 0.2856 | 0.3907 |
+| 0.6287 | 30.0 | 1620 | 1.1898 | 0.2856 | 0.5806 | 0.245 | 0.0813 | 0.224 | 0.4449 | 0.2913 | 0.425 | 0.4382 | 0.1883 | 0.3741 | 0.6152 | 0.5355 | 0.6414 | 0.2763 | 0.4582 | 0.1854 | 0.3411 | 0.1443 | 0.3585 | 0.2865 | 0.3916 |
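The Step column advances by a fixed 54 optimizer steps per epoch (54, 108, …, 1620), which together with train_batch_size: 16 bounds the size of the (unnamed) training set. A back-of-the-envelope check:

```python
# Steps per epoch implied by the table above, and the resulting
# upper bound on training-set size (the last batch may be partial).
total_steps, num_epochs, batch_size = 1620, 30, 16
steps_per_epoch = total_steps // num_epochs
print(steps_per_epoch)               # 54
print(steps_per_epoch * batch_size)  # 864, i.e. at most 864 samples
```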

### Framework versions
model.safetensors CHANGED

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
-oid sha256:
size 174079796

version https://git-lfs.github.com/spec/v1
+oid sha256:bc6571a9eb873d56f59eff2b0b5ec63172aba3a9db0753d881d3d08e5d5a5bb0
size 174079796
runs/Sep03_05-13-01_efae7a241144/events.out.tfevents.1756876386.efae7a241144.913.8 CHANGED

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
-oid sha256:
-size

version https://git-lfs.github.com/spec/v1
+oid sha256:127fb140886d8ce9885487faff8061af0e97a2f4e086804da2ba84cb76ea7de6
+size 43368
training_args.bin CHANGED

@@ -1,3 +1,3 @@
version https://git-lfs.github.com/spec/v1
-oid sha256:
size 5777

version https://git-lfs.github.com/spec/v1
+oid sha256:daf49ea6f68209100723d59749d72f3ea007e168cd6818f3f821b84dfc0a0f40
size 5777
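The changed binary files above are git-lfs pointer files, where `oid` is the SHA-256 of the blob's contents. After fetching, a file can be verified against its pointer with a small sketch like this (the helper name `lfs_oid` is ours):

```python
import hashlib

def lfs_oid(path, chunk_size=1 << 20):
    """SHA-256 of a file's contents, i.e. the oid a git-lfs pointer stores."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        while chunk := fh.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# After `git lfs pull`, lfs_oid("model.safetensors") should equal the
# pointer's oid, bc6571a9eb873d56f59eff2b0b5ec63172aba3a9db0753d881d3d08e5d5a5bb0.
```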