100epoch_test_march19

This model is a fine-tuned version of nielsr/lilt-xlm-roberta-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3000
  • Precision: 0.9344
  • Recall: 0.9317
  • F1: 0.9331
  • Accuracy: 0.9725

Model description

More information needed
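
While the card itself gives no description, the base model (LiLT with an XLM-RoBERTa backbone) and the precision/recall/F1 metrics point to a document-layout token-classification checkpoint. Below is a minimal, hypothetical inference sketch; the words and bounding boxes are placeholders, and it assumes the repository ships its tokenizer and label mapping:

```python
# A hypothetical inference sketch (not from the card): LiLT token classification
# expects token ids plus word bounding boxes normalized to a 0-1000 page scale.
import torch
from transformers import AutoTokenizer, LiltForTokenClassification

repo = "bashyaldhiraj2067/100epoch_test_march19"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = LiltForTokenClassification.from_pretrained(repo)

# Placeholder OCR output; real inputs would come from an OCR engine.
words = ["Invoice", "No.", "12345"]
boxes = [[70, 50, 160, 70], [165, 50, 210, 70], [215, 50, 280, 70]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Expand word-level boxes to token level; special tokens get a zero box.
bbox = [[0, 0, 0, 0] if wid is None else boxes[wid] for wid in encoding.word_ids()]
encoding["bbox"] = torch.tensor([bbox])

with torch.no_grad():
    logits = model(**encoding).logits
predictions = [model.config.id2label[i] for i in logits.argmax(-1)[0].tolist()]
print(list(zip(tokenizer.convert_ids_to_tokens(encoding["input_ids"][0]), predictions)))
```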

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 100
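
For reference, here is a hedged reconstruction of this configuration as transformers TrainingArguments; the model, datasets, and collator wiring are not given by the card, and the eval/logging cadences are inferred from the results table below:

```python
# A sketch of the training configuration listed above, assuming a standard
# transformers Trainer setup; placeholders are noted in comments.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="100epoch_test_march19",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",            # AdamW with betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=100,                 # evaluation every 100 steps, per the table
    logging_steps=500,              # training loss logged every 500 steps
)
# args would then be passed to transformers.Trainer together with the model,
# tokenizer, datasets, and a token-classification data collator.
```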

Training results

Training Loss Epoch Step Accuracy F1 Validation Loss Precision Recall
No log 0.7937 100 0.9349 0.8465 0.2505 0.8337 0.8596
No log 1.5873 200 0.9581 0.8996 0.1602 0.8918 0.9075
No log 2.3810 300 0.9663 0.9162 0.1457 0.8993 0.9336
No log 3.1746 400 0.9682 0.9204 0.1426 0.9291 0.9119
0.2582 3.9683 500 0.9675 0.9205 0.1340 0.9289 0.9123
0.2582 4.7619 600 0.9714 0.9300 0.1387 0.9318 0.9281
0.2582 5.5556 700 0.9702 0.9284 0.1456 0.9283 0.9285
0.2582 6.3492 800 0.9696 0.9247 0.1438 0.9278 0.9216
0.2582 7.1429 900 0.9687 0.9218 0.1592 0.9349 0.9090
0.0527 7.9365 1000 0.9714 0.9280 0.1470 0.9255 0.9306
0.0527 8.7302 1100 0.9706 0.9286 0.1570 0.9322 0.9250
0.0527 9.5238 1200 0.9718 0.9301 0.1627 0.9295 0.9308
0.0527 10.3175 1300 0.9687 0.9215 0.1997 0.9200 0.9229
0.0527 11.1111 1400 0.9723 0.9330 0.1864 0.9296 0.9365
0.0296 11.9048 1500 0.9699 0.9249 0.1912 0.9277 0.9222
0.0296 12.6984 1600 0.9710 0.9282 0.1834 0.9288 0.9277
0.0296 13.4921 1700 0.9696 0.9264 0.2040 0.9275 0.9252
0.0296 14.2857 1800 0.9686 0.9260 0.2004 0.9315 0.9207
0.0296 15.0794 1900 0.9689 0.9229 0.1982 0.9233 0.9226
0.0182 15.8730 2000 0.9700 0.9262 0.2130 0.9259 0.9266
0.0182 16.6667 2100 0.9708 0.9274 0.1958 0.9297 0.9252
0.0182 17.4603 2200 0.9693 0.9233 0.2237 0.9298 0.9168
0.0182 18.2540 2300 0.9691 0.9245 0.2224 0.9276 0.9214
0.0182 19.0476 2400 0.9728 0.9333 0.2082 0.9359 0.9308
0.0138 19.8413 2500 0.9709 0.9284 0.2172 0.9267 0.9302
0.0138 20.6349 2600 0.9718 0.9313 0.1997 0.9347 0.9279
0.0138 21.4286 2700 0.9687 0.9240 0.2395 0.9297 0.9184
0.0138 22.2222 2800 0.9698 0.9254 0.2342 0.9251 0.9256
0.0138 23.0159 2900 0.9709 0.9281 0.2079 0.9311 0.9250
0.0092 23.8095 3000 0.9693 0.9245 0.2388 0.9219 0.9271
0.0092 24.6032 3100 0.9687 0.9237 0.2426 0.9190 0.9285
0.0092 25.3968 3200 0.9706 0.9268 0.2179 0.9299 0.9237
0.0092 26.1905 3300 0.9704 0.9259 0.2348 0.9278 0.9241
0.0092 26.9841 3400 0.9696 0.9257 0.2414 0.9323 0.9191
0.0062 27.7778 3500 0.9693 0.9244 0.2320 0.9232 0.9256
0.0062 28.5714 3600 0.9700 0.9246 0.2327 0.9227 0.9266
0.0062 29.3651 3700 0.9691 0.9245 0.2459 0.9232 0.9258
0.0062 30.1587 3800 0.9709 0.9279 0.2185 0.9304 0.9254
0.0062 30.9524 3900 0.9719 0.9290 0.2355 0.9305 0.9275
0.0045 31.7460 4000 0.9725 0.9316 0.2271 0.9324 0.9308
0.0045 32.5397 4100 0.9710 0.9270 0.2464 0.9265 0.9275
0.0045 33.3333 4200 0.9693 0.9253 0.2522 0.9254 0.9252
0.0045 34.1270 4300 0.9693 0.9259 0.2315 0.9217 0.9300
0.0045 34.9206 4400 0.9700 0.9258 0.2301 0.9261 0.9254
0.0039 35.7143 4500 0.9694 0.9242 0.2511 0.9291 0.9193
0.0039 36.5079 4600 0.9690 0.9236 0.2425 0.9232 0.9241
0.0039 37.3016 4700 0.9664 0.9167 0.2968 0.9257 0.9079
0.0039 38.0952 4800 0.9684 0.9223 0.2725 0.9201 0.9245
0.0039 38.8889 4900 0.9690 0.9225 0.2881 0.9252 0.9199
0.0032 39.6825 5000 0.9678 0.9240 0.2804 0.9231 0.9249
0.0032 40.4762 5100 0.9687 0.9248 0.2689 0.9297 0.9201
0.0032 41.2698 5200 0.9706 0.9276 0.2727 0.9256 0.9296
0.0032 42.0635 5300 0.9714 0.9293 0.2650 0.9299 0.9287
0.0032 42.8571 5400 0.9699 0.9261 0.2633 0.9292 0.9231
0.003 43.6508 5500 0.9710 0.9289 0.2668 0.9327 0.9250
0.003 44.4444 5600 0.9701 0.9281 0.2654 0.9314 0.9249
0.003 45.2381 5700 0.9704 0.9275 0.2701 0.9351 0.9201
0.003 46.0317 5800 0.9730 0.9332 0.2541 0.9329 0.9334
0.003 46.8254 5900 0.9715 0.9298 0.2486 0.9310 0.9287
0.0026 47.6190 6000 0.9726 0.9318 0.2248 0.9392 0.9245
0.0026 48.4127 6100 0.9716 0.9305 0.2562 0.9297 0.9313
0.0026 49.2063 6200 0.9730 0.9341 0.2648 0.9345 0.9336
0.0026 50.0 6300 0.9714 0.9296 0.2561 0.9308 0.9285
0.0026 50.7937 6400 0.9733 0.9331 0.2531 0.9359 0.9304
0.0014 51.5873 6500 0.9712 0.9290 0.2676 0.9356 0.9226
0.0014 52.3810 6600 0.9680 0.9204 0.2842 0.9276 0.9134
0.0014 53.1746 6700 0.9717 0.9311 0.2731 0.9274 0.9350
0.0014 53.9683 6800 0.9719 0.9306 0.2665 0.9319 0.9292
0.0014 54.7619 6900 0.9719 0.9307 0.2690 0.9294 0.9319
0.0017 55.5556 7000 0.9727 0.9319 0.2668 0.9326 0.9311
0.0017 56.3492 7100 0.9707 0.9268 0.2867 0.9219 0.9317
0.0017 57.1429 7200 0.9721 0.9324 0.2771 0.9306 0.9342
0.0017 57.9365 7300 0.9723 0.9336 0.2570 0.9372 0.9300
0.0017 58.7302 7400 0.9735 0.9348 0.2649 0.9363 0.9332
0.0008 59.5238 7500 0.9724 0.9327 0.2780 0.9325 0.9329
0.0008 60.3175 7600 0.9706 0.9270 0.2728 0.9263 0.9277
0.0008 61.1111 7700 0.9718 0.9315 0.2703 0.9268 0.9363
0.0008 61.9048 7800 0.9715 0.9297 0.2799 0.9288 0.9306
0.0008 62.6984 7900 0.9709 0.9275 0.2793 0.9233 0.9317
0.0018 63.4921 8000 0.9716 0.9296 0.2685 0.9271 0.9321
0.0018 64.2857 8100 0.9715 0.9301 0.2779 0.9293 0.9308
0.0018 65.0794 8200 0.9725 0.9316 0.2849 0.9314 0.9319
0.0018 65.8730 8300 0.9712 0.9291 0.2952 0.9246 0.9336
0.0018 66.6667 8400 0.9717 0.9308 0.2932 0.9250 0.9367
0.0008 67.4603 8500 0.9727 0.9320 0.2614 0.9339 0.9300
0.0008 68.2540 8600 0.9723 0.9305 0.2751 0.9311 0.9300
0.0008 69.0476 8700 0.9724 0.9306 0.2829 0.9351 0.9262
0.0008 69.8413 8800 0.9726 0.9318 0.2835 0.9294 0.9342
0.0008 70.6349 8900 0.9720 0.9302 0.2687 0.9284 0.9321
0.0007 71.4286 9000 0.9718 0.9291 0.2631 0.9294 0.9289
0.0007 72.2222 9100 0.9727 0.9316 0.2608 0.9324 0.9308
0.0007 73.0159 9200 0.9728 0.9329 0.2705 0.9307 0.9352
0.0007 73.8095 9300 0.9724 0.9319 0.2822 0.9288 0.9350
0.0007 74.6032 9400 0.9739 0.9351 0.2753 0.9361 0.9340
0.0006 75.3968 9500 0.9719 0.9304 0.2880 0.9329 0.9279
0.0006 76.1905 9600 0.9726 0.9314 0.2954 0.9290 0.9338
0.0006 76.9841 9700 0.9731 0.9334 0.2967 0.9343 0.9325
0.0006 77.7778 9800 0.9733 0.9339 0.3013 0.9327 0.9352
0.0006 78.5714 9900 0.9740 0.9355 0.2980 0.9360 0.9350
0.0002 79.3651 10000 0.9735 0.9344 0.3020 0.9334 0.9353
0.0002 80.1587 10100 0.9736 0.9363 0.2857 0.9368 0.9357
0.0002 80.9524 10200 0.9712 0.9279 0.2948 0.9214 0.9346
0.0002 81.7460 10300 0.9711 0.9277 0.2918 0.9285 0.9270
0.0002 82.5397 10400 0.9718 0.9295 0.2947 0.9268 0.9323
0.0003 83.3333 10500 0.9724 0.9311 0.2976 0.9300 0.9323
0.0003 84.1270 10600 0.9722 0.9306 0.2980 0.9302 0.9310
0.0003 84.9206 10700 0.9729 0.9327 0.2978 0.9310 0.9344
0.0003 85.7143 10800 0.9727 0.9322 0.2882 0.9318 0.9327
0.0003 86.5079 10900 0.9725 0.9316 0.2918 0.9314 0.9319
0.0003 87.3016 11000 0.9724 0.9315 0.2932 0.9305 0.9325
0.0003 88.0952 11100 0.9723 0.9314 0.2964 0.9305 0.9323
0.0003 88.8889 11200 0.9716 0.9301 0.3069 0.9307 0.9294
0.0003 89.6825 11300 0.9724 0.9315 0.3022 0.9309 0.9321
0.0003 90.4762 11400 0.9715 0.9309 0.3088 0.9330 0.9289
0.0002 91.2698 11500 0.9715 0.9307 0.3008 0.9323 0.9292
0.0002 92.0635 11600 0.9723 0.9309 0.3019 0.9314 0.9304
0.0002 92.8571 11700 0.9721 0.9305 0.3039 0.9309 0.9302
0.0002 93.6508 11800 0.9722 0.9316 0.3017 0.9345 0.9287
0.0002 94.4444 11900 0.9724 0.9326 0.2980 0.9340 0.9311
0.0003 95.2381 12000 0.9725 0.9324 0.2973 0.9346 0.9302
0.0003 96.0317 12100 0.9724 0.9324 0.2981 0.9345 0.9304
0.0003 96.8254 12200 0.9721 0.9315 0.2990 0.9337 0.9292
0.0003 97.6190 12300 0.9721 0.9318 0.2993 0.9337 0.9298
0.0003 98.4127 12400 0.9724 0.9326 0.2993 0.9343 0.9310
0.0001 99.2063 12500 0.9724 0.9328 0.2998 0.9340 0.9315
0.0001 100.0 12600 0.9725 0.9331 0.3000 0.9344 0.9317
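
The card does not show the metric code, but for token classification these precision/recall/F1/accuracy figures are conventionally produced with seqeval; the following is a sketch under that assumption:

```python
# A sketch of the conventional seqeval-based metrics for token classification;
# the actual compute_metrics used for this card is not shown. `id2label` maps
# class indices to tag strings, and -100 marks ignored (special/padding) tokens.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

def compute_metrics(eval_pred, id2label):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    true_labels = [
        [id2label[l] for l in row if l != -100] for row in labels
    ]
    true_predictions = [
        [id2label[p] for p, l in zip(p_row, l_row) if l != -100]
        for p_row, l_row in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```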

Framework versions

  • Transformers 4.48.3
  • Pytorch 2.6.0+cu124
  • Datasets 3.4.1
  • Tokenizers 0.21.1