# ailab-bio/PROTAC-Splitter-EncoderDecoder-lr_cosine_restarts-rand-smiles
This model is a fine-tuned version of seyonec/ChemBERTa-zinc-base-v1 on the ailab-bio/PROTAC-Splitter-Dataset. It achieves the following results on the evaluation set:
- Loss: 0.3584
- Valid: 0.9493
- Num Fragments: 2.9997
- Has Three Substructures: 0.9997
- Has All Attachment Points: 0.9939
- Reassembly: 0.6015
- Reassembly Nostereo: 0.6357
- All Ligands Equal: 0.5946
- Tanimoto Similarity: 0.0
- Heavy Atoms Difference: 4.2382
- Heavy Atoms Difference Norm: 0.0570
- Poi Valid: 0.9567
- Poi Has Attachment Point(s): 0.9567
- Poi Equal: 0.7927
- Poi Tanimoto Similarity: 0.0
- Poi Heavy Atoms Difference: 1.3882
- Poi Heavy Atoms Difference Norm: 0.0441
- Poi Graph Edit Distance: inf
- Poi Graph Edit Distance Norm: inf
- Linker Valid: 0.9983
- Linker Has Attachment Point(s): 0.9983
- Linker Equal: 0.8512
- Linker Tanimoto Similarity: 0.0
- Linker Heavy Atoms Difference: 0.2012
- Linker Heavy Atoms Difference Norm: 0.0036
- Linker Graph Edit Distance: inf
- Linker Graph Edit Distance Norm: inf
- E3 Valid: 0.9934
- E3 Has Attachment Point(s): 0.9934
- E3 Equal: 0.8282
- E3 Tanimoto Similarity: 0.0
- E3 Heavy Atoms Difference: 0.1883
- E3 Heavy Atoms Difference Norm: 0.0026
- E3 Graph Edit Distance: inf
- E3 Graph Edit Distance Norm: inf
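Several of the structural metrics above can be computed directly from generated SMILES strings. As an illustrative sketch (not the evaluation code used for this card), the fragment count and a set-based Tanimoto similarity over fingerprint bits look like this:

```python
from typing import Set


def num_fragments(smiles: str) -> int:
    """Count disconnected fragments in a SMILES string.

    Fragments are separated by '.', so a well-formed prediction of the
    three PROTAC parts (POI ligand, linker, E3 ligand) yields 3.
    """
    return len([part for part in smiles.split(".") if part])


def tanimoto(a: Set[int], b: Set[int]) -> float:
    """Tanimoto (Jaccard) similarity between two fingerprint bit sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)


# Hypothetical three-fragment prediction (dot-separated SMILES).
pred = "CC(=O)N.OCCO.c1ccccc1"
assert num_fragments(pred) == 3
```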
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 800
- training_steps: 100000
- mixed_precision_training: Native AMP
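The `cosine_with_restarts` schedule corresponds to transformers' `get_cosine_with_hard_restarts_schedule_with_warmup`: linear warmup over the first 800 steps, then cosine decay that restarts at each cycle boundary. A minimal pure-Python sketch of the learning-rate multiplier (assuming the default of one cycle):

```python
import math


def cosine_with_restarts_lambda(step: int,
                                warmup_steps: int = 800,
                                total_steps: int = 100_000,
                                num_cycles: int = 1) -> float:
    """LR multiplier: linear warmup, then cosine decay with hard restarts."""
    if step < warmup_steps:
        # Linear ramp from 0 to 1 over the warmup phase.
        return step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    if progress >= 1.0:
        return 0.0
    # Each cycle traverses one half-period of the cosine, then restarts at 1.
    return max(0.0, 0.5 * (1.0 + math.cos(math.pi * ((num_cycles * progress) % 1.0))))
```

With `learning_rate=5e-05`, the actual LR at any step is `5e-05 * cosine_with_restarts_lambda(step)`: it peaks at the end of warmup and decays to zero by step 100000.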
### Training results
| Training Loss | Epoch | Step | All Ligands Equal | E3 Equal | E3 Graph Edit Distance | E3 Graph Edit Distance Norm | E3 Has Attachment Point(s) | E3 Heavy Atoms Difference | E3 Heavy Atoms Difference Norm | E3 Tanimoto Similarity | E3 Valid | Has All Attachment Points | Has Three Substructures | Heavy Atoms Difference | Heavy Atoms Difference Norm | Linker Equal | Linker Graph Edit Distance | Linker Graph Edit Distance Norm | Linker Has Attachment Point(s) | Linker Heavy Atoms Difference | Linker Heavy Atoms Difference Norm | Linker Tanimoto Similarity | Linker Valid | Validation Loss | Num Fragments | Poi Equal | Poi Graph Edit Distance | Poi Graph Edit Distance Norm | Poi Has Attachment Point(s) | Poi Heavy Atoms Difference | Poi Heavy Atoms Difference Norm | Poi Tanimoto Similarity | Poi Valid | Reassembly | Reassembly Nostereo | Tanimoto Similarity | Valid |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0212 | 0.4932 | 5000 | 0.4396 | 0.7734 | inf | inf | 0.9979 | 0.1237 | -0.0043 | 0.0 | 0.9979 | 0.9903 | 0.9994 | 4.6065 | 0.0591 | 0.6416 | 24787535410764872823308202666266910179524056896382338942894080.0000 | inf | 0.9975 | 0.5099 | 0.0050 | 0.0 | 0.9975 | 0.2852 | 3.0001 | 0.7208 | inf | inf | 0.9528 | 1.4706 | 0.0414 | 0.0 | 0.9528 | 0.4464 | 0.4783 | 0.0 | 0.9500 |
| 0.011 | 0.7398 | 7500 | 0.4827 | 0.7827 | inf | inf | 0.9905 | 0.5439 | 0.0117 | 0.0 | 0.9905 | 0.9858 | 0.9990 | 6.6114 | 0.0846 | 0.7075 | 27443342776203963485595838968590031296472552664861602492710912.0000 | 0.0511 | 0.9973 | 0.3255 | -0.0008 | 0.0 | 0.9973 | 0.3022 | 3.0001 | 0.7332 | inf | inf | 0.9389 | 2.2561 | 0.0673 | 0.0 | 0.9389 | 0.4896 | 0.5176 | 0.0 | 0.9294 |
| 0.007 | 0.9864 | 10000 | 0.5318 | 0.8017 | inf | inf | 0.9922 | 0.2046 | -0.0014 | 0.0 | 0.9922 | 0.9900 | 0.9991 | 4.5184 | 0.0596 | 0.7645 | inf | inf | 0.9980 | 0.3372 | 0.0044 | 0.0 | 0.9980 | 0.3093 | 2.9998 | 0.7580 | inf | inf | 0.9534 | 1.3755 | 0.0422 | 0.0 | 0.9534 | 0.5402 | 0.5728 | 0.0 | 0.9458 |
From step 70000 onward, the logged metrics follow a different column order (the same order as the evaluation results at the top of this card):

| Training Loss | Epoch | Step | Validation Loss | Linker Has Attachment Point(s) | E3 Valid | E3 Graph Edit Distance Norm | E3 Has Attachment Point(s) | E3 Heavy Atoms Difference | Poi Valid | Poi Tanimoto Similarity | Has All Attachment Points | Num Fragments | Reassembly | Poi Equal | Poi Graph Edit Distance | Linker Graph Edit Distance Norm | Linker Valid | Linker Heavy Atoms Difference Norm | Poi Has Attachment Point(s) | Valid | Poi Heavy Atoms Difference Norm | Tanimoto Similarity | Linker Tanimoto Similarity | Linker Graph Edit Distance | Linker Equal | Heavy Atoms Difference Norm | E3 Heavy Atoms Difference Norm | Heavy Atoms Difference | Poi Graph Edit Distance Norm | Linker Heavy Atoms Difference | Reassembly Nostereo | E3 Graph Edit Distance | E3 Tanimoto Similarity | Has Three Substructures | E3 Equal | Poi Heavy Atoms Difference | All Ligands Equal |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0005 | 6.9047 | 70000 | 0.3363 | 0.9977 | 0.9890 | inf | 0.9890 | 0.3207 | 0.9573 | 0.0 | 0.9891 | 2.9997 | 0.5983 | 0.7900 | inf | inf | 0.9977 | 0.0009 | 0.9573 | 0.9452 | 0.0404 | 0.0 | 0.0 | 23016997167138810478786188190104988023843767118069314732687360.0000 | 0.8439 | 0.0615 | 0.0015 | 4.7337 | inf | 0.2010 | 0.6325 | inf | 0.0 | 0.9996 | 0.8273 | 1.3261 | 0.5905 |
| 0.0005 | 7.1513 | 72500 | 0.3347 | 0.9974 | 0.9887 | inf | 0.9887 | 0.3786 | 0.9609 | 0.0 | 0.9926 | 3.0003 | 0.5987 | 0.7891 | inf | inf | 0.9974 | 0.0033 | 0.9609 | 0.9487 | 0.0401 | 0.0 | 0.0 | 25672804532577903995569209904347871257364201785538851047997440.0000 | 0.8407 | 0.0588 | 0.0044 | 4.5035 | inf | 0.2400 | 0.6343 | inf | 0.0 | 0.9994 | 0.8260 | 1.3144 | 0.5911 |
| 0.0008 | 7.3979 | 75000 | 0.3530 | 0.9965 | 0.9897 | inf | 0.9897 | 0.2380 | 0.9452 | 0.0 | 0.9885 | 2.9999 | 0.5951 | 0.7843 | inf | inf | 0.9965 | 0.0028 | 0.9452 | 0.9335 | 0.0534 | 0.0 | 0.0 | inf | 0.8362 | 0.0716 | -0.0004 | 5.4047 | inf | 0.2346 | 0.6308 | inf | 0.0 | 0.9992 | 0.8239 | 1.6527 | 0.5876 |
| 0.0016 | 7.6445 | 77500 | 0.3554 | 0.9958 | 0.9887 | inf | 0.9887 | 0.4638 | 0.9404 | 0.0 | 0.9874 | 3.0001 | 0.5911 | 0.7801 | inf | inf | 0.9958 | 0.0081 | 0.9404 | 0.9281 | 0.0595 | 0.0 | 0.0 | 41607648725212467950762725601724932775058748689346341705351168.0000 | 0.8252 | 0.0795 | 0.0085 | 5.9789 | inf | 0.3348 | 0.6265 | inf | 0.0 | 0.9988 | 0.8217 | 1.7751 | 0.5832 |
| 0.0004 | 7.8911 | 80000 | 0.3551 | 0.9988 | 0.9888 | inf | 0.9888 | 0.4881 | 0.9533 | 0.0 | 0.9912 | 2.9999 | 0.6013 | 0.7877 | inf | inf | 0.9988 | -0.0009 | 0.9533 | 0.9417 | 0.0421 | 0.0 | 0.0 | inf | 0.8503 | 0.0657 | 0.0099 | 5.0296 | inf | 0.1533 | 0.6381 | inf | 0.0 | 0.9997 | 0.8263 | 1.4026 | 0.5946 |
| 0.0005 | 8.1377 | 82500 | 0.3597 | 0.9981 | 0.9889 | inf | 0.9889 | 0.1790 | 0.9525 | 0.0 | 0.9868 | 3.0002 | 0.6001 | 0.7882 | inf | inf | 0.9981 | 0.0018 | 0.9525 | 0.9414 | 0.0426 | 0.0 | 0.0 | 19475920679886688644237544649700905829055126460433539077767168.0000 | 0.8470 | 0.0655 | 0.0028 | 4.8411 | inf | 0.1896 | 0.6345 | inf | 0.0 | 0.9995 | 0.8283 | 1.3738 | 0.5931 |
| 0.0006 | 8.3843 | 85000 | 0.3532 | 0.9969 | 0.9912 | inf | 0.9912 | 0.1049 | 0.9551 | 0.0 | 0.9896 | 2.9999 | 0.5975 | 0.7880 | inf | inf | 0.9969 | 0.0045 | 0.9551 | 0.9444 | 0.0424 | 0.0 | 0.0 | inf | 0.8409 | 0.0589 | -0.0023 | 4.4111 | inf | 0.2537 | 0.6296 | inf | 0.0 | 0.9997 | 0.8255 | 1.3678 | 0.5902 |
| 0.0015 | 8.6309 | 87500 | 0.3422 | 0.9975 | 0.9941 | inf | 0.9941 | 0.1578 | 0.9388 | 0.0 | 0.9927 | 2.9999 | 0.5950 | 0.7813 | inf | inf | 0.9975 | 0.0093 | 0.9388 | 0.9317 | 0.0561 | 0.0 | 0.0 | 24787535410764872823308202666266910179524056896382338942894080.0000 | 0.8415 | 0.0740 | -0.0025 | 5.6405 | inf | 0.3237 | 0.6294 | inf | 0.0 | 0.9999 | 0.8237 | 1.7777 | 0.5883 |
| 0.0004 | 8.8775 | 90000 | 0.3554 | 0.9982 | 0.9916 | inf | 0.9916 | 0.1501 | 0.9603 | 0.0 | 0.9942 | 2.9996 | 0.6016 | 0.7898 | inf | inf | 0.9982 | 0.0017 | 0.9603 | 0.9511 | 0.0345 | 0.0 | 0.0 | 17705382436260626299715530173538983673374836682120514867560448.0000 | 0.8493 | 0.0532 | -0.0026 | 4.1205 | inf | 0.1871 | 0.6375 | inf | 0.0 | 0.9996 | 0.8269 | 1.1763 | 0.5943 |
| 0.0004 | 9.1241 | 92500 | 0.3515 | 0.9974 | 0.9851 | inf | 0.9851 | 0.4341 | 0.9510 | 0.0 | 0.9898 | 2.9996 | 0.6034 | 0.7930 | inf | inf | 0.9974 | 0.0031 | 0.9510 | 0.9348 | 0.0433 | 0.0 | 0.0 | inf | 0.8492 | 0.0691 | 0.0076 | 5.2700 | inf | 0.2049 | 0.6379 | inf | 0.0 | 0.9996 | 0.8267 | 1.4062 | 0.5962 |
| 0.0006 | 9.3707 | 95000 | 0.3472 | 0.9978 | 0.9886 | inf | 0.9886 | 0.3374 | 0.9533 | 0.0 | 0.9892 | 2.9997 | 0.5980 | 0.7876 | inf | inf | 0.9978 | 0.0030 | 0.9533 | 0.9408 | 0.0424 | 0.0 | 0.0 | inf | 0.8475 | 0.0656 | 0.0104 | 4.8067 | inf | 0.2247 | 0.6291 | inf | 0.0 | 0.9997 | 0.8252 | 1.2974 | 0.5910 |
| 0.0009 | 9.6173 | 97500 | 0.3462 | 0.9970 | 0.9955 | inf | 0.9955 | 0.2014 | 0.9459 | 0.0 | 0.9920 | 2.9999 | 0.5945 | 0.7863 | inf | inf | 0.9970 | -0.0004 | 0.9459 | 0.9390 | 0.0480 | 0.0 | 0.0 | 30099150141643059856874246094752676646564926231321411573514240.0000 | 0.8381 | 0.0651 | 0.0015 | 4.9979 | inf | 0.1987 | 0.6289 | inf | 0.0 | 0.9999 | 0.8247 | 1.5948 | 0.5873 |
| 0.0004 | 9.8639 | 100000 | 0.3584 | 0.9983 | 0.9934 | inf | 0.9934 | 0.1883 | 0.9567 | 0.0 | 0.9939 | 2.9997 | 0.6015 | 0.7927 | inf | inf | 0.9983 | 0.0036 | 0.9567 | 0.9493 | 0.0441 | 0.0 | 0.0 | inf | 0.8512 | 0.0570 | 0.0026 | 4.2382 | inf | 0.2012 | 0.6357 | inf | 0.0 | 0.9997 | 0.8282 | 1.3882 | 0.5946 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1