# ailab-bio/PROTAC-Splitter-EncoderDecoder-lr_cosine_restarts

This model is a fine-tuned version of seyonec/ChemBERTa-zinc-base-v1 on the ailab-bio/PROTAC-Splitter-Dataset dataset. It achieves the following results on the evaluation set:
- Loss: 0.3421
- Linker Heavy Atoms Difference Norm: 0.0015
- Poi Valid: 0.9263
- E3 Graph Edit Distance: inf
- Poi Graph Edit Distance: 7.3743e62
- Poi Has Attachment Point(s): 0.9263
- E3 Has Attachment Point(s): 0.9785
- Poi Heavy Atoms Difference Norm: 0.0735
- Poi Equal: 0.7917
- Linker Tanimoto Similarity: 0.0
- Linker Graph Edit Distance: inf
- Linker Equal: 0.8455
- Linker Valid: 0.9975
- Reassembly: 0.6035
- Has All Attachment Points: 0.9868
- Heavy Atoms Difference Norm: 0.1010
- Reassembly Nostereo: 0.6258
- E3 Graph Edit Distance Norm: inf
- All Ligands Equal: 0.5964
- E3 Equal: 0.8260
- E3 Valid: 0.9785
- E3 Heavy Atoms Difference Norm: 0.0174
- Heavy Atoms Difference: 7.5459
- E3 Tanimoto Similarity: 0.0
- Has Three Substructures: 0.9998
- E3 Heavy Atoms Difference: 0.6771
- Linker Heavy Atoms Difference: 0.2282
- Linker Graph Edit Distance Norm: inf
- Linker Has Attachment Point(s): 0.9975
- Valid: 0.9041
- Poi Tanimoto Similarity: 0.0
- Poi Heavy Atoms Difference: 2.1196
- Tanimoto Similarity: 0.0
- Num Fragments: 3.0002
- Poi Graph Edit Distance Norm: inf

Several of the graph edit distance metrics report `inf` or astronomically large values (on the order of 1e61 to 1e62); these look like sentinel or overflow values from graph-edit-distance computations that failed or timed out, rather than meaningful distances.
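Below is a minimal inference sketch, not taken from the model authors. It assumes the checkpoint loads as a standard `transformers` `EncoderDecoderModel` and that the model generates the SMILES of the three PROTAC substructures (POI ligand, linker, E3 ligand) from an input PROTAC SMILES, as the metrics above suggest.

```python
from transformers import AutoTokenizer, EncoderDecoderModel

# Assumption: the checkpoint follows the standard EncoderDecoderModel layout.
model_id = "ailab-bio/PROTAC-Splitter-EncoderDecoder-lr_cosine_restarts"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = EncoderDecoderModel.from_pretrained(model_id)

# Hypothetical input: a PROTAC SMILES string to be split into POI ligand,
# linker, and E3 ligand substructures.
protac_smiles = "CC(C)..."  # replace with a real PROTAC SMILES

inputs = tokenizer(protac_smiles, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=512)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```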
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 800
- training_steps: 100000
- mixed_precision_training: Native AMP
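
As a sketch of how this optimizer/schedule pairing maps onto `transformers` utilities (the Trainer's `cosine_with_restarts` scheduler type corresponds to `get_cosine_with_hard_restarts_schedule_with_warmup`); the number of restart cycles is an assumption, since the card does not record it:

```python
import torch
from transformers import (
    EncoderDecoderModel,
    get_cosine_with_hard_restarts_schedule_with_warmup,
)

model = EncoderDecoderModel.from_pretrained(
    "ailab-bio/PROTAC-Splitter-EncoderDecoder-lr_cosine_restarts"
)

# Adam with the learning rate, betas, and epsilon listed above.
optimizer = torch.optim.Adam(model.parameters(), lr=5e-5, betas=(0.9, 0.999), eps=1e-8)

# cosine_with_restarts with 800 warmup steps over 100k training steps.
# num_cycles (the restart count) is NOT stated on this card; 3 is a placeholder.
scheduler = get_cosine_with_hard_restarts_schedule_with_warmup(
    optimizer,
    num_warmup_steps=800,
    num_training_steps=100_000,
    num_cycles=3,
)
```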

### Training results

| Training Loss | Epoch | Step | All Ligands Equal | E3 Equal | E3 Graph Edit Distance | E3 Graph Edit Distance Norm | E3 Has Attachment Point(s) | E3 Heavy Atoms Difference | E3 Heavy Atoms Difference Norm | E3 Tanimoto Similarity | E3 Valid | Has All Attachment Points | Has Three Substructures | Heavy Atoms Difference | Heavy Atoms Difference Norm | Linker Equal | Linker Graph Edit Distance | Linker Graph Edit Distance Norm | Linker Has Attachment Point(s) | Linker Heavy Atoms Difference | Linker Heavy Atoms Difference Norm | Linker Tanimoto Similarity | Linker Valid | Validation Loss | Num Fragments | Poi Equal | Poi Graph Edit Distance | Poi Graph Edit Distance Norm | Poi Has Attachment Point(s) | Poi Heavy Atoms Difference | Poi Heavy Atoms Difference Norm | Poi Tanimoto Similarity | Poi Valid | Reassembly | Reassembly Nostereo | Tanimoto Similarity | Valid |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0156 | 0.4932 | 5000 | 0.4596 | 0.7815 | inf | inf | 0.9916 | 0.3913 | 0.0061 | 0.0 | 0.9916 | 0.9863 | 0.9988 | 7.0030 | 0.0918 | 0.6623 | 3.5411e61 | inf | 0.9965 | 0.3805 | 0.0005 | 0.0 | 0.9965 | 0.2891 | 3.0006 | 0.7265 | inf | inf | 0.9252 | 2.3129 | 0.0739 | 0.0 | 0.9252 | 0.4665 | 0.4911 | 0.0 | 0.9169 |
| 0.0077 | 0.7398 | 7500 | 0.4898 | 0.7908 | inf | inf | 0.9857 | 0.4845 | 0.0088 | 0.0 | 0.9857 | 0.9812 | 0.9995 | 8.2839 | 0.1093 | 0.7095 | 5.6657e61 | inf | 0.9943 | 0.4483 | 0.0090 | 0.0 | 0.9943 | 0.3073 | 3.0002 | 0.7363 | inf | inf | 0.9161 | 2.5750 | 0.0832 | 0.0 | 0.9161 | 0.4975 | 0.5220 | 0.0 | 0.8987 |
| 0.0046 | 0.9864 | 10000 | 0.5462 | 0.8045 | inf | inf | 0.9896 | 0.5553 | 0.0131 | 0.0 | 0.9896 | 0.9836 | 0.9991 | 7.0102 | 0.0939 | 0.7666 | inf | inf | 0.9961 | 0.3144 | 0.0033 | 0.0 | 0.9961 | 0.3086 | 2.9998 | 0.7680 | inf | inf | 0.9272 | 2.1208 | 0.0719 | 0.0 | 0.9272 | 0.5544 | 0.5796 | 0.0 | 0.9157 |
| 0.0003 | 6.9047 | 70000 | 0.5914 | 0.8266 | inf | inf | 0.9850 | 0.5428 | 0.0135 | 0.0 | 0.9850 | 0.9859 | 1.0 | 7.3665 | 0.0987 | 0.8408 | 2.8329e61 | inf | 0.9972 | 0.2086 | 0.0013 | 0.0 | 0.9972 | 0.3416 | 3.0 | 0.7900 | inf | inf | 0.9247 | 2.1749 | 0.0743 | 0.0 | 0.9247 | 0.5992 | 0.6233 | 0.0 | 0.9082 |
| 0.0004 | 7.1513 | 72500 | 0.5942 | 0.8275 | inf | inf | 0.9849 | 0.5763 | 0.0122 | 0.0 | 0.9849 | 0.9846 | 1.0 | 7.1871 | 0.0960 | 0.8451 | inf | inf | 0.9971 | 0.2224 | 0.0027 | 0.0 | 0.9971 | 0.3364 | 3.0 | 0.7934 | inf | inf | 0.9271 | 2.0791 | 0.0707 | 0.0 | 0.9271 | 0.6019 | 0.6242 | 0.0 | 0.9109 |
| 0.0005 | 7.3979 | 75000 | 0.5900 | 0.8249 | inf | inf | 0.9839 | 0.6882 | 0.0187 | 0.0 | 0.9839 | 0.9873 | 0.9997 | 7.4559 | 0.0985 | 0.8362 | inf | inf | 0.9971 | 0.2505 | 0.0030 | 0.0 | 0.9971 | 0.3489 | 2.9997 | 0.7870 | inf | inf | 0.9274 | 2.1431 | 0.0704 | 0.0 | 0.9274 | 0.5971 | 0.6222 | 0.0 | 0.9108 |
| 0.0012 | 7.6445 | 77500 | 0.5848 | 0.8250 | inf | inf | 0.9820 | 0.6424 | 0.0149 | 0.0 | 0.9820 | 0.9864 | 0.9997 | 7.4127 | 0.0983 | 0.8247 | inf | inf | 0.9965 | 0.3465 | 0.0072 | 0.0 | 0.9965 | 0.3375 | 2.9997 | 0.7898 | inf | inf | 0.9271 | 2.2373 | 0.0760 | 0.0 | 0.9271 | 0.5921 | 0.6127 | 0.0 | 0.9095 |
| 0.0003 | 7.8911 | 80000 | 0.5955 | 0.8263 | inf | inf | 0.9783 | 0.7170 | 0.0179 | 0.0 | 0.9783 | 0.9868 | 0.9995 | 7.4869 | 0.0997 | 0.8451 | inf | inf | 0.9971 | 0.2144 | 0.0019 | 0.0 | 0.9971 | 0.3411 | 2.9998 | 0.7928 | inf | inf | 0.9295 | 2.1623 | 0.0750 | 0.0 | 0.9295 | 0.6030 | 0.6278 | 0.0 | 0.9079 |
| 0.0004 | 8.1377 | 82500 | 0.5924 | 0.8275 | inf | inf | 0.9796 | 0.6833 | 0.0176 | 0.0 | 0.9796 | 0.9863 | 0.9999 | 7.7488 | 0.1033 | 0.8377 | inf | inf | 0.9966 | 0.2141 | 0.0009 | 0.0 | 0.9966 | 0.3505 | 3.0001 | 0.7927 | inf | inf | 0.9268 | 2.1771 | 0.0742 | 0.0 | 0.9268 | 0.5997 | 0.6243 | 0.0 | 0.9047 |
| 0.0005 | 8.3843 | 85000 | 0.5915 | 0.8260 | inf | inf | 0.9818 | 0.6591 | 0.0158 | 0.0 | 0.9818 | 0.9820 | 0.9996 | 8.1167 | 0.1082 | 0.8387 | inf | inf | 0.9964 | 0.2392 | 0.0021 | 0.0 | 0.9964 | 0.3462 | 3.0 | 0.7883 | inf | inf | 0.9173 | 2.4100 | 0.0824 | 0.0 | 0.9173 | 0.5984 | 0.6215 | 0.0 | 0.8986 |
| 0.0009 | 8.6309 | 87500 | 0.5887 | 0.8239 | inf | inf | 0.9796 | 0.6269 | 0.0165 | 0.0 | 0.9796 | 0.9834 | 0.9996 | 7.8794 | 0.1044 | 0.8345 | inf | inf | 0.9972 | 0.2329 | 0.0020 | 0.0 | 0.9972 | 0.3325 | 2.9996 | 0.7900 | inf | inf | 0.9225 | 2.3541 | 0.0794 | 0.0 | 0.9225 | 0.5966 | 0.6196 | 0.0 | 0.9019 |
| 0.0003 | 8.8775 | 90000 | 0.5953 | 0.8287 | inf | inf | 0.9801 | 0.6809 | 0.0175 | 0.0 | 0.9801 | 0.9847 | 0.9999 | 7.6187 | 0.1014 | 0.8446 | 2.5673e61 | inf | 0.9974 | 0.2217 | 0.0020 | 0.0 | 0.9974 | 0.3373 | 2.9999 | 0.7946 | inf | inf | 0.9258 | 2.1887 | 0.0744 | 0.0 | 0.9258 | 0.6029 | 0.6285 | 0.0 | 0.9058 |
| 0.0003 | 9.1241 | 92500 | 0.5941 | 0.8295 | inf | inf | 0.9817 | 0.6020 | 0.0157 | 0.0 | 0.9817 | 0.9828 | 0.9997 | 7.3307 | 0.0971 | 0.8426 | 3.1870e61 | inf | 0.9968 | 0.2689 | 0.0040 | 0.0 | 0.9968 | 0.3412 | 2.9997 | 0.7867 | inf | inf | 0.9302 | 2.0255 | 0.0684 | 0.0 | 0.9302 | 0.6011 | 0.6213 | 0.0 | 0.9106 |
| 0.0004 | 9.3707 | 95000 | 0.5911 | 0.8257 | inf | inf | 0.9796 | 0.6888 | 0.0189 | 0.0 | 0.9796 | 0.9868 | 0.9996 | 7.5129 | 0.1005 | 0.8352 | 2.3902e61 | inf | 0.9976 | 0.3183 | 0.0071 | 0.0 | 0.9976 | 0.3420 | 2.9998 | 0.7880 | inf | inf | 0.9275 | 2.1638 | 0.0735 | 0.0 | 0.9275 | 0.5992 | 0.6217 | 0.0 | 0.9073 |
| 0.001 | 9.6173 | 97500 | 0.5900 | 0.8252 | inf | inf | 0.9771 | 0.7837 | 0.0217 | 0.0 | 0.9771 | 0.9896 | 0.9993 | 8.1480 | 0.1080 | 0.8340 | 3.5411e61 | inf | 0.9965 | 0.2400 | 0.0020 | 0.0 | 0.9965 | 0.3328 | 3.0004 | 0.7890 | inf | inf | 0.9207 | 2.2722 | 0.0780 | 0.0 | 0.9207 | 0.5977 | 0.6202 | 0.0 | 0.8982 |
| 0.0003 | 9.8639 | 100000 | 0.5964 | 0.8260 | inf | inf | 0.9785 | 0.6771 | 0.0174 | 0.0 | 0.9785 | 0.9868 | 0.9998 | 7.5459 | 0.1010 | 0.8455 | inf | inf | 0.9975 | 0.2282 | 0.0015 | 0.0 | 0.9975 | 0.3421 | 3.0002 | 0.7917 | 7.3743e62 | inf | 0.9263 | 2.1196 | 0.0735 | 0.0 | 0.9263 | 0.6035 | 0.6258 | 0.0 | 0.9041 |
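
The card does not document how the Tanimoto similarity metrics are computed (they are uniformly 0.0 above, which the card leaves unexplained). For orientation only, here is a common RDKit recipe for Tanimoto similarity between two SMILES; the function name and fingerprint settings are illustrative assumptions, not the authors' evaluation code:

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def tanimoto_similarity(smiles_a: str, smiles_b: str) -> float:
    """Tanimoto similarity between two SMILES via 2048-bit Morgan fingerprints."""
    mol_a, mol_b = Chem.MolFromSmiles(smiles_a), Chem.MolFromSmiles(smiles_b)
    if mol_a is None or mol_b is None:
        return 0.0  # treat unparseable SMILES as zero similarity
    fp_a = AllChem.GetMorganFingerprintAsBitVect(mol_a, radius=2, nBits=2048)
    fp_b = AllChem.GetMorganFingerprintAsBitVect(mol_b, radius=2, nBits=2048)
    return DataStructs.TanimotoSimilarity(fp_a, fp_b)
```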

### Framework versions

- Transformers 4.44.2
- PyTorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1