ailab-bio/PROTAC-Splitter-EncoderDecoder-lr_cosine
This model is a fine-tuned version of seyonec/ChemBERTa-zinc-base-v1 on the ailab-bio/PROTAC-Splitter-Dataset dataset. It achieves the following results on the evaluation set:
Overall:
- Loss: 0.3829
- Valid: 0.9267
- Reassembly: 0.6055
- Reassembly Nostereo: 0.6318
- All Ligands Equal: 0.5981
- Has Three Substructures: 0.9992
- Has All Attachment Points: 0.9847
- Num Fragments: 3.0001
- Heavy Atoms Difference: 6.1064
- Heavy Atoms Difference Norm: 0.0815
- Tanimoto Similarity: 0.0
POI ligand:
- Poi Valid: 0.9394
- Poi Equal: 0.7955
- Poi Has Attachment Point(s): 0.9394
- Poi Heavy Atoms Difference: 1.8056
- Poi Heavy Atoms Difference Norm: 0.0577
- Poi Tanimoto Similarity: 0.0
- Poi Graph Edit Distance: inf
- Poi Graph Edit Distance Norm: inf
Linker:
- Linker Valid: 0.9978
- Linker Equal: 0.8496
- Linker Has Attachment Point(s): 0.9978
- Linker Heavy Atoms Difference: 0.1547
- Linker Heavy Atoms Difference Norm: -0.0018
- Linker Tanimoto Similarity: 0.0
- Linker Graph Edit Distance: inf
- Linker Graph Edit Distance Norm: inf
E3 ligand:
- E3 Valid: 0.9871
- E3 Equal: 0.8283
- E3 Has Attachment Point(s): 0.9871
- E3 Heavy Atoms Difference: 0.4042
- E3 Heavy Atoms Difference Norm: 0.0085
- E3 Tanimoto Similarity: 0.0
- E3 Graph Edit Distance: inf
- E3 Graph Edit Distance Norm: inf
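For reference, the Tanimoto similarity metrics reported above are conventionally the Jaccard index over molecular-fingerprint bit sets. The following is a minimal sketch of that definition only (the actual evaluation code and fingerprint type, e.g. RDKit Morgan fingerprints, are not specified in this card; the bit indices below are made up):

```python
def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto (Jaccard) similarity between two fingerprint bit sets:
    |A ∩ B| / |A ∪ B|, with two empty fingerprints treated as identical."""
    if not fp_a and not fp_b:
        return 1.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)

# Toy fingerprints with hypothetical bit indices:
a = {1, 5, 9, 12}
b = {1, 5, 7}
print(tanimoto(a, b))  # 2 shared bits / 5 total bits = 0.4
```

A reported value of 0.0 therefore means predicted and reference structures shared no fingerprint bits under whatever fingerprinting the evaluation used.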
Model description
Based on the model name and the evaluation metrics above, this is an encoder-decoder model fine-tuned from seyonec/ChemBERTa-zinc-base-v1 to split a PROTAC SMILES string into its three constituent substructures: the protein-of-interest (POI) ligand, the linker, and the E3 ligand. More information needed.
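Several of the structural metrics above (Num Fragments, Has Three Substructures, Has All Attachment Points) can be illustrated with plain string operations on a predicted SMILES: disconnected fragments are separated by `.`, and attachment points are conventionally marked with dummy atoms written as `*`. This is an illustrative sketch, not the card's actual evaluation code, and the example output below is hypothetical:

```python
def split_fragments(smiles: str) -> list[str]:
    """Disconnected components in a SMILES string are separated by '.'."""
    return smiles.split(".")

def has_three_substructures(smiles: str) -> bool:
    """True if the prediction contains exactly three fragments (POI, linker, E3)."""
    return len(split_fragments(smiles)) == 3

def has_attachment_point(fragment: str) -> bool:
    """Dummy atoms marking attachment points are written '*' (possibly bracketed)."""
    return "*" in fragment

# Hypothetical model output: POI ligand, linker, and E3 ligand, each with attachment points.
pred = "c1ccccc1*.*CCOCC*.*C1CCNCC1"
frags = split_fragments(pred)
print(len(frags), all(has_attachment_point(f) for f in frags))  # 3 True
```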
Intended uses & limitations
More information needed
Training and evaluation data
The model was fine-tuned and evaluated on the ailab-bio/PROTAC-Splitter-Dataset dataset (see the results above). More information needed.
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- training_steps: 100000
- mixed_precision_training: Native AMP
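The learning-rate schedule implied by the hyperparameters above (cosine decay with 100 linear warmup steps over 100,000 training steps, peaking at 5e-05) can be sketched in plain Python; this mirrors the standard formula used by transformers' `get_cosine_schedule_with_warmup` and is illustrative, not the training code itself:

```python
import math

def cosine_lr(step: int, peak_lr: float = 5e-05, warmup: int = 100,
              total: int = 100_000) -> float:
    """Learning rate at a given step: linear warmup to peak_lr over `warmup`
    steps, then cosine decay to 0 at `total` steps."""
    if step < warmup:
        return peak_lr * step / max(1, warmup)
    progress = (step - warmup) / max(1, total - warmup)
    return peak_lr * max(0.0, 0.5 * (1.0 + math.cos(math.pi * progress)))

print(cosine_lr(0))        # 0.0 (start of warmup)
print(cosine_lr(100))      # 5e-05 (peak, end of warmup)
print(cosine_lr(100_000))  # 0.0 (fully decayed at the final step)
```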
Training results
| Training Loss | Epoch | Step | All Ligands Equal | E3 Equal | E3 Graph Edit Distance | E3 Graph Edit Distance Norm | E3 Has Attachment Point(s) | E3 Heavy Atoms Difference | E3 Heavy Atoms Difference Norm | E3 Tanimoto Similarity | E3 Valid | Has All Attachment Points | Has Three Substructures | Heavy Atoms Difference | Heavy Atoms Difference Norm | Linker Equal | Linker Graph Edit Distance | Linker Graph Edit Distance Norm | Linker Has Attachment Point(s) | Linker Heavy Atoms Difference | Linker Heavy Atoms Difference Norm | Linker Tanimoto Similarity | Linker Valid | Validation Loss | Num Fragments | Poi Equal | Poi Graph Edit Distance | Poi Graph Edit Distance Norm | Poi Has Attachment Point(s) | Poi Heavy Atoms Difference | Poi Heavy Atoms Difference Norm | Poi Tanimoto Similarity | Poi Valid | Reassembly | Reassembly Nostereo | Tanimoto Similarity | Valid |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0086 | 0.4932 | 5000 | 0.4899 | 0.7832 | inf | inf | 0.9949 | 0.3592 | 0.0041 | 0.0 | 0.9949 | 0.9820 | 0.9988 | 7.4350 | 0.0974 | 0.7092 | 4.60e+61 | 0.0544 | 0.9954 | 0.6527 | 0.0200 | 0.0 | 0.9954 | 0.2931 | 3.0004 | 0.7416 | inf | inf | 0.9248 | 2.4199 | 0.0785 | 0.0 | 0.9248 | 0.4972 | 0.5238 | 0.0 | 0.9182 |
| 0.004 | 0.7398 | 7500 | 0.5381 | 0.8044 | inf | inf | 0.9924 | 0.4005 | 0.0081 | 0.0 | 0.9924 | 0.9818 | 0.9978 | 6.9868 | 0.0924 | 0.7604 | 5.93e+61 | inf | 0.9941 | 0.2498 | -0.0010 | 0.0 | 0.9941 | 0.3098 | 3.0006 | 0.7603 | inf | inf | 0.9225 | 2.2024 | 0.0709 | 0.0 | 0.9225 | 0.5451 | 0.5729 | 0.0 | 0.9147 |
| 0.003 | 0.9864 | 10000 | 0.5477 | 0.8035 | inf | inf | 0.9942 | 0.3628 | 0.0044 | 0.0 | 0.9942 | 0.9857 | 0.9983 | 6.4929 | 0.0854 | 0.7726 | inf | inf | 0.9951 | 0.3252 | 0.0050 | 0.0 | 0.9951 | 0.3124 | 3.0008 | 0.7673 | inf | inf | 0.9294 | 2.0849 | 0.0668 | 0.0 | 0.9294 | 0.5549 | 0.5845 | 0.0 | 0.9232 |
| 0.0003 | 6.9047 | 70000 | 0.5963 | 0.8272 | inf | inf | 0.9876 | 0.3279 | 0.0041 | 0.0 | 0.9876 | 0.9836 | 0.9996 | 6.5326 | 0.0869 | 0.8437 | inf | inf | 0.9971 | 0.2073 | -0.0003 | 0.0 | 0.9971 | 0.3754 | 3.0 | 0.7957 | inf | inf | 0.9355 | 1.9803 | 0.0637 | 0.0 | 0.9355 | 0.6043 | 0.6316 | 0.0 | 0.9217 |
| 0.0003 | 7.1513 | 72500 | 0.5959 | 0.8270 | inf | inf | 0.9891 | 0.3385 | 0.0052 | 0.0 | 0.9891 | 0.9861 | 0.9995 | 6.2126 | 0.0832 | 0.8452 | 2.83e+61 | inf | 0.9972 | 0.2291 | 0.0029 | 0.0 | 0.9972 | 0.3778 | 2.9996 | 0.7948 | inf | inf | 0.9374 | 1.8700 | 0.0606 | 0.0 | 0.9374 | 0.6030 | 0.6300 | 0.0 | 0.9257 |
| 0.0002 | 7.3979 | 75000 | 0.5968 | 0.8280 | inf | inf | 0.9872 | 0.4008 | 0.0091 | 0.0 | 0.9872 | 0.9868 | 0.9996 | 6.3769 | 0.0854 | 0.8485 | 2.66e+61 | inf | 0.9973 | 0.2073 | 0.0021 | 0.0 | 0.9973 | 0.3793 | 3.0004 | 0.7967 | inf | inf | 0.9372 | 1.8756 | 0.0608 | 0.0 | 0.9372 | 0.6038 | 0.6299 | 0.0 | 0.9240 |
| 0.0002 | 7.6445 | 77500 | 0.5958 | 0.8279 | inf | inf | 0.9864 | 0.4591 | 0.0097 | 0.0 | 0.9864 | 0.9822 | 0.9993 | 6.2552 | 0.0838 | 0.8461 | 2.92e+61 | inf | 0.9971 | 0.1710 | -0.0014 | 0.0 | 0.9971 | 0.3748 | 3.0004 | 0.7936 | inf | inf | 0.9388 | 1.7786 | 0.0583 | 0.0 | 0.9388 | 0.6030 | 0.6312 | 0.0 | 0.9256 |
| 0.0002 | 7.8911 | 80000 | 0.5972 | 0.8282 | inf | inf | 0.9868 | 0.3862 | 0.0076 | 0.0 | 0.9868 | 0.9819 | 0.9990 | 6.2378 | 0.0836 | 0.8482 | 2.92e+61 | inf | 0.9971 | 0.1621 | -0.0021 | 0.0 | 0.9971 | 0.3756 | 3.0003 | 0.7962 | inf | inf | 0.9376 | 1.8252 | 0.0589 | 0.0 | 0.9376 | 0.6034 | 0.6275 | 0.0 | 0.9251 |
| 0.0002 | 8.1377 | 82500 | 0.5973 | 0.8283 | inf | inf | 0.9884 | 0.3154 | 0.0055 | 0.0 | 0.9884 | 0.9845 | 0.9994 | 6.1100 | 0.0817 | 0.8495 | inf | inf | 0.9973 | 0.1549 | -0.0015 | 0.0 | 0.9973 | 0.3801 | 3.0003 | 0.7944 | inf | inf | 0.9384 | 1.8311 | 0.0602 | 0.0 | 0.9384 | 0.6044 | 0.6323 | 0.0 | 0.9263 |
| 0.0002 | 8.3843 | 85000 | 0.5976 | 0.8276 | inf | inf | 0.9881 | 0.3526 | 0.0073 | 0.0 | 0.9881 | 0.9845 | 0.9995 | 6.0281 | 0.0807 | 0.8492 | inf | inf | 0.9974 | 0.1742 | -0.0006 | 0.0 | 0.9974 | 0.3848 | 2.9998 | 0.7948 | inf | inf | 0.9398 | 1.7613 | 0.0565 | 0.0 | 0.9398 | 0.6047 | 0.6299 | 0.0 | 0.9275 |
| 0.0002 | 8.6309 | 87500 | 0.5987 | 0.8282 | inf | inf | 0.9884 | 0.3752 | 0.0073 | 0.0 | 0.9884 | 0.9849 | 0.9991 | 6.1018 | 0.0814 | 0.8494 | inf | inf | 0.9968 | 0.1990 | 0.0010 | 0.0 | 0.9968 | 0.3832 | 3.0005 | 0.7955 | inf | inf | 0.9403 | 1.7714 | 0.0564 | 0.0 | 0.9403 | 0.6055 | 0.6306 | 0.0 | 0.9280 |
| 0.0002 | 8.8775 | 90000 | 0.5978 | 0.8275 | inf | inf | 0.9896 | 0.3037 | 0.0045 | 0.0 | 0.9896 | 0.9855 | 0.9995 | 5.8978 | 0.0787 | 0.8491 | inf | inf | 0.9980 | 0.1469 | -0.0024 | 0.0 | 0.9980 | 0.3839 | 3.0 | 0.7959 | inf | inf | 0.9399 | 1.7940 | 0.0573 | 0.0 | 0.9399 | 0.6054 | 0.6316 | 0.0 | 0.9292 |
| 0.0002 | 9.1241 | 92500 | 0.5980 | 0.8277 | inf | inf | 0.9880 | 0.3639 | 0.0067 | 0.0 | 0.9880 | 0.9851 | 0.9992 | 6.0297 | 0.0803 | 0.8500 | 2.48e+61 | 0.0280 | 0.9975 | 0.1631 | -0.0023 | 0.0 | 0.9975 | 0.3835 | 3.0003 | 0.7953 | inf | inf | 0.9397 | 1.7906 | 0.0573 | 0.0 | 0.9397 | 0.6053 | 0.6319 | 0.0 | 0.9277 |
| 0.0002 | 9.3707 | 95000 | 0.5981 | 0.8282 | inf | inf | 0.9872 | 0.3920 | 0.0080 | 0.0 | 0.9872 | 0.9850 | 0.9993 | 5.9969 | 0.0802 | 0.8501 | 2.12e+61 | inf | 0.9979 | 0.1466 | -0.0028 | 0.0 | 0.9979 | 0.3829 | 3.0002 | 0.7951 | inf | inf | 0.9406 | 1.7691 | 0.0564 | 0.0 | 0.9406 | 0.6056 | 0.6318 | 0.0 | 0.9279 |
| 0.0002 | 9.6173 | 97500 | 0.5980 | 0.8282 | inf | inf | 0.9869 | 0.4062 | 0.0085 | 0.0 | 0.9869 | 0.9849 | 0.9993 | 6.1170 | 0.0817 | 0.8499 | 2.21e+61 | inf | 0.9978 | 0.1543 | -0.0018 | 0.0 | 0.9978 | 0.3829 | 3.0002 | 0.7955 | inf | inf | 0.9394 | 1.8058 | 0.0576 | 0.0 | 0.9394 | 0.6054 | 0.6317 | 0.0 | 0.9265 |
| 0.0002 | 9.8639 | 100000 | 0.5981 | 0.8283 | inf | inf | 0.9871 | 0.4042 | 0.0085 | 0.0 | 0.9871 | 0.9847 | 0.9992 | 6.1064 | 0.0815 | 0.8496 | inf | inf | 0.9978 | 0.1547 | -0.0018 | 0.0 | 0.9978 | 0.3829 | 3.0001 | 0.7955 | inf | inf | 0.9394 | 1.8056 | 0.0577 | 0.0 | 0.9394 | 0.6055 | 0.6318 | 0.0 | 0.9267 |
Framework versions
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1