# ailab-bio/PROTAC-Splitter-EncoderDecoder-lr_reduce-opt25-rand-smiles
This model is a fine-tuned version of seyonec/ChemBERTa-zinc-base-v1 on the ailab-bio/PROTAC-Splitter-Dataset dataset. It achieves the following results on the evaluation set:
- Loss: 0.3251
- All Ligands Equal: 0.5455
- E3 Tanimoto Similarity: 0.0
- E3 Graph Edit Distance Norm: inf
- E3 Heavy Atoms Difference Norm: 0.0123
- Linker Tanimoto Similarity: 0.0
- Tanimoto Similarity: 0.0
- E3 Valid: 0.9866
- Linker Heavy Atoms Difference Norm: -0.0013
- Num Fragments: 2.9997
- Heavy Atoms Difference: 5.7446
- E3 Has Attachment Point(s): 0.9866
- E3 Equal: 0.8076
- Reassembly: 0.5548
- Poi Has Attachment Point(s): 0.9458
- Poi Graph Edit Distance Norm: inf
- Linker Graph Edit Distance: inf
- Poi Heavy Atoms Difference Norm: 0.0510
- Linker Has Attachment Point(s): 0.9952
- Poi Equal: 0.7632
- E3 Heavy Atoms Difference: 0.4690
- Has All Attachment Points: 0.9866
- Linker Graph Edit Distance Norm: inf
- Has Three Substructures: 0.9990
- Linker Equal: 0.7856
- Poi Valid: 0.9458
- Valid: 0.9308
- Poi Tanimoto Similarity: 0.0
- Reassembly Nostereo: 0.5789
- Linker Valid: 0.9952
- Heavy Atoms Difference Norm: 0.0744
- Linker Heavy Atoms Difference: 0.2002
- Poi Heavy Atoms Difference: 1.7548
- Poi Graph Edit Distance: inf
- E3 Graph Edit Distance: inf
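
The fragment-level metrics above (Tanimoto similarity, heavy-atom difference, validity) are standard cheminformatics quantities computed on predicted versus reference SMILES for each substructure. The sketch below shows one common way to compute such metrics with RDKit; it is illustrative only, and the helper names, fingerprint settings, and invalid-SMILES handling are assumptions rather than the exact evaluation code behind this card.

```python
from rdkit import Chem
from rdkit.Chem import AllChem, DataStructs


def tanimoto_similarity(pred_smiles: str, ref_smiles: str) -> float:
    """Morgan-fingerprint Tanimoto similarity; returns 0.0 if either SMILES is invalid."""
    pred = Chem.MolFromSmiles(pred_smiles)
    ref = Chem.MolFromSmiles(ref_smiles)
    if pred is None or ref is None:
        return 0.0
    fp_pred = AllChem.GetMorganFingerprintAsBitVect(pred, radius=2, nBits=2048)
    fp_ref = AllChem.GetMorganFingerprintAsBitVect(ref, radius=2, nBits=2048)
    return DataStructs.TanimotoSimilarity(fp_pred, fp_ref)


def heavy_atom_difference(pred_smiles: str, ref_smiles: str) -> float:
    """Signed difference in heavy-atom counts (prediction minus reference)."""
    pred = Chem.MolFromSmiles(pred_smiles)
    ref = Chem.MolFromSmiles(ref_smiles)
    if pred is None or ref is None:
        # Invalid predictions are tracked separately via the "Valid" metrics.
        return float("nan")
    return pred.GetNumHeavyAtoms() - ref.GetNumHeavyAtoms()
```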
## Model description

More information needed
## Intended uses & limitations

More information needed
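
Pending more detail from the authors, the model name and evaluation metrics suggest the intended use is splitting a full PROTAC SMILES into its three substructures (POI ligand, linker, and E3 ligand). Below is a minimal inference sketch, assuming the checkpoint loads as a standard transformers `EncoderDecoderModel` with its own tokenizer and that the decoder emits dot-separated fragment SMILES (consistent with the "Num Fragments" metric of roughly 3); none of these details are confirmed by the card.

```python
from transformers import AutoTokenizer, EncoderDecoderModel

# Assumption: the repository ships both a tokenizer and an EncoderDecoderModel checkpoint.
model_id = "ailab-bio/PROTAC-Splitter-EncoderDecoder-lr_reduce-opt25-rand-smiles"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = EncoderDecoderModel.from_pretrained(model_id)

protac_smiles = "..."  # full PROTAC SMILES string goes here
inputs = tokenizer(protac_smiles, return_tensors="pt")

# Assumption: the output is a dot-separated SMILES with one fragment each for the
# POI ligand, linker, and E3 ligand.
generated = model.generate(**inputs, max_length=512, num_beams=5)
fragments = tokenizer.decode(generated[0], skip_special_tokens=True).split(".")
print(fragments)
```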
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: reduce_lr_on_plateau
- lr_scheduler_warmup_steps: 800
- training_steps: 10000
- mixed_precision_training: Native AMP
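
For reference, these settings map onto transformers training arguments roughly as shown below. This is a minimal sketch assuming the standard Trainer/Seq2SeqTrainer workflow; the output directory is a placeholder and the actual training script is not documented here.

```python
from transformers import Seq2SeqTrainingArguments

# The hyperparameters listed above expressed as transformers training arguments.
training_args = Seq2SeqTrainingArguments(
    output_dir="protac-splitter-encoder-decoder",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="reduce_lr_on_plateau",
    warmup_steps=800,
    max_steps=10_000,
    fp16=True,  # Native AMP mixed-precision training
)
```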
### Training results
| Training Loss | Epoch | Step | Validation Loss | All Ligands Equal | E3 Tanimoto Similarity | E3 Graph Edit Distance Norm | E3 Heavy Atoms Difference Norm | Linker Tanimoto Similarity | Tanimoto Similarity | E3 Valid | Linker Heavy Atoms Difference Norm | Num Fragments | Heavy Atoms Difference | E3 Has Attachment Point(s) | E3 Equal | Reassembly | Poi Has Attachment Point(s) | Poi Graph Edit Distance Norm | Linker Graph Edit Distance | Poi Heavy Atoms Difference Norm | Linker Has Attachment Point(s) | Poi Equal | E3 Heavy Atoms Difference | Has All Attachment Points | Linker Graph Edit Distance Norm | Has Three Substructures | Linker Equal | Poi Valid | Valid | Poi Tanimoto Similarity | Reassembly Nostereo | Linker Valid | Heavy Atoms Difference Norm | Linker Heavy Atoms Difference | Poi Heavy Atoms Difference | Poi Graph Edit Distance | E3 Graph Edit Distance |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0107 | 0.4932 | 5000 | 0.3008 | 0.5011 | 0.0 | inf | 0.0105 | 0.0 | 0.0 | 0.9859 | 0.0173 | 3.0002 | 4.6779 | 0.9859 | 0.7894 | 0.5097 | 0.9624 | inf | inf | 0.0317 | 0.9978 | 0.7466 | 0.4474 | 0.9880 | inf | 0.9993 | 0.7242 | 0.9624 | 0.9481 | 0.0 | 0.5343 | 0.9978 | 0.0617 | 0.5834 | 1.1262 | inf | inf |
| 0.0058 | 0.7398 | 7500 | 0.3187 | 0.5325 | 0.0 | inf | 0.0172 | 0.0 | 0.0 | 0.9835 | 0.0039 | 2.9999 | 4.7889 | 0.9835 | 0.8015 | 0.5402 | 0.9616 | inf | 59313031161473085686992099539504630098717768674496038276431872.0000 | 0.0264 | 0.9941 | 0.7586 | 0.6172 | 0.9898 | inf | 0.9992 | 0.7643 | 0.9616 | 0.9417 | 0.0 | 0.5643 | 0.9941 | 0.0626 | 0.2995 | 0.9886 | inf | inf |
| 0.0049 | 0.9864 | 10000 | 0.3251 | 0.5455 | 0.0 | inf | 0.0123 | 0.0 | 0.0 | 0.9866 | -0.0013 | 2.9997 | 5.7446 | 0.9866 | 0.8076 | 0.5548 | 0.9458 | inf | inf | 0.0510 | 0.9952 | 0.7632 | 0.4690 | 0.9866 | inf | 0.9990 | 0.7856 | 0.9458 | 0.9308 | 0.0 | 0.5789 | 0.9952 | 0.0744 | 0.2002 | 1.7548 | inf | inf |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1