Update README.md
README.md (CHANGED)
@@ -7,7 +7,7 @@ tags:
 - bert
 ---
 
-> Update
+> Update 2025-5-12: This model is `BEREL` version 1.0. We are now happy to provide a much improved [BEREL_3.0](https://huggingface.co/dicta-il/BEREL_3.0).
 
 
 # Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language
@@ -30,11 +30,3 @@ model = BertForMaskedLM.from_pretrained('dicta-il/BEREL')
 # for evaluation, disable dropout
 model.eval()
 ```
-
-> NOTE: This code will **not** work and provide bad results if you use `BertTokenizer`. Please use `AutoTokenizer` or `BertTokenizerFast`.
-
-2. Demo site:
-You can experiment with the model in a GUI interface here: https://dicta-bert-demo.netlify.app/?genre=rabbinic
-- The main part of the GUI consists of word buttons visualizing the tokenization of the sentences. Clicking on a button masks it, and then three BEREL word predictions are shown. Clicking on that bubble expands it to 10 predictions; alternatively, ctrl-clicking on that initial bubble expands to 30 predictions.
-- Ctrl-clicking adjacent word buttons combines them into a single token for the mask.
-- The edit box on top contains the input sentence; this can be modified at will, and the word-buttons will adjust as relevant.
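For readers following the removed instructions above, here is a minimal sketch of the loading-and-prediction pattern they describe, assuming the `dicta-il/BEREL` checkpoint on the Hugging Face Hub and a current `transformers`/`torch` install; the sample sentence and the top-3 cutoff are illustrative placeholders, not taken from the README:

```python
import torch
from transformers import AutoTokenizer, BertForMaskedLM

# Per the NOTE above: load the tokenizer via AutoTokenizer (or BertTokenizerFast),
# not BertTokenizer.
tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL')
model.eval()  # for evaluation, disable dropout

# Illustrative: mask one token and inspect the top-3 predictions, roughly what
# the demo GUI shows when a word button is clicked.
text = f"a Rabbinic Hebrew sentence with one {tokenizer.mask_token} token"
inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take the three highest-scoring vocabulary items.
mask_pos = (inputs['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top3 = logits[0, mask_pos[0]].topk(3).indices
print([tokenizer.decode([i]) for i in top3.tolist()])
```

The first three lines correspond to the README's own snippet (context lines in the hunk above); the masked-prediction part is only an assumed illustration of what the linked demo site does interactively.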