---
license: apache-2.0
language:
  - he
library_name: transformers
tags:
  - bert
---

Update 2025-05-12: This model is BEREL version 1.0. We are now happy to provide a much-improved BEREL_3.0.

# Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language

When using BEREL, please reference:

Avi Shmidman, Joshua Guedalia, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Eli Handel, Moshe Koppel, "Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language", Aug 2022 [arXiv:2208.01875](https://arxiv.org/abs/2208.01875)

## 1. Usage

```python
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL')

# for evaluation, disable dropout
model.eval()
```
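
As a minimal illustrative sketch (not taken from the model card), the snippet below shows how the loaded model can be queried for masked-token prediction. The example sentence is an arbitrary placeholder, and the top-5 decoding logic is standard `transformers`/PyTorch usage rather than anything BEREL-specific.

```python
import torch
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL')
model.eval()

# Placeholder Hebrew sentence with one masked position; replace with your own text.
text = f'בראשית ברא אלהים את {tokenizer.mask_token}'

inputs = tokenizer(text, return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and print the top-5 predicted tokens for it.
mask_index = (inputs['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_index].topk(5, dim=-1).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```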