---
license: apache-2.0
language:
- he
library_name: transformers
tags:
- bert
---

> Update 2025-5-12: This model is `BEREL` version 1.0. We are now happy to provide a much improved [BEREL_3.0](https://huggingface.co/dicta-il/BEREL_3.0).

# Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language

When using BEREL, please reference:

Avi Shmidman, Joshua Guedalia, Shaltiel Shmidman, Cheyn Shmuel Shmidman, Eli Handel, Moshe Koppel, "Introducing BEREL: BERT Embeddings for Rabbinic-Encoded Language", Aug 2022 [arXiv:2208.01875]

1. Usage:

```python
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL')

# for evaluation, disable dropout
model.eval()
```
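Once the tokenizer and model are loaded, masked-token prediction works as with any BERT masked LM: place the tokenizer's mask token in a sentence, run a forward pass, and take the top-k logits at the masked position. A minimal sketch (the Hebrew example sentence is illustrative only, not from the BEREL paper):

```python
import torch
from transformers import AutoTokenizer, BertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained('dicta-il/BEREL')
model = BertForMaskedLM.from_pretrained('dicta-il/BEREL')
model.eval()

# A Hebrew sentence with one position masked out (illustrative example)
text = f'בראשית ברא אלהים את {tokenizer.mask_token}'
inputs = tokenizer(text, return_tensors='pt')

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the 5 most likely fillers
mask_idx = (inputs['input_ids'] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top5 = torch.topk(logits[0, mask_idx], k=5, dim=-1).indices[0]
predictions = [tokenizer.decode(t).strip() for t in top5]
print(predictions)
```

`torch.no_grad()` avoids building a gradient graph during inference; for batch scoring, the same pattern works with padded batches and a boolean mask over `mask_token_id`.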