Instructions for using DunnBC22/codet5-small-Generate_Docstrings_for_Python with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use DunnBC22/codet5-small-Generate_Docstrings_for_Python with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("DunnBC22/codet5-small-Generate_Docstrings_for_Python")
model = AutoModelForSeq2SeqLM.from_pretrained("DunnBC22/codet5-small-Generate_Docstrings_for_Python")
```
- Notebooks
- Google Colab
- Kaggle
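Once the tokenizer and model are loaded, generating a docstring follows the standard seq2seq pattern: tokenize the source code, call `generate`, and decode the result. A minimal sketch (the sample function and generation parameters are illustrative assumptions, not the prompt format the model was necessarily fine-tuned on):

```python
# Sketch: generate a docstring for a Python function with this model.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "DunnBC22/codet5-small-Generate_Docstrings_for_Python"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Example input: a function whose docstring we want generated (assumed format).
code = "def add(a, b):\n    return a + b"

inputs = tokenizer(code, return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_length=64)
docstring = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(docstring)
```

The `max_length=64` cap is an arbitrary choice here; adjust it to the docstring length you expect.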