Instructions for using facebook/deit-tiny-patch16-224 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
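The model name encodes its input geometry: `patch16` means the image is cut into 16×16-pixel patches and `224` is the input resolution, so the transformer sees (224/16)² = 196 patch tokens plus a class token (distilled DeiT variants add a distillation token as well). A quick check of that arithmetic:

```python
# Derive the token sequence length implied by "patch16-224":
# a 224x224 image split into non-overlapping 16x16 patches.
image_size, patch_size = 224, 16

patches_per_side = image_size // patch_size  # patches along one edge
num_patches = patches_per_side ** 2          # total patch tokens
seq_len = num_patches + 1                    # plus one [CLS] token

print(patches_per_side, num_patches, seq_len)  # 14 196 197
```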
- Libraries
- Transformers
How to use facebook/deit-tiny-patch16-224 with Transformers:
Use a pipeline as a high-level helper:

```python
from transformers import pipeline

pipe = pipeline("image-classification", model="facebook/deit-tiny-patch16-224")
pipe("https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png")
```

Or load the model directly:

```python
from transformers import AutoImageProcessor, AutoModelForImageClassification

processor = AutoImageProcessor.from_pretrained("facebook/deit-tiny-patch16-224")
model = AutoModelForImageClassification.from_pretrained("facebook/deit-tiny-patch16-224")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
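When you load the model directly rather than through a pipeline, its classification head returns raw logits; turning them into a predicted label is a softmax followed by an argmax over the model's `id2label` mapping. A minimal sketch of that post-processing in plain Python (the logit values and three-class label mapping here are made up for illustration; the real DeiT head covers the 1,000 ImageNet classes):

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of raw scores."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits and label mapping for illustration only.
logits = [1.2, 3.4, 0.7]
id2label = {0: "tabby cat", 1: "macaw", 2: "golden retriever"}

probs = softmax(logits)
pred = max(range(len(probs)), key=probs.__getitem__)
print(id2label[pred], round(probs[pred], 3))
```

With a real model, the same step is `outputs.logits.argmax(-1)` fed through `model.config.id2label`.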