Unable to load the model
#1 by pyRis · opened
While using the suggested method to use this model:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("yutaozhu94/INTERS-LLaMA-7b-chat")
model = AutoModelForCausalLM.from_pretrained("yutaozhu94/INTERS-LLaMA-7b-chat")
```
I am getting this error:
```
OSError: Unable to load weights from pytorch checkpoint file for '/projects/xxyy/pyris/HF_TEMP/hub/models--yutaozhu94--INTERS-LLaMA-7b-chat/snapshots/a16b15d7dd2a793ef43aca8603aa07e252aef7fb/pytorch_model-00001.bin' at '/projects/xxyy/pyris/HF_TEMP/hub/models--yutaozhu94--INTERS-LLaMA-7b-chat/snapshots/a16b15d7dd2a793ef43aca8603aa07e252aef7fb/pytorch_model-00001.bin'. If you tried to load a PyTorch model from a TF 2.0 checkpoint, please set from_tf=True.
```
This is after I set the tokenizer to load from the original Llama-7b-chat-hf model, because the files in this repo include no tokenizer.
Never mind, I just needed to set the `weights_only` flag.
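For readers hitting the same error: the flag referred to here is most likely `torch.load`'s `weights_only` argument, which newer PyTorch versions default to `True` and which restricts unpickling to plain tensors and containers. How best to pass it through `transformers` depends on your version, so the sketch below only illustrates the flag itself on a toy checkpoint (the file name and tensor are made up, not the INTERS model):

```python
# Toy illustration of torch.load's weights_only flag (hypothetical
# checkpoint, not the INTERS-LLaMA-7b-chat weights).
import os
import tempfile

import torch

# Save a minimal state-dict-style checkpoint.
ckpt = {"w": torch.zeros(2, 2)}
path = os.path.join(tempfile.mkdtemp(), "toy_model.bin")
torch.save(ckpt, path)

# weights_only=True allows only tensors and basic containers;
# checkpoints pickled with custom Python objects may instead need
# weights_only=False (only safe for checkpoints you trust).
loaded = torch.load(path, weights_only=True)
print(loaded["w"].shape)
```

If the checkpoint was pickled with arbitrary Python objects, flipping the flag the other way (`weights_only=False`) is what unblocks loading, at the cost of running the pickle machinery on an untrusted file.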
pyRis changed discussion status to closed