The model in unit1 dummy_agent_library is not a chat model

#128
by RohitVDev - opened

Trying to run this snippet, I get the following error:

Bad request: {'message': "The requested model 'meta-llama/Llama-4-Scout-17B-16E-Instruct' is not a chat model.", 'type': 'invalid_request_error', 'param': 'model', 'code': 'model_not_supported'}

I made do with another model but just wanted to point this out.

import os

from dotenv import load_dotenv
from huggingface_hub import InferenceClient

# Load HF_KEY from the local .env file
load_dotenv()

client = InferenceClient(model="meta-llama/Llama-4-Scout-17B-16E-Instruct", api_key=os.getenv("HF_KEY"))

# Query the model through the OpenAI-compatible chat completion interface
output = client.chat.completions.create(
    messages=[
        {"role": "user", "content": "The capital of France is"},
    ],
    stream=False,
    max_tokens=20,
)

print(output.choices[0].message.content)

Use this model instead:

meta-llama/Meta-Llama-3-8B-Instruct
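For anyone hitting the same error, here is the original snippet with the suggested model swapped in. This is a minimal sketch assuming the same HF_KEY variable in your .env file and that the model is served by your configured inference provider:

import os

from dotenv import load_dotenv
from huggingface_hub import InferenceClient

# Load HF_KEY from the local .env file
load_dotenv()

# Meta-Llama-3-8B-Instruct supports the chat completion task
client = InferenceClient(model="meta-llama/Meta-Llama-3-8B-Instruct", api_key=os.getenv("HF_KEY"))

output = client.chat.completions.create(
    messages=[
        {"role": "user", "content": "The capital of France is"},
    ],
    stream=False,
    max_tokens=20,
)

print(output.choices[0].message.content)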
