This is a Mixture-of-Experts (MoE) model that combines domain-agnostic fine-tuned models derived from the base Mistral model.
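As a rough illustration of the MoE idea behind this model (not the actual routing code, which is not published here), the sketch below shows top-k gating: a gate scores each expert for a given input, the top-k experts run, and their outputs are mixed with renormalized gate probabilities. All names and shapes are illustrative assumptions.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of gate logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(x, experts, gate_weights, top_k=2):
    """Route input x to the top_k experts chosen by the gate and
    mix their outputs with renormalized gate probabilities.
    `experts` is a list of callables; `gate_weights` holds one
    weight vector per expert (illustrative linear gate)."""
    # One gate logit per expert: a simple dot product with the input.
    logits = [sum(w * xi for w, xi in zip(gw, x)) for gw in gate_weights]
    probs = softmax(logits)
    # Select the top_k experts by gate probability.
    top = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:top_k]
    norm = sum(probs[i] for i in top)
    # Weighted mix of only the selected experts' outputs.
    out = [0.0] * len(x)
    for i in top:
        y = experts[i](x)
        w = probs[i] / norm
        out = [o + w * yi for o, yi in zip(out, y)]
    return out
```

With only k experts active per token, a 4x7B mixture keeps per-token compute close to a single 7B expert while storing all experts' parameters (hence the roughly 24B total).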

Downloads last month: 4
Format: Safetensors
Model size: 24B params
Tensor type: BF16

Model tree for collaiborate-tech/CollAIborate4x7B:
Quantizations: 1 model