This repository contains quantized GGUFs of Cerebras MiniMax-M2.5-REAP-172B-A10B, which is in turn a pruned (REAP) version of MiniMax M2.5.
No imatrix was used during quantization.
Available quantizations
- 4-bit
- 5-bit
- 6-bit
Model tree for wimmmm/MiniMax-M2.5-REAP-172B-A10B-GGUF
Base model: MiniMaxAI/MiniMax-M2.5