#4: Minimax M2.7 NVFP4 (🔥 4, opened 8 days ago by mtcl)
#3: Add vLLM to supported inference engines (opened 21 days ago by wzhao18)
#2: Add vLLM as one of the supported inference engines (opened 22 days ago by wangshangsam)
#1: Update vLLM description (opened 24 days ago by jeeejeee)