Luminum-v0.1-123B, quantized to the q4f16_1 format with MLC-LLM
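A minimal usage sketch, assuming the standard `mlc_llm` CLI is installed and that the weights can be pulled straight from this repo (the `HF://` path below mirrors the repo name and is an assumption, not tested here):

```shell
# Chat with the quantized weights directly from the Hugging Face repo;
# mlc_llm resolves HF:// URLs and caches the weights locally on first run.
mlc_llm chat HF://TNT3530/Luminum-v0.1-123B-q4f16_1-MLC
```

Note that a 123B model at q4f16_1 still requires substantial GPU memory; check the MLC-LLM documentation for multi-GPU options before running.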
Model tree for TNT3530/Luminum-v0.1-123B-q4f16_1-MLC
Base model: FluffyKaeloky/Luminum-v0.1-123B