ShipItAndPray/mcp-turboquant

LLM quantization via tool calls. Converts models to GGUF, GPTQ, and AWQ formats, recommends quantization settings, evaluates output quality, and pushes results to the Hugging Face Hub.

Category: Data Science Tools
Language: Python
License: MIT
Source: https://github.com/ShipItAndPray/mcp-turboquant
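
To expose the server to an MCP client, a configuration entry along these lines would register it (a minimal sketch: the `command` and module name are assumptions, not taken from the repository — check the project README for the actual entry point and any required environment variables such as a Hugging Face token):

```json
{
  "mcpServers": {
    "turboquant": {
      "command": "python",
      "args": ["-m", "mcp_turboquant"],
      "env": {
        "HF_TOKEN": "<your-hugging-face-token>"
      }
    }
  }
}
```

Once registered, the client can discover the server's quantization and upload tools through the standard MCP tool-listing handshake.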
