magnum-v4-12b-MLC
An MLC-compiled build of magnum-v4-12b for use with web-llm.
Chat config: https://huggingface.co/oopus/magnum-v4-12b-MLC/blob/main/mlc-chat-config.json
WebGPU library: https://huggingface.co/oopus/magnum-v4-12b-MLC/blob/main/magnum-v4-12b-q4f16_1-webgpu.wasm
How to use it: https://github.com/mlc-ai/web-llm/blob/767e1100b0d850b6157ef1ef6a01137508458ff8/examples/get-started/src/get_started.ts#L31
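As a minimal sketch of the approach in the linked example, this model can be registered as a custom entry in a web-llm `AppConfig` and then loaded with `CreateMLCEngine`. The `model_id` below is an assumption derived from this repository's file names (any quantization-matching ID works, as long as it is consistent), and `model_lib` points at the WebGPU wasm above via a `resolve/` URL so it can be fetched directly. This must run in a browser with WebGPU support.

```typescript
import { CreateMLCEngine, type AppConfig } from "@mlc-ai/web-llm";

// Register this repo as a custom model. The model_id is an assumed name,
// chosen here to match the q4f16_1 quantization of the wasm library.
const appConfig: AppConfig = {
  model_list: [
    {
      model: "https://huggingface.co/oopus/magnum-v4-12b-MLC",
      model_id: "magnum-v4-12b-q4f16_1-MLC",
      model_lib:
        "https://huggingface.co/oopus/magnum-v4-12b-MLC/resolve/main/magnum-v4-12b-q4f16_1-webgpu.wasm",
    },
  ],
};

async function main() {
  // Downloads the weights and compiles the WebGPU kernels on first load.
  const engine = await CreateMLCEngine("magnum-v4-12b-q4f16_1-MLC", {
    appConfig,
  });

  // Standard OpenAI-style chat completion through web-llm.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(reply.choices[0]?.message.content);
}

main();
```

Since this runs entirely in the browser, nothing is sent to a server after the initial model download; the wasm library and weights are cached for subsequent visits.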
Thanks to anthracite-org for their amazing model: https://huggingface.co/anthracite-org/magnum-v4-12b