MLC LLM
Machine-learning compiler framework that compiles large language models to run natively on a wide range of hardware, from phones and browsers to servers. Enables deploying LLMs to edge devices and browsers without any server infrastructure, with support for iOS, Android, WebGPU, and many other platforms. Running models locally cuts latency dramatically and eliminates the cost of cloud API calls. Open source, with an active community. A notable step toward bringing AI to the edge.