Local AI for Builders
Learn how to run, configure, and build applications on local LLMs using tools like Ollama, LM Studio, Open WebUI, and Continue. This free course is for developers who want to use AI without sending data to external services: for privacy, cost control, offline operation, or full infrastructure ownership. You will learn how local models work, how to choose the right model for your hardware and task, how to integrate local LLMs into applications via the OpenAI-compatible API, how to engineer prompts for open-source models, how to evaluate and compare model quality, and how to deploy a self-hosted AI stack for yourself or your team.
Why Run AI Locally and What It Takes
Understand why developers choose local AI, what hardware you need, and what to expect from running models on your own machine.
- 1. Why Run AI Locally: The Case for Local LLMs (Tutorial)
- 2. Hardware Requirements for Local AI (Tutorial)
- 3. Local AI Foundations Check (Quiz)
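A useful starting point for the hardware question is a back-of-the-envelope memory estimate: weight memory is roughly parameter count times bytes per weight, plus some headroom for the KV cache and runtime buffers. The sketch below uses a 20% overhead figure, which is an assumption for illustration, not a measured constant.

```python
def estimated_memory_gb(params_billion: float, bits_per_weight: int,
                        overhead: float = 0.2) -> float:
    """Rough memory estimate for running a model locally.

    params_billion:  model size in billions of parameters (e.g. 7 for a 7B model)
    bits_per_weight: weight precision (16 = fp16, 8 or 4 = common quantizations)
    overhead:        extra fraction for KV cache and buffers (assumed, not exact)
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * (1 + overhead) / 1e9

# A 7B model in fp16 lands around 16.8 GB; quantized to 4 bits, around 4.2 GB.
print(f"7B @ fp16:  {estimated_memory_gb(7, 16):.1f} GB")
print(f"7B @ 4-bit: {estimated_memory_gb(7, 4):.1f} GB")
```

This is why quantization matters so much for local AI: it is often the difference between a model fitting on a consumer GPU and not running at all.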
Running Models with Ollama and LM Studio
Get Ollama and LM Studio running, download models, and choose the right model for your hardware and task.
- 4. Getting Started with Ollama (Tutorial)
- 5. LM Studio: A GUI for Local Model Management (Tutorial)
- 6. Choosing the Right Local Model for Your Use Case (Tutorial)
- 7. Running Models Check (Quiz)
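Choosing a model for your machine mostly comes down to how much memory you can spare. The heuristic below is a hypothetical sketch, not official guidance from any project; the thresholds and the Ollama-style model tags are illustrative examples you should adjust to your own hardware and the models you actually use.

```python
def suggest_model(ram_gb: float) -> str:
    """Suggest a model tier that fits the available memory.

    Thresholds and tags are illustrative assumptions: a 4-bit 7-8B model
    needs roughly 5-6 GB, and a 4-bit ~70B model needs 40+ GB.
    """
    if ram_gb >= 48:
        return "llama3.1:70b"   # large model, strongest quality
    if ram_gb >= 8:
        return "llama3.1:8b"    # solid general-purpose default
    return "phi3:mini"          # small model for constrained machines

print(suggest_model(16))
```

On a 16 GB machine this picks the mid-tier option: large enough to be genuinely useful, small enough to leave memory for your editor and browser.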
Building Applications on Local LLMs
Connect your applications to local models via the OpenAI-compatible API, deploy Open WebUI, and use Continue for AI-assisted coding.
- 8. Integrating Local LLMs into Your Applications (Tutorial)
- 9. Open WebUI: A Self-Hosted Chat Interface (Tutorial)
- 10. Continue: Local AI for Your IDE (Tutorial)
- 11. Building Apps Check (Quiz)
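Both Ollama and LM Studio expose an OpenAI-compatible HTTP API, so application code targets the same `/chat/completions` shape regardless of backend. The sketch below builds such a request without sending it, so it runs without a server; the default base URLs (Ollama on port 11434, LM Studio on port 1234) match the projects' documented defaults, but verify them for your install.

```python
import json

# Ollama's OpenAI-compatible endpoint; LM Studio's default is port 1234.
OLLAMA_BASE = "http://localhost:11434/v1"

def build_chat_request(model: str, user_message: str,
                       base_url: str = OLLAMA_BASE):
    """Build (url, body) for an OpenAI-style chat completion. Not sent here."""
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "stream": False,
    })
    return url, body

url, body = build_chat_request("llama3.1:8b", "Hello!")
print(url)
# To actually send it (requires a running local server):
#   import urllib.request
#   req = urllib.request.Request(url, body.encode(),
#                                {"Content-Type": "application/json"})
#   resp = json.loads(urllib.request.urlopen(req).read())
#   print(resp["choices"][0]["message"]["content"])
```

Because the request shape matches the OpenAI API, you can also point an existing OpenAI client library at the local base URL instead of hand-rolling HTTP calls.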
Performance, Quality, and Reliability
Understand quantization, prompt open-source models effectively, and evaluate local model quality for your use case.
- 12. Quantization: The Technical Details (Tutorial)
- 13. Prompt Engineering for Open-Source Models (Tutorial)
- 14. Evaluating and Comparing Local Models (Tutorial)
- 15. Performance and Quality Check (Quiz)
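The core idea of quantization is simple: map floating-point weights to a small integer range and back, trading precision for memory. This toy round-trip shows the mechanism and the resulting error; real schemes such as GGUF's block quantization are considerably more sophisticated, so treat this as a minimal sketch of the concept only.

```python
def quantize(values, bits=8):
    """Symmetric round-to-nearest quantization: floats -> ints -> floats.

    qmax is the largest representable magnitude at the given bit width,
    and scale maps the largest input value onto that range.
    """
    qmax = 2 ** (bits - 1) - 1               # 127 for 8-bit, 7 for 4-bit
    scale = max(abs(v) for v in values) / qmax
    ints = [round(v / scale) for v in values]
    return [q * scale for q in ints], scale

weights = [0.12, -0.83, 0.41, -0.07, 0.99]
for bits in (8, 4):
    restored, _ = quantize(weights, bits)
    err = max(abs(a - b) for a, b in zip(weights, restored))
    print(f"{bits}-bit max error: {err:.4f}")
```

Dropping from 8 to 4 bits visibly increases the reconstruction error, which is exactly the quality-versus-memory trade-off you evaluate when comparing quantized builds of the same model.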
Production and Privacy Patterns
Deploy a self-hosted AI stack for your team, handle privacy and data correctly, and complete the capstone project.
- 16. Deploying Local AI to a Self-Hosted Server (Tutorial)
- 17. Privacy and Data Handling with Local AI (Tutorial)
- 18. Local AI for Builders: Capstone Project (Tutorial)
- 19. Production Patterns Check (Quiz)
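A common shape for a team deployment is Ollama serving models behind Open WebUI as the shared chat front end. The compose fragment below is a sketch: the image names, ports, and the `OLLAMA_BASE_URL` variable follow the projects' published documentation at the time of writing, but check the current docs (and add authentication and TLS) before deploying for real.

```yaml
# Minimal self-hosted stack: Ollama serving models, Open WebUI in front.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama:/root/.ollama          # persist downloaded models
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                   # UI at http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
volumes:
  ollama:
```

Keeping both services on one compose network means the UI reaches Ollama by service name, and nothing but the web port needs to be exposed outside the host.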