Local LLMs via Ollama & LM Studio - The Practical Guide

3h 52m 28s
English
Paid

Unlock the power of local language models with this practical guide to running AI models directly on your computer. Discover the advantages of local AI solutions when privacy, cost, offline operation, and flexible customization are your priorities.

Course Overview

Embark on a journey to master the use of local AI models such as Llama by Meta, Gemma by Google, and DeepSeek. This course will show you how to operate these models on your laptop without relying on cloud services, subscriptions, or risking data privacy.

Why Choose Local and Open LLMs?

The dominance of cloud services can overshadow the benefits of local LLMs. Here's why they can be a better option:

  • No subscriptions - Access powerful models without any fees
  • 100% privacy - Keep all your data on your personal device
  • Offline capability - Use AI even when not connected to the internet
  • Vendor independence - Explore a fast-evolving ecosystem of open-source models
  • Cutting-edge performance - Local models are competing at the forefront of AI technology

What You Will Learn

Open LLMs

  • Where to find open LLMs, criteria for selecting them, and their significance

Technical Requirements

  • Understand the hardware necessary for running models locally

Model Quantization

  • Techniques for running resource-intensive AI models on standard computers
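
As a rough illustration (a sketch, not the course's own material): a model's weight-storage footprint is roughly parameter count times bytes per parameter, and quantization shrinks the bytes-per-parameter factor. A minimal estimate for a hypothetical 7B-parameter model, ignoring context/KV-cache overhead:

```python
def model_size_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate weight-storage size in decimal GB (ignores runtime overhead)."""
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total / 1e9

# A hypothetical 7B-parameter model at different precisions:
fp16 = model_size_gb(7, 16)  # ~14 GB of weights - workstation territory
q8   = model_size_gb(7, 8)   # ~7 GB
q4   = model_size_gb(7, 4)   # ~3.5 GB - fits on many consumer machines
print(f"FP16: {fp16:.1f} GB, Q8: {q8:.1f} GB, Q4: {q4:.1f} GB")
```

This is why 4-bit quantized variants of mid-sized models run on ordinary laptops while their full-precision originals do not.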

LM Studio

  • Guide to installation, setup, and use of various AI models

Ollama

  • Manage LLMs efficiently through the terminal or API
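
To give a flavor of what that looks like (an illustrative sketch, not course code): everyday Ollama use runs through commands such as `ollama pull` and `ollama run`, while programs talk to the local server, which listens on port 11434 by default. The snippet below builds a request for Ollama's `/api/generate` endpoint; the model name is a placeholder, and the line that actually sends the request is commented out so the sketch does not require a running server:

```python
import json
import urllib.request

# Payload for Ollama's /api/generate endpoint (stream=False -> one JSON response).
payload = {
    "model": "llama3.2",  # placeholder: any model you have pulled locally
    "prompt": "Explain quantization in one sentence.",
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # Ollama's default local address
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment when an Ollama server is running locally:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```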

Hands-On Practice

  • Engage in practical applications like image processing, document summarization, text generation, and more
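
For a taste of what the multimodal exercises involve (an illustrative sketch; the field names follow Ollama's API, and the model name and image bytes are placeholders): vision-capable models accept images as base64-encoded strings sent alongside the text prompt:

```python
import base64
import json

# In practice you would read real image bytes: open("photo.png", "rb").read()
image_bytes = b"\x89PNG\r\n\x1a\n"  # placeholder bytes standing in for an image file

payload = {
    "model": "llava",  # placeholder: any locally available vision model
    "prompt": "What text appears in this image?",
    "images": [base64.b64encode(image_bytes).decode("ascii")],
    "stream": False,
}

print(json.dumps(payload)[:60])
```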

Project Integration

  • Learn how to incorporate AI models into your projects, complete with API and automation guidance
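
As an illustration of what such integration can look like (a hedged sketch, not the course's code): LM Studio's local server speaks the OpenAI-compatible chat-completions format, by default at `http://localhost:1234/v1`. Building such a request in plain Python, with the actual send commented out so the snippet runs without a live server:

```python
import json
import urllib.request

# OpenAI-style chat payload; LM Studio's local server accepts this format.
payload = {
    "model": "local-model",  # LM Studio serves whichever model you have loaded
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize why local LLMs matter."},
    ],
    "temperature": 0.7,
}

req = urllib.request.Request(
    "http://localhost:1234/v1/chat/completions",  # LM Studio's default server URL
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment with the LM Studio local server running:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

Because the format matches OpenAI's, existing OpenAI client libraries can usually be pointed at the local base URL with no other changes.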

Who Should Enroll?

  • Developers looking to embed AI into their applications
  • Technology enthusiasts and students aiming to grasp the latest trends
  • Individuals prioritizing data privacy and control
  • Anyone interested in avoiding subscription costs while exploring the full capacity of modern LLMs

Join this course to harness local language models and revolutionize your AI skills. Embrace the future with complete control and privacy.

About the Author: Academind Pro (Maximilian Schwarzmüller)


Academind is the teaching brand of Maximilian Schwarzmüller (Max) and Manuel Lorenz, two German developers whose Udemy catalog has become one of the largest instructor presences on that platform. Max in particular is widely cited as one of the clearest teachers in the JavaScript framework landscape: his Angular, React, Vue, and Node.js courses have collectively taught millions of students.

The Academind Pro platform extends beyond Udemy with deeper, more comprehensive courses aimed at developers building real applications rather than picking up syntax. Course material covers the full modern web stack: React (including Next.js), Vue, Angular, Node.js, NestJS, TypeScript, Docker, AWS, React Native, Flutter, and the broader full-stack JavaScript ecosystem.

The CourseFlix listing for this source carries over 25 Academind Pro courses spanning that range. The material is paid, with per-course pricing on the original platform. Courses follow Max's signature thorough, build-an-application-with-me style: long-form, deeply project-based, and continuously updated as the underlying frameworks evolve.

Watch Online: 54 lessons


You can watch up to 10 minutes for free. Subscribe to unlock all 54 lessons in this course and access 10,000+ hours of premium content across all courses.

All Course Lessons (54)
1. Welcome To The Course! (Demo) - 00:19
2. What Exactly Are "Open LLMs"? - 06:28
3. Why Would You Want To Run Open LLMs Locally? - 06:53
4. Popular Open LLMs - Some Examples - 03:44
5. Where To Find Open LLMs? - 04:48
6. Running LLMs Locally - Available Options - 07:18
7. Check The Model Licenses! - 04:05
8. Module Introduction - 01:21
9. LLM Hardware Requirements - First Steps - 04:22
10. Deriving Hardware Requirements From Model Parameters - 05:35
11. Quantization To The Rescue! - 06:51
12. Does It Run On Your Machine? - 05:51
13. Module Introduction - 02:04
14. Running Locally vs Remotely - 01:09
15. Installing & Using LM Studio - 03:10
16. Finding, Downloading & Activating Open LLMs - 09:05
17. Using the LM Studio Chat Interface - 04:54
18. Working with System Prompts & Presets - 03:27
19. Managing Chats - 02:33
20. Power User Features For Managing Models & Chats - 06:29
21. Leveraging Multimodal Models & Extracting Content From Images (OCR) - 02:49
22. Analyzing & Summarizing PDF Documents - 03:28
23. Onwards To More Advanced Settings - 01:53
24. Understanding Temperature, top_k & top_p - 06:33
25. Controlling Temperature, top_k & top_p in LM Studio - 04:46
26. Managing the Underlying Runtime & Hardware Configuration - 04:18
27. Managing Context Length - 05:22
28. Using Flash Attention - 05:09
29. Working With Structured Outputs - 05:30
30. Using Local LLMs For Code Generation - 02:36
31. Content Generation & Few Shot Prompting (Prompt Engineering) - 05:22
32. Onwards To Programmatic Use - 02:26
33. LM Studio & Its OpenAI Compatibility - 06:01
34. More Code Examples! - 05:05
35. Diving Deeper Into The LM Studio APIs - 02:11
36. Module Introduction - 01:42
37. Installing & Starting Ollama - 02:09
38. Finding Usable Open Models - 02:57
39. Running Open LLMs Locally via Ollama - 07:44
40. Adding a GUI with Open WebUI - 02:13
41. Dealing with Multiline Messages & Image Input (Multimodality) - 02:39
42. Inspecting Models & Extracting Model Information - 03:32
43. Editing System Messages & Model Parameters - 06:02
44. Saving & Loading Sessions and Models - 03:36
45. Managing Models - 05:43
46. Creating Model Blueprints via Modelfiles - 06:23
47. Creating Models From Modelfiles - 03:27
48. Making Sense of Model Templates - 06:40
49. Building a Model From Scratch From a GGUF File - 06:38
50. Getting Started with the Ollama Server (API) - 02:13
51. Exploring the Ollama API & Programmatic Model Access - 05:19
52. Getting Structured Output - 02:57
53. More Code Examples! - 04:54
54. Roundup - 01:45
Unlock unlimited learning

Get instant access to all 54 lessons in this course, plus thousands of other premium courses. One subscription, unlimited knowledge.

Learn more about subscription


Frequently asked questions

What is Local LLMs via Ollama & LM Studio - The Practical Guide about?
Unlock the power of local language models with the practical guide to running AI models directly on your computer. Discover the advantages of using local AI solutions when privacy, cost, offline operation, and flexible customization are…
Who teaches this course?
It is taught by Academind Pro (Maximilian Schwarzmüller). You can find more courses by this instructor on the corresponding source page.
How long is the course?
It contains 54 lessons with a total runtime of 3 hours 52 minutes. Every lesson is available to watch online at your own pace.
Is it free to watch?
It is part of CourseFlix's premium catalog. A subscription unlocks the full video player; the course description, table of contents, and preview information are available to everyone.
Where can I watch it online?
The course is available to watch online on CourseFlix at https://courseflix.net/course/local-llms-via-ollama-lm-studio-the-practical-guide. The page hosts every lesson with the integrated video player; no download is required.