
Building LLMs for Production

English
Paid

Course description

"Creating LLM for Production" is a practical 470-page guide (updated in October 2024) designed for developers and specialists who want to go beyond prototyping and build reliable, industry-ready applications based on large language models.

The book explains the fundamentals of how LLMs work and thoroughly examines key techniques: advanced prompting, Retrieval-Augmented Generation (RAG), model fine-tuning, evaluation methods, and deployment strategies. Readers gain access to interactive Colab notebooks, real code examples, and case studies demonstrating how to integrate LLMs into products and workflows in practice. Special attention is given to issues of security, monitoring, optimization, and cost reduction.
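The RAG chapters, for instance, walk through this retrieve-then-generate loop step by step. As a rough, hedged illustration of the pattern (not the book's own code), the sketch below uses a toy bag-of-words retriever and a placeholder call_llm stub; the book's tutorials instead use real vector embeddings, LangChain/LlamaIndex, and the GPT-3.5 API.

```python
# Minimal sketch of a retrieval-augmented generation (RAG) loop.
# The corpus, the bag-of-words scoring, and the call_llm stub are
# illustrative assumptions, not the book's implementation.
from collections import Counter
import math

def bow(text: str) -> Counter:
    """Tiny bag-of-words 'embedding' used only for this illustration."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by similarity to the query and keep the top k."""
    q = bow(query)
    return sorted(docs, key=lambda d: cosine(q, bow(d)), reverse=True)[:k]

def call_llm(prompt: str) -> str:
    """Placeholder for whatever completion API you use (e.g. the GPT-3.5 API from the tutorials)."""
    raise NotImplementedError("plug in your LLM provider here")

def rag_answer(query: str, docs: list[str]) -> str:
    # Ground the model's answer in the retrieved context.
    context = "\n".join(retrieve(query, docs))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)
```

In a production pipeline the bag-of-words scorer is replaced by vector embeddings and a vector store, which is the subject of the "Similarity Search and Vector Embeddings" chapter.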

Books

Read Book: Building LLMs for Production

1. Table of Contents
2. About The Book
3. Introduction
4. Why Prompt Engineering, Fine-Tuning, and RAG?
5. Coding Environment and Packages
6. A Brief History of Language Models
7. What are Large Language Models?
8. Building Blocks of LLMs
9. Tutorial: Translation with LLMs (GPT-3.5 API)
10. Tutorial: Control LLMs Output with Few-Shot Learning
11. Recap
12. Understanding Transformers
13. Transformer Model’s Design Choices
14. Transformer Architecture Optimization Techniques
15. The Generative Pre-trained Transformer (GPT) Architecture
16. Introduction to Large Multimodal Models
17. Proprietary vs. Open Models vs. Open-Source Language Models
18. Applications and Use-Cases of LLMs
19. Recap
20. Understanding Hallucinations and Bias
21. Reducing Hallucinations by Controlling LLM Outputs
22. Evaluating LLM Performance
23. Recap
24. Prompting and Prompt Engineering
25. Prompting Techniques
26. Prompt Injection and Security
27. Recap
28. Why RAG?
29. Building a Basic RAG Pipeline from Scratch
30. Recap
31. LLM Frameworks
32. LangChain Introduction
33. Tutorial 1: Building LLM-Powered Applications with LangChain
34. Tutorial 2: Building a News Articles Summarizer
35. LlamaIndex Introduction
36. LangChain vs. LlamaIndex vs. OpenAI Assistants
37. Recap
38. What are LangChain Prompt Templates
39. Few-Shot Prompts and Example Selectors
40. What are LangChain Chains
41. Tutorial 1: Managing Outputs with Output Parsers
42. Tutorial 2: Improving Our News Articles Summarizer
43. Tutorial 3: Creating Knowledge Graphs from Textual Data: Finding Hidden Connections
44. Recap
45. LangChain’s Indexes and Retrievers
46. Data Ingestion
47. Text Splitters
48. Similarity Search and Vector Embeddings
49. Tutorial 1: A Customer Support Q&A Chatbot
50. Tutorial 2: A YouTube Video Summarizer Using Whisper and LangChain
51. Tutorial 3: A Voice Assistant for Your Knowledge Base
52. Tutorial 4: Preventing Undesirable Outputs with the Self-Critique Chain
53. Tutorial 5: Preventing Undesirable Outputs from a Customer Service Chatbot
54. Recap
55. From Proof of Concept to Product: Challenges of RAG Systems
56. Advanced RAG Techniques with LlamaIndex
57. RAG - Metrics & Evaluation
58. LangChain LangSmith and LangChain Hub
59. Recap
60. What are Agents: Large Models as Reasoning Engines
61. An Overview of AutoGPT and BabyAGI
62. The Agent Simulation Projects in LangChain
63. Tutorial 1: Building Agents for Analysis Report Creation
64. Tutorial 2: Query and Summarize a DB with LlamaIndex
65. Tutorial 3: Building Agents with OpenAI Assistants
66. Tutorial 4: LangChain OpenGPT
67. Tutorial 5: Multimodal Financial Document Analysis from PDFs
68. Recap
69. Understanding Fine-Tuning
70. Low-Rank Adaptation (LoRA)
71. Tutorial 1: SFT with LoRA
72. Tutorial 2: Using SFT and LoRA for Financial Sentiment
73. Tutorial 3: Fine-Tuning a Cohere LLM with Medical Data
74. Reinforcement Learning from Human Feedback
75. Tutorial 4: Improving LLMs with RLHF
76. Recap
77. Model Distillation and Teacher-Student Models
78. LLM Deployment Optimization: Quantization, Pruning, and Speculative Decoding
79. Tutorial: Deploying a Quantized LLM on a CPU on Google Cloud Platform (GCP)
80. Deploying Open-Source LLMs on Cloud Providers
81. Recap
82. Conclusion
83. Further Reading and Courses


Similar courses

Build AI-Powered Apps – An AI Course for Developers

Sources: codewithmosh (Mosh Hamedani)
AI is everywhere - but can you really create applications with it? Most developers have tried ChatGPT. Some have even inserted pieces...
7 hours 3 minutes 31 seconds
Full-Stack Project with Claude Code

Sources: Mckay Wrigley (takeoff)
In this workshop, participants build an MVP clone of FigJam, a visual collaboration editor, step by step using Claude Code, Opus 4, Cursor IDE, and...
1 hour 12 minutes 14 seconds
The NotebookLM Guide: Your AI-Powered Productivity Assistant

Sources: zerotomastery.io
Learn to use NotebookLM from Google to simplify research, analyze content, and boost productivity. From automatic summaries to...
2 hours 3 minutes 22 seconds
How To Connect, Code & Debug Supabase With Bolt

Sources: newline (ex fullstack.io)
This workshop is a continuation of the course "Overnight Fullstack Applications". In the recording, you will learn how to connect your applications in Bolt...
42 minutes
Claude Code

Sources: Mckay Wrigley (takeoff)
Claude Code is a course that teaches how to use Anthropic's AI coding assistant directly in the terminal. It helps write...
2 hours 23 minutes 22 seconds