LLMs in Production

by Christopher Brousseau, Matt Sharp

Artificial Intelligence

Book Details

  • Book Title: LLMs in Production
  • Authors: Christopher Brousseau, Matt Sharp
  • Publisher: Manning
  • City: Shelter Island, NY
  • Publication Date: 2024
  • ISBN: 9781633437203
  • Number of Pages: 456
  • Language: English
  • Format: PDF
  • File Size: 4.8 MB
  • Subject: Large Language Models (LLMs)

Table of Contents

  • brief contents
  • contents
  • foreword
  • preface
  • acknowledgments
  • about the book
  • Who should read this book
  • How this book is organized
  • About the code
  • liveBook Discussion Forum
  • about the authors
  • about the cover illustration
  • Chapter 1: Words’ awakening: Why large language models have captured attention
  • 1.1 Large language models accelerating communication
  • 1.2 Navigating the build-and-buy decision with LLMs
  • 1.3 Debunking myths
  • Summary
  • Chapter 2: Large language models: A deep dive into language modeling
  • 2.1 Language modeling
  • 2.2 Language modeling techniques
  • 2.3 Attention is all you need
  • 2.4 Really big transformers
  • Summary
  • Chapter 3: Large language model operations: Building a platform for LLMs
  • 3.1 Introduction to large language model operations
  • 3.2 Operations challenges with large language models
  • 3.3 LLMOps essentials
  • 3.4 LLM operations infrastructure
  • Summary
  • Chapter 4: Data engineering for large language models: Setting up for success
  • 4.1 Models are the foundation
  • 4.2 Evaluating LLMs
  • 4.3 Data for LLMs
  • 4.4 Text processors
  • 4.5 Preparing a Slack dataset
  • Summary
  • Chapter 5: Training large language models: How to generate the generator
  • 5.1 Multi-GPU environments
  • 5.2 Basic training techniques
  • 5.3 Advanced training techniques
  • 5.4 Training tips and tricks
  • Summary
  • Chapter 6: Large language model services: A practical guide
  • 6.1 Creating an LLM service
  • 6.2 Setting up infrastructure
  • 6.3 Production challenges
  • 6.4 Deploying to the edge
  • Summary
  • Chapter 7: Prompt engineering: Becoming an LLM whisperer
  • 7.1 Prompting your model
  • 7.2 Prompt engineering basics
  • 7.3 Prompt engineering tooling
  • 7.4 Advanced prompt engineering techniques
  • Summary
  • Chapter 8: Large language model applications: Building an interactive experience
  • 8.1 Building an application
  • 8.2 Edge applications
  • 8.3 LLM agents
  • Summary
  • Chapter 9: Creating an LLM project: Reimplementing Llama 3
  • 9.1 Implementing Meta’s Llama
  • 9.2 Simple Llama
  • 9.3 Making it better
  • 9.4 Deploy to a Hugging Face Hub Space
  • Summary
  • Chapter 10: Creating a coding copilot project: This would have helped you earlier
  • 10.1 Our model
  • 10.2 Data is king
  • 10.3 Build the VS Code extension
  • 10.4 Lessons learned and next steps
  • Summary
  • Chapter 11: Deploying an LLM on a Raspberry Pi: How low can you go?
  • 11.1 Setting up your Raspberry Pi
  • 11.2 Preparing the model
  • 11.3 Serving the model
  • 11.4 Improvements
  • Summary
  • Chapter 12: Production, an ever-changing landscape: Things are just getting started
  • 12.1 A thousand-foot view
  • 12.2 The future of LLMs
  • 12.3 Final thoughts
  • Summary
  • Appendix A: History of linguistics
  • A.1 Ancient linguistics
  • A.2 Medieval linguistics
  • A.3 Renaissance and early modern linguistics
  • A.4 Early 20th-century linguistics
  • A.5 Mid-20th century and modern linguistics
  • Appendix B: Reinforcement learning with human feedback
  • Appendix C: Multimodal latent spaces
  • Index