Deep Learning with JAX, First Edition

by Grigory Sapunov

Book Details

Book Title: Deep Learning with JAX
Edition: 1
Author: Grigory Sapunov
Publisher: Manning Publications (Shelter Island, NY)
Publication Date: 2024
ISBN: 9781633438880
Number of Pages: 410
Language: English
Format: PDF
File Size: 3 MB
Subject: Computers > Cybernetics: Artificial Intelligence

Table of Contents

  • Deep Learning with JAX
  • brief contents
  • contents
  • preface
  • acknowledgments
  • about this book
  • about the author
  • about the cover illustration
  • Part 1
    • Chapter 1: When and why to use JAX
      • 1.1 Reasons to use JAX
      • 1.2 How is JAX different from NumPy?
      • 1.3 How is JAX different from TensorFlow and PyTorch?
    • Chapter 2: Your first program in JAX
      • 2.1 A toy ML problem: Classifying handwritten digits
      • 2.2 An overview of a JAX deep learning project
      • 2.3 Loading and preparing the dataset
      • 2.4 A simple neural network in JAX
      • 2.5 vmap: Auto-vectorizing calculations to work with batches
      • 2.6 Autodiff: How to calculate gradients without knowing about derivatives
      • 2.7 JIT: Compiling your code to make it faster
      • 2.8 Saving and deploying the model
      • 2.9 Pure functions and composable transformations: Why are they important?
  • Part 2
    • Chapter 3: Working with arrays
      • 3.1 Image processing with NumPy arrays
      • 3.2 Arrays in JAX
      • 3.3 Differences from NumPy
      • 3.4 High-level and low-level interfaces: jax.numpy and jax.lax
    • Chapter 4: Calculating gradients
      • 4.1 Different ways of getting derivatives
      • 4.2 Calculating gradients with autodiff
      • 4.3 Forward- and reverse-mode autodiff
    • Chapter 5: Compiling your code
      • 5.1 Using compilation
      • 5.2 JIT internals
      • 5.3 JIT limitations
    • Chapter 6: Vectorizing your code
      • 6.1 Different ways to vectorize a function
      • 6.2 Controlling vmap() behavior
      • 6.3 Real-life use cases for vmap()
    • Chapter 7: Parallelizing your computations
      • 7.1 Parallelizing computations with pmap()
      • 7.2 Controlling pmap() behavior
      • 7.3 Data-parallel neural network training example
      • 7.4 Using multihost configurations
    • Chapter 8: Using tensor sharding
      • 8.1 Basics of tensor sharding
      • 8.2 MLP with tensor sharding
    • Chapter 9: Random numbers in JAX
      • 9.1 Generating random data
      • 9.2 Differences with NumPy
      • 9.3 Generating random numbers in real-life applications
    • Chapter 10: Working with pytrees
      • 10.1 Representing complex data structures as pytrees
      • 10.2 Functions for working with pytrees
      • 10.3 Creating custom pytree nodes
  • Part 3
    • Chapter 11: Higher-level neural network libraries
      • 11.1 MNIST image classification using an MLP
      • 11.2 Image classification using a ResNet
      • 11.3 Using the Hugging Face ecosystem
    • Chapter 12: Other members of the JAX ecosystem
      • 12.1 Deep learning ecosystem
      • 12.2 Machine learning modules
      • 12.3 JAX modules for other fields
  • Appendix A: Installing JAX
  • Appendix B: Using Google Colab
  • Appendix C: Using Google Cloud TPUs
  • Appendix D: Experimental parallelization
  • Index