Deep Learning for Coders with fastai and PyTorch

by Jeremy Howard and Sylvain Gugger

Book Details

Book Title: Deep Learning for Coders with fastai and PyTorch: AI Applications Without a PhD

Author: Jeremy Howard, Sylvain Gugger

Publisher: O'Reilly Media, Inc.

Publication Date: 2020

ISBN: 9781492045526

Number of Pages: 1109

Language: English

Format: PDF

File Size: 10.3 MB

Subject: Deep Learning

Table of Contents

  • Preface
  • Foreword
  • Part I. Deep Learning in Practice
    • Chapter 1. Your Deep Learning Journey
      • Deep Learning Is for Everyone
      • Neural Networks: A Brief History
      • Who We Are
      • How to Learn Deep Learning
      • The Software: PyTorch, fastai, and Jupyter (And Why It Doesn’t Matter)
      • Your First Model
      • Deep Learning Is Not Just for Image Classification
      • Validation Sets and Test Sets
      • A Choose Your Own Adventure Moment
      • Questionnaire
    • Chapter 2. From Model to Production
      • The Practice of Deep Learning
      • Gathering Data
      • From Data to DataLoaders
      • Training Your Model, and Using It to Clean Your Data
      • Turning Your Model into an Online Application
      • How to Avoid Disaster
      • Get Writing!
      • Questionnaire
    • Chapter 3. Data Ethics
      • Key Examples for Data Ethics
      • Integrating Machine Learning with Product Design
      • Topics in Data Ethics
      • Identifying and Addressing Ethical Issues
      • Role of Policy
      • Conclusion
      • Questionnaire
    • Deep Learning in Practice: That’s a Wrap!
  • Part II. Understanding fastai’s Applications
    • Chapter 4. Under the Hood: Training a Digit Classifier
      • Pixels: The Foundations of Computer Vision
      • First Try: Pixel Similarity
      • Computing Metrics Using Broadcasting
      • Stochastic Gradient Descent
      • The MNIST Loss Function
      • Putting It All Together
      • Adding a Nonlinearity
      • Jargon Recap
      • Questionnaire
    • Chapter 5. Image Classification
      • From Dogs and Cats to Pet Breeds
      • Presizing
      • Cross-Entropy Loss
      • Model Interpretation
      • Improving Our Model
      • Conclusion
      • Questionnaire
    • Chapter 6. Other Computer Vision Problems
      • Multi-Label Classification
      • Regression
      • Conclusion
      • Questionnaire
    • Chapter 7. Training a State-of-the-Art Model
      • Imagenette
      • Normalization
      • Progressive Resizing
      • Test Time Augmentation
      • Mixup
      • Label Smoothing
      • Conclusion
      • Questionnaire
    • Chapter 8. Collaborative Filtering Deep Dive
      • A First Look at the Data
      • Learning the Latent Factors
      • Creating the DataLoaders
      • Collaborative Filtering from Scratch
      • Interpreting Embeddings and Biases
      • Bootstrapping a Collaborative Filtering Model
      • Deep Learning for Collaborative Filtering
      • Conclusion
      • Questionnaire
    • Chapter 9. Tabular Modeling Deep Dive
      • Categorical Embeddings
      • Beyond Deep Learning
      • The Dataset
      • Decision Trees
      • Random Forests
      • Model Interpretation
      • Extrapolation and Neural Networks
      • Ensembling
      • Conclusion
      • Questionnaire
    • Chapter 10. NLP Deep Dive: RNNs
      • Text Preprocessing
      • Training a Text Classifier
      • Disinformation and Language Models
      • Conclusion
      • Questionnaire
    • Chapter 11. Data Munging with fastai’s Mid-Level API
      • Going Deeper into fastai’s Layered API
      • TfmdLists and Datasets: Transformed Collections
      • Applying the Mid-Level Data API: SiamesePair
      • Conclusion
      • Questionnaire
    • Understanding fastai’s Applications: Wrap Up
  • Part III. Foundations of Deep Learning
    • Chapter 12. A Language Model from Scratch
      • The Data
      • Our First Language Model from Scratch
      • Improving the RNN
      • Multilayer RNNs
      • LSTM
      • Regularizing an LSTM
      • Conclusion
      • Questionnaire
    • Chapter 13. Convolutional Neural Networks
      • The Magic of Convolutions
      • Our First Convolutional Neural Network
      • Color Images
      • Improving Training Stability
      • Conclusion
      • Questionnaire
    • Chapter 14. ResNets
      • Going Back to Imagenette
      • Building a Modern CNN: ResNet
      • Conclusion
      • Questionnaire
    • Chapter 15. Application Architectures Deep Dive
      • Computer Vision
      • Natural Language Processing
      • Tabular
      • Conclusion
      • Questionnaire
    • Chapter 16. The Training Process
      • Establishing a Baseline
      • A Generic Optimizer
      • Momentum
      • RMSProp
      • Adam
      • Decoupled Weight Decay
      • Callbacks
      • Conclusion
      • Questionnaire
    • Foundations of Deep Learning: Wrap Up
  • Part IV. Deep Learning from Scratch
    • Chapter 17. A Neural Net from the Foundations
      • Building a Neural Net Layer from Scratch
      • The Forward and Backward Passes
      • Conclusion
      • Questionnaire
    • Chapter 18. CNN Interpretation with CAM
      • CAM and Hooks
      • Gradient CAM
      • Conclusion
      • Questionnaire
    • Chapter 19. A fastai Learner from Scratch
      • Data
      • Module and Parameter
      • Loss
      • Learner
      • Conclusion
      • Questionnaire
    • Chapter 20. Concluding Thoughts
  • Appendices
    • A. Creating a Blog
    • B. Data Project Checklist
  • Index