Fundamentals of Robust Machine Learning

by Resve A. Saleh, Sohaib Majzoub, A. K. Md. Ehsanes Saleh

Book Details

Book Title: Fundamentals of Robust Machine Learning
Authors: Resve A. Saleh, Sohaib Majzoub, A. K. Md. Ehsanes Saleh
Publisher: Wiley
Publication Date: 2025
ISBN: 9781394294374
Number of Pages: 409
Language: English
Format: PDF
File Size: 3.6 MB
Subject: Artificial Intelligence; Machine Learning; Robustness

Table of Contents

  • Cover
  • Table of Contents
  • Title Page
  • Copyright
  • Dedication
  • Preface
  • About the Companion Website
  • Chapter 1. Introduction
  • 1.1 Defining Outliers
  • 1.2 Overview of the Book
  • 1.3 What Is Robust Machine Learning?
  • 1.4 Robustness of the Median
  • 1.5 L1 and L2 Norms
  • 1.6 Review of Gaussian Distribution
  • 1.7 Unsupervised Learning Case Study
  • 1.8 Creating Synthetic Data for Clustering
  • 1.9 Clustering Algorithms
  • 1.10 Importance of Robust Clustering
  • 1.11 Summary
  • Problems
  • References
  • Notes
  • Chapter 2. Robust Linear Regression
  • 2.1 Introduction
  • 2.2 Supervised Learning
  • 2.3 Linear Regression
  • 2.4 Importance of Residuals
  • 2.5 Estimation Background
  • 2.6 M‐Estimation
  • 2.7 Least Squares Estimation (LSE)
  • 2.8 Least Absolute Deviation (LAD)
  • 2.9 Comparison of LSE and LAD
  • 2.10 Huber's Method
  • 2.11 Summary
  • Problems
  • References
  • Chapter 3. The Log‐Cosh Loss Function
  • 3.1 Introduction
  • 3.2 An Intuitive View of Log‐Cosh
  • 3.3 Hyperbolic Functions
  • 3.4 M‐Estimation
  • 3.5 Deriving the Distribution for Log‐Cosh
  • 3.6 Standard Errors for Robust Estimators
  • 3.7 Statistical Properties of Log‐Cosh Loss
  • 3.8 A General Log‐Cosh Loss Function
  • 3.9 Summary
  • Problems
  • References
  • Notes
  • Chapter 4. Outlier Detection, Metrics, and Standardization
  • 4.1 Introduction
  • 4.2 Effect of Outliers
  • 4.3 Outlier Diagnosis
  • 4.4 Outlier Detection
  • 4.5 Outlier Removal
  • 4.6 Regression‐Based Outlier Detection
  • 4.7 Regression‐Based Outlier Removal
  • 4.8 Regression Metrics with Outliers
  • 4.9 Dataset Standardization
  • 4.10 Summary
  • Problems
  • References
  • Notes
  • Chapter 5. Robustness of Penalty Estimators
  • 5.1 Introduction
  • 5.2 Penalty Functions
  • 5.3 Ridge Penalty
  • 5.4 LASSO Penalty
  • 5.5 Effect of Penalty Functions
  • 5.6 Penalty Functions with Outliers
  • 5.7 Ridge Traces
  • 5.8 Elastic Net (Enet) Penalty
  • 5.9 Adaptive LASSO (aLASSO) Penalty
  • 5.10 Penalty Effects on Variance and Bias
  • 5.11 Variable Importance
  • 5.12 Summary
  • Problems
  • References
  • Notes
  • Chapter 6. Robust Regularized Models
  • 6.1 Introduction
  • 6.2 Overfitting and Underfitting
  • 6.3 The Bias–Variance Trade‐Off
  • 6.4 Regularization with Ridge
  • 6.5 Generalization Using Robust Estimators
  • 6.6 Robust Generalization and Regularization
  • 6.7 Model Complexity
  • 6.8 Summary
  • Problems
  • References
  • Notes
  • Chapter 7. Quantile Regression Using Log‐Cosh
  • 7.1 Introduction
  • 7.2 Understanding Quantile Regression
  • 7.3 The Crossing Problem
  • 7.4 Standard Quantile Loss Function
  • 7.5 Smooth Regression Quantiles (SMRQ)
  • 7.6 Evaluation of Quantile Methods
  • 7.7 Selection of Robustness Coefficient
  • 7.8 Maximum‐Likelihood Procedure for SMRQ
  • 7.9 Standard Error Computation
  • 7.10 Summary
  • Problems
  • References
  • Chapter 8. Robust Binary Classification
  • 8.1 Introduction
  • 8.2 Binary Classification Problem
  • 8.3 The Cross‐Entropy (CE) Loss
  • 8.4 The Log‐Cosh (LC) Loss Function
  • 8.5 Algorithms for Logistic Regression
  • 8.6 Example: Motor Trend Cars
  • 8.7 Regularization of Logistic Regression
  • 8.8 Example: Circular Dataset
  • 8.9 Outlier Detection
  • 8.10 Robustness of Binary Classifiers
  • 8.11 Summary
  • Problems
  • Reference
  • Notes
  • Chapter 9. Neural Networks Using Log‐Cosh
  • 9.1 Introduction
  • 9.2 A Brief History of Neural Networks
  • 9.3 Defining Neural Networks
  • 9.4 Training of Neural Networks
  • 9.5 Forward and Backward Propagation
  • 9.6 Cross‐Entropy and Log‐Cosh Algorithms
  • 9.7 Example: Circular Dataset
  • 9.8 Classification Metrics and Outliers
  • 9.9 Summary
  • Problems
  • References
  • Notes
  • Chapter 10. Multi‐class Classification and Adam Optimization
  • 10.1 Introduction
  • 10.2 Multi‐class Classification
  • 10.3 Example: MNIST Dataset
  • 10.4 Optimization of Neural Networks
  • 10.5 Summary
  • Problems
  • References
  • Notes
  • Chapter 11. Anomaly Detection and Evaluation Metrics
  • 11.1 Introduction
  • 11.2 Anomaly Detection Methods
  • 11.3 Anomaly Detection Using MADmax
  • 11.4 Qualitative Evaluation Methods
  • 11.5 Quantitative Evaluation Methods
  • 11.6 Summary
  • Problems
  • Reference
  • Notes
  • Chapter 12. Case Studies in Data Science
  • 12.1 Introduction
  • 12.2 Example: Boston Housing Dataset
  • 12.3 Example: Titanic Dataset
  • 12.4 Application to Explainable Artificial Intelligence (XAI)
  • 12.5 Time Series Example: Climate Change
  • 12.6 Summary and Conclusions
  • Problems
  • References
  • Notes
  • Index
  • End User License Agreement