Machine and Deep Learning Using MATLAB - Kamal I. M. Al-Malah

Machine and Deep Learning Using MATLAB

Algorithms and Tools for Scientists and Engineers
Book | Hardcover
592 pages
2023
John Wiley & Sons Inc (publisher)
978-1-394-20908-8 (ISBN)
CHF 246.15 incl. VAT
An in-depth resource covering machine and deep learning methods using MATLAB tools and algorithms, providing insights into algorithmic decision-making processes.

Machine and Deep Learning Using MATLAB introduces early-career professionals to the power of MATLAB for exploring machine and deep learning applications by explaining the relevant MATLAB tool or app and how it is used for a given method or collection of methods. Each tool's properties, in terms of input and output arguments, are explained; its limitations or applicability are indicated in accompanying text or a table; and a complete running example is shown with all the needed MATLAB command-prompt code.
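To give a flavor of the command-prompt examples described here, the following minimal sketch (illustrative only, not taken from the book) fits and evaluates a k-nearest-neighbor classifier on MATLAB's built-in Fisher iris data; it assumes the Statistics and Machine Learning Toolbox is installed.

```matlab
% Illustrative sketch (not from the book): a complete, minimal command-prompt
% example in the style described above. Assumes the Statistics and Machine
% Learning Toolbox and MATLAB's built-in Fisher iris dataset.
load fisheriris                               % predictors: meas, response: species
rng(1)                                        % reproducible train/test split
cv  = cvpartition(species, 'HoldOut', 0.3);   % 70/30 hold-out partition
Xtr = meas(training(cv), :);  ytr = species(training(cv));
Xte = meas(test(cv), :);      yte = species(test(cv));

mdl  = fitcknn(Xtr, ytr, 'NumNeighbors', 5);  % k-NN classification model
yhat = predict(mdl, Xte);                     % predicted class labels
acc  = mean(strcmp(yhat, yte))                % test-set accuracy
```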

The text also presents the results, in the form of figures or tables, in parallel with the given MATLAB code, and the written MATLAB code can later be used as a template for tackling new cases or datasets. Throughout, the text features worked examples in each chapter for self-study, with an accompanying website providing solutions and coding samples. Highlighted notes draw the reader's attention to critical points or issues.
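As a sketch of how results can appear as a figure and a table alongside the code (again illustrative rather than taken from the book, and assuming the Statistics and Machine Learning Toolbox), a cross-validated version of the same k-NN model might be summarized as follows:

```matlab
% Illustrative sketch: presenting results as a figure (confusion chart) and a
% table (summary metrics) in parallel with the fitting code.
load fisheriris
mdl  = fitcknn(meas, species, 'NumNeighbors', 5, 'KFold', 5);  % 5-fold CV model
yhat = kfoldPredict(mdl);                                      % cross-validated predictions

cm = confusionchart(categorical(species), categorical(yhat)); % figure: confusion matrix
cm.Title = '5-fold cross-validated k-NN on Fisher iris';

cvLoss  = kfoldLoss(mdl);                                      % misclassification rate
results = table(cvLoss, 1 - cvLoss, ...
    'VariableNames', {'CV_Loss', 'CV_Accuracy'})               % table: summary metrics
```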

Readers will also find information on:

  • Numeric data acquisition and analysis, in the form of applying computational algorithms to predict patterns in numeric data (clustering, or unsupervised learning); a minimal clustering sketch follows this list
  • Relationships between predictors and a response variable (supervised learning), categorically subdivided into classification (discrete response) and regression (continuous response)
  • Image acquisition and analysis, in the form of applying one of the neural networks and estimating net accuracy, net loss, and/or RMSE for the successive training, validation, and testing steps
  • Retraining and network creation for image labeling, object identification, regression classification, and text recognition
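For the unsupervised topics in the first item above, a minimal clustering sketch (illustrative only, again assuming the Statistics and Machine Learning Toolbox) might look like this:

```matlab
% Illustrative sketch: k-means clustering of numeric data with a
% silhouette-based choice of k and a cluster-quality plot.
load fisheriris                         % use only the numeric measurements, meas
X = zscore(meas);                       % standardize the predictors

rng(1)                                  % reproducible centroid initialization
eva = evalclusters(X, 'kmeans', 'silhouette', 'KList', 1:6);   % choose k by silhouette
k   = eva.OptimalK;

[idx, C] = kmeans(X, k, 'Replicates', 5);   % cluster assignments and centroids

figure
silhouette(X, idx)                      % per-observation cluster quality
title(sprintf('k-means clustering with k = %d', k))
```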

Machine and Deep Learning Using MATLAB is a useful and highly comprehensive resource for professionals, advanced students, and researchers in engineering and scientific fields who have some familiarity with MATLAB and wish to gain mastery of the software and its numerous applications.

Kamal I. M. Al-Malah received his PhD from Oregon State University in 1993. He has served as a Professor of Chemical Engineering in Jordan and the Gulf countries, as well as former Chairman of the Chemical Engineering Department at the University of Hail in Saudi Arabia. Professor Al-Malah is an expert in both Aspen Plus® and MATLAB® applications and has created a bundle of Windows-based software for engineering applications.

Preface xiii

About the Companion Website xvii

1 Unsupervised Machine Learning (ML) Techniques 1

Introduction 1

Selection of the Right Algorithm in ML 2

Classical Multidimensional Scaling of Predictors Data 2

Principal Component Analysis (PCA) 6

k-Means Clustering 13

Distance Metrics: Locations of Cluster Centroids 13

Replications 14

Gaussian Mixture Model (GMM) Clustering 15

Optimum Number of GMM Clusters 17

Observations and Clusters Visualization 18

Evaluating Cluster Quality 21

Silhouette Plots 22

Hierarchical Clustering 23

Step 1 -- Determine Hierarchical Structure 23

Step 2 -- Divide Hierarchical Tree into Clusters 25

PCA and Clustering: Wine Quality 27

Feature Selection Using Laplacian (fsulaplacian) for Unsupervised Learning 35

CHW 1.1 The Iris Flower Features Data 37

CHW 1.2 The Ionosphere Data Features 38

CHW 1.3 The Small Car Data 39

CHW 1.4 Seeds Features Data 40

2 ML Supervised Learning: Classification Models 42

Fitting Data Using Different Classification Models 42

Customizing a Model 43

Creating Training and Test Datasets 43

Predicting the Response 45

Evaluating the Classification Model 45

KNN Model for All Categorical or All Numeric Data Type 47

KNN Model: Heart Disease Numeric Data 48

Viewing the Fitting Model Properties 50

The Fitting Model: Number of Neighbors and Weighting Factor 51

The Cost Penalty of the Fitting Model 52

KNN Model: Red Wine Data 55

Using MATLAB Classification Learner 57

Binary Decision Tree Model for Multiclass Classification of All Data Types 68

Classification Tree Model: Heart Disease Numeric Data Types 70

Classification Tree Model: Heart Disease All Predictor Data Types 72

Naive Bayes Classification Model for All Data Types 74

Fitting Heart Disease Numeric Data to Naive Bayes Model 75

Fitting Heart Disease All Data Types to Naive Bayes Model 77

Discriminant Analysis (DA) Classifier for Numeric Predictors Only 79

Discriminant Analysis (DA): Heart Disease Numeric Predictors 82

Support Vector Machine (SVM) Classification Model for All Data Types 84

Properties of SVM Model 85

SVM Classification Model: Heart Disease Numeric Data Types 87

SVM Classification Model: Heart Disease All Data Types 90

Multiclass Support Vector Machine (fitcecoc) Model 92

Multiclass Support Vector Machines Model: Red Wine Data 95

Binary Linear Classifier (fitclinear) to High-Dimensional Data 98

CHW 2.1 Mushroom Edibility Data 100

CHW 2.2 1994 Adult Census Income Data 100

CHW 2.3 White Wine Classification 101

CHW 2.4 Cardiac Arrhythmia Data 102

CHW 2.5 Breast Cancer Diagnosis 102

3 Methods of Improving ML Predictive Models 103

Accuracy and Robustness of Predictive Models 103

Evaluating a Model: Cross-Validation 104

Cross-Validation Tune-up Parameters 105

Partition with K-Fold: Heart Disease Data Classification 106

Reducing Predictors: Feature Transformation and Selection 108

Factor Analysis 110

Feature Transformation and Factor Analysis: Heart Disease Data 113

Feature Selection 115

Feature Selection Using predictorImportance Function: Heart Disease Data 116

Sequential Feature Selection (SFS): sequentialfs Function with Model Error Handler 118

Accommodating Categorical Data: Creating Dummy Variables 121

Feature Selection with Categorical Heart Disease Data 122

Ensemble Learning 126

Creating Ensembles: Heart Disease Data 130

Ensemble Learning: Wine Quality Classification 131

Improving fitcensemble Predictive Model: Abalone Age Prediction 132

Improving fitctree Predictive Model with Feature Selection (FS): Credit Ratings Data 134

Improving fitctree Predictive Model with Feature Transformation (FT): Credit Ratings Data 135

Using MATLAB Regression Learner 136

Feature Selection and Feature Transformation Using Regression Learner App 145

Feature Selection Using Neighborhood Component Analysis (NCA) for Regression: Big Car Data 146

CHW 3.1 The Ionosphere Data 148

CHW 3.2 Sonar Dataset 149

CHW 3.3 White Wine Classification 150

CHW 3.4 Small Car Data (Regression Case) 152

4 Methods of ML Linear Regression 153

Introduction 153

Linear Regression Models 154

Fitting Linear Regression Models Using fitlm Function 155

How to Organize the Data? 155

Results Visualization: Big Car Data 162

Fitting Linear Regression Models Using fitglm Function 164

Nonparametric Regression Models 166

fitrtree Nonparametric Regression Model: Big Car Data 167

Support Vector Machine, fitrsvm, Nonparametric Regression Model: Big Car Data 170

Nonparametric Regression Model: Gaussian Process Regression (GPR) 172

Regularized Parametric Linear Regression 176

Ridge Linear Regression: The Penalty Term 176

Fitting Ridge Regression Models 177

Predicting Response Using Ridge Regression Models 178

Determining Ridge Regression Parameter, λ 179

The Ridge Regression Model: Big Car Data 179

The Ridge Regression Model with Optimum λ: Big Car Data 181

Regularized Parametric Linear Regression Model: Lasso 183

Stepwise Parametric Linear Regression 186

Fitting Stepwise Linear Regression 187

How to Specify stepwiselm Model? 187

Stepwise Linear Regression Model: Big Car Data 188

CHW 4.1 Boston House Price 192

CHW 4.2 The Forest Fires Data 193

CHW 4.3 The Parkinson’s Disease Telemonitoring Data 194

CHW 4.4 The Car Fuel Economy Data 195

5 Neural Networks 197

Introduction 197

Feed-Forward Neural Networks 198

Feed-Forward Neural Network Classification 199

Feed-Forward Neural Network Regression 200

Numeric Data: Dummy Variables 200

Neural Network Pattern Recognition (nprtool) Application 201

Command-Based Feed-Forward Neural Network Classification: Heart Data 210

Neural Network Regression (nftool) 214

Command-Based Feed-Forward Neural Network Regression: Big Car Data 223

Training the Neural Network Regression Model Using fitrnet Function: Big Car Data 226

Finding the Optimum Regularization Strength for Neural Network Using Cross-Validation: Big Car Data 229

Custom Hyperparameter Optimization in Neural Network Regression: Big Car Data 231

CHW 5.1 Mushroom Edibility Data 233

CHW 5.2 1994 Adult Census Income Data 233

CHW 5.3 Breast Cancer Diagnosis 234

CHW 5.4 Small Car Data (Regression Case) 234

CHW 5.5 Boston House Price 235

6 Pretrained Neural Networks: Transfer Learning 237

Deep Learning: Image Networks 237

Data Stores in MATLAB 241

Image and Augmented Image Datastores 243

Accessing an Image File 246

Retraining: Transfer Learning for Image Recognition 247

Convolutional Neural Network (CNN) Layers: Channels and Activations 256

Convolution 2-D Layer Features via Activations 258

Extraction and Visualization of Activations 261

A 2-D (or 2-D Grouped) Convolutional Layer 264

Features Extraction for Machine Learning 267

Image Features in Pretrained Convolutional Neural Networks (CNNs) 268

Classification with Machine Learning 268

Feature Extraction for Machine Learning: Flowers 269

Pattern Recognition Network Generation 271

Machine Learning Feature Extraction: Spectrograms 275

Network Object Prediction Explainers 278

Occlusion Sensitivity 278

imageLIME Features Explainer 282

gradCAM Features Explainer 284

HCW 6.1 CNN Retraining for Round Worms Alive or Dead Prediction 286

HCW 6.2 CNN Retraining for Food Images Prediction 286

HCW 6.3 CNN Retraining for Merchandise Data Prediction 287

HCW 6.4 CNN Retraining for Musical Instrument Spectrograms Prediction 288

HCW 6.5 CNN Retraining for Fruit/Vegetable Varieties Prediction 289

7 A Convolutional Neural Network (CNN) Architecture and Training 290

A Simple CNN Architecture: The Land Satellite Images 291

Displaying Satellite Images 291

Training Options 294

Mini Batches 295

Learning Rates 296

Gradient Clipping 297

Algorithms 298

Training a CNN for Landcover Dataset 299

Layers and Filters 302

Filters in Convolution Layers 307

Viewing Filters: AlexNet Filters 308

Validation Data 311

Using shuffle Function 316

Improving Network Performance 319

Training Algorithm Options 319

Training Data 319

Architecture 320

Image Augmentation: The Flowers Dataset 322

Directed Acyclic Graphs Networks 329

Deep Network Designer (DND) 333

Semantic Segmentation 342

Analyze Training Data for Semantic Segmentation 343

Create a Semantic Segmentation Network 345

Train and Test the Semantic Segmentation Network 350

HCW 7.1 CNN Creation for Round Worms Alive or Dead Prediction 356

HCW 7.2 CNN Creation for Food Images Prediction 357

HCW 7.3 CNN Creation for Merchandise Data Prediction 358

HCW 7.4 CNN Creation for Musical Instrument Spectrograms Prediction 358

HCW 7.5 CNN Creation for Chest X-ray Prediction 359

HCW 7.6 Semantic Segmentation Network for CamVid Dataset 359

8 Regression Classification: Object Detection 361

Preparing Data for Regression 361

Modification of CNN Architecture from Classification to Regression 361

Root-Mean-Square Error 364

AlexNet-Like CNN for Regression: Hand-Written Synthetic Digit Images 364

A New CNN for Regression: Hand-Written Synthetic Digit Images 370

Deep Network Designer (DND) for Regression 374

Loading Image Data 375

Generating Training Data 375

Creating a Network Architecture 376

Importing Data 378

Training the Network 378

Test Network 383

YOLO Object Detectors 384

Object Detection Using YOLO v4 386

COCO-Based Creation of a Pretrained YOLO v4 Object Detector 387

Fine-Tuning of a Pretrained YOLO v4 Object Detector 389

Evaluating an Object Detector 394

Object Detection Using R-CNN Algorithms 396

R-CNN 397

Fast R-CNN 397

Faster R-CNN 398

Transfer Learning (Re-Training) 399

R-CNN Creation and Training 399

Fast R-CNN Creation and Training 403

Faster R-CNN Creation and Training 408

evaluateDetectionPrecision Function for Precision Metric 413

evaluateDetectionMissRate for Miss Rate Metric 417

HCW 8.1 Testing yolov4ObjectDetector and fasterRCNN Object Detector 424

HCW 8.2 Creation of Two CNN-based yolov4ObjectDetectors 424

HCW 8.3 Creation of GoogleNet-Based Fast R-CNN Object Detector 425

HCW 8.4 Creation of a GoogleNet-Based Faster R-CNN Object Detector 426

HCW 8.5 Calculation of Average Precision and Miss Rate Using GoogleNet-Based Faster R-CNN Object Detector 427

HCW 8.6 Calculation of Average Precision and Miss Rate Using GoogleNet-Based yolov4 Object Detector 427

HCW 8.7 Faster R-CNN-Based Car Objects Prediction and Calculation of Average Precision for Training and Test Data 427

9 Recurrent Neural Network (RNN) 430

Long Short-Term Memory (LSTM) and BiLSTM Network 430

Train LSTM RNN Network for Sequence Classification 437

Improving LSTM RNN Performance 441

Sequence Length 441

Classifying Categorical Sequences 445

Sequence-to-Sequence Regression Using Deep Learning: Turbo Fan Data 446

Classify Text Data Using Deep Learning: Factory Equipment Failure Text Analysis -- 1 453

Classify Text Data Using Deep Learning: Factory Equipment Failure Text Analysis -- 2 462

Word-by-Word Text Generation Using Deep Learning -- 1 465

Word-by-Word Text Generation Using Deep Learning -- 2 473

Train Network for Time Series Forecasting Using Deep Network Designer (DND) 475

Train Network with Numeric Features 486

HCW 9.1 Text Classification: Factory Equipment Failure Text Analysis 491

HCW 9.2 Text Classification: Sentiment Labeled Sentences Data Set 492

HCW 9.3 Text Classification: Netflix Titles Data Set 492

HCW 9.4 Text Regression: Video Game Titles Data Set 492

HCW 9.5 Multivariate Classification: Mill Data Set 493

HCW 9.6 Word-by-Word Text Generation Using Deep Learning 494

10 Image/Video-Based Apps 495

Image Labeler (IL) App 495

Creating ROI Labels 498

Creating Scene Labels 499

Label Ground Truth 500

Export Labeled Ground Truth 501

Video Labeler (VL) App: Ground Truth Data Creation, Training, and Prediction 502

Ground Truth Labeler (GTL) App 513

Running/Walking Classification with Video Clips using LSTM 520

Experiment Manager (EM) App 526

Image Batch Processor (IBP) App 533

HCW 10.1 Cat Dog Video Labeling, Training, and Prediction -- 1 537

HCW 10.2 Cat Dog Video Labeling, Training, and Prediction -- 2 537

HCW 10.3 EM Hyperparameters of CNN Retraining for Merchandise Data Prediction 538

HCW 10.4 EM Hyperparameters of CNN Retraining for Round Worms Alive or Dead Prediction 539

HCW 10.5 EM Hyperparameters of CNN Retraining for Food Images Prediction 540

Appendix A Useful MATLAB Functions 543

A.1 Data Transfer from an External Source into MATLAB 543

A.2 Data Import Wizard 543

A.3 Table Operations 544

A.4 Table Statistical Analysis 547

A.5 Access to Table Variables (Column Titles) 547

A.6 Merging Tables with Mixed Columns and Rows 547

A.7 Data Plotting 548

A.8 Data Normalization 549

A.9 How to Scale Numeric Data Columns to Vary Between 0 and 1 549

A.10 Random Split of a Matrix into a Training and Test Set 550

A.11 Removal of NaN Values from a Matrix 550

A.12 How to Calculate the Percent of Truly Judged Class Type Cases for a Binary Class Response 550

A.13 Error Function m-file 551

A.14 Conversion of Categorical into Numeric Dummy Matrix 552

A.15 evaluateFit2 Function 553

A.16 showActivationsForChannel Function 554

A.17 upsampLowRes Function 555

A.18A preprocessData Function 555

A.18B preprocessData2 Function 555

A.19 processTurboFanDataTrain Function 556

A.20 processTurboFanDataTest Function 556

A.21 preprocessText Function 557

A.22 documentGenerationDatastore Function 557

A.23 subset Function for an Image Data Store Partition 560

Index 561

Publication date 2023
Place of publication New York
Language English
Weight 1370 g
Subject areas Computer Science > Theory / Studies > Artificial Intelligence / Robotics
Natural Sciences > Chemistry
Economics
ISBN-10 1-394-20908-8 / 1394209088
ISBN-13 978-1-394-20908-8 / 9781394209088
Condition New