
Artificial Neural Networks with Java

Tools for Building Neural Network Applications

Igor Livshin (Author)

Book | Softcover
566 pages
2019 | 1st ed.
Apress (Publisher)
978-1-4842-4420-3 (ISBN)
CHF 67.35 incl. VAT
A newer edition of this title is available.
Use Java to develop neural network applications in this practical book. After learning the rules involved in neural network processing, you will work through the first neural network example by hand. This covers the internals of forward and backward propagation and helps you understand the main principles of neural network processing. Artificial Neural Networks with Java also teaches you how to prepare data for neural network development and suggests various data-preparation techniques for many unconventional tasks.
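To give a sense of what that manual processing looks like, here is a minimal, generic sketch of a forward pass and one backpropagation weight update for a single sigmoid neuron. The input values, weights, and learning rate are illustrative and are not taken from the book's examples.

// Minimal, generic illustration of a forward pass and one backpropagation
// step for a single sigmoid neuron with two inputs. All values are made up
// and do not come from the book's worked example.
public class SingleNeuronDemo {

    static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    public static void main(String[] args) {
        double[] x = {0.05, 0.10};   // inputs
        double[] w = {0.40, 0.45};   // weights (arbitrary starting values)
        double b = 0.60;             // bias
        double target = 0.01;        // desired output
        double learningRate = 0.5;

        // Forward pass: weighted sum of the inputs followed by the activation function.
        double z = w[0] * x[0] + w[1] * x[1] + b;
        double out = sigmoid(z);
        double error = 0.5 * (target - out) * (target - out);
        System.out.printf("output=%.5f  error=%.5f%n", out, error);

        // Backward pass: the chain rule gives dE/dw_i = (out - target) * out * (1 - out) * x_i.
        double delta = (out - target) * out * (1.0 - out);
        for (int i = 0; i < w.length; i++) {
            w[i] -= learningRate * delta * x[i];
        }
        b -= learningRate * delta;   // the bias is adjusted the same way

        System.out.printf("updated weights: %.5f, %.5f  bias: %.5f%n", w[0], w[1], b);
    }
}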
The next big topic discussed in the book is using Java for neural network processing. You will use the Encog Java framework and discover how to do rapid development with Encog, allowing you to create large-scale neural network applications.
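For a taste of the Encog development style the book builds on, the following sketch trains a small feed-forward network on the classic XOR problem using standard Encog classes (BasicNetwork, BasicLayer, ResilientPropagation). The layer sizes, error threshold, and data are illustrative choices rather than the book's Example 2.

import org.encog.Encog;
import org.encog.engine.network.activation.ActivationSigmoid;
import org.encog.ml.data.MLData;
import org.encog.ml.data.MLDataPair;
import org.encog.ml.data.MLDataSet;
import org.encog.ml.data.basic.BasicMLDataSet;
import org.encog.neural.networks.BasicNetwork;
import org.encog.neural.networks.layers.BasicLayer;
import org.encog.neural.networks.training.propagation.resilient.ResilientPropagation;

public class EncogSketch {

    public static void main(String[] args) {
        // Tiny illustrative dataset (XOR); the book's examples use normalized
        // function-approximation data instead.
        double[][] input = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
        double[][] ideal = {{0}, {1}, {1}, {0}};
        MLDataSet trainingSet = new BasicMLDataSet(input, ideal);

        // Build a 2-3-1 feed-forward network; the layer sizes are arbitrary here.
        BasicNetwork network = new BasicNetwork();
        network.addLayer(new BasicLayer(null, true, 2));                         // input layer
        network.addLayer(new BasicLayer(new ActivationSigmoid(), true, 3));      // hidden layer
        network.addLayer(new BasicLayer(new ActivationSigmoid(), false, 1));     // output layer
        network.getStructure().finalizeStructure();
        network.reset();

        // Train with resilient propagation until the error is small enough.
        ResilientPropagation train = new ResilientPropagation(network, trainingSet);
        int epoch = 1;
        do {
            train.iteration();
            System.out.println("Epoch " + epoch++ + "  error " + train.getError());
        } while (train.getError() > 0.01 && epoch < 5000);
        train.finishTraining();

        // Query the trained network.
        for (MLDataPair pair : trainingSet) {
            MLData output = network.compute(pair.getInput());
            System.out.println(pair.getInput().getData(0) + "," + pair.getInput().getData(1)
                    + " -> " + output.getData(0));
        }
        Encog.getInstance().shutdown();
    }
}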
The book also discusses why conventionally trained neural networks fail to approximate complex non-continuous functions, and it introduces the micro-batch method that solves this issue. The step-by-step approach includes plenty of examples, diagrams, and screenshots to help you grasp the concepts quickly and easily.

What You Will Learn

Prepare your data for many different tasks

Carry out some unusual neural network tasks

Create neural networks that process non-continuous functions

Select and improve the development model  



Who This Book Is For
Intermediate machine learning and deep learning developers who are interested in switching to Java.

Igor Livshin is a senior architect with extensive experience in developing large-scale applications. He worked for many years for two large insurance companies: CNA and Blue Cross & Blue Shield of Illinois. He currently works as a senior researcher at DevTechnologies, specializing in AI and neural networks. Igor has a master’s degree in computer science from the Institute of Technology in Odessa, Russia/Ukraine.

Part One. Getting Started with Neural Networks

Chapter 1. Learning Neural Networks
  Biological and Artificial Neurons · Activation Functions · Summary

Chapter 2. Internal Mechanism of Neural Network Processing
  Function to Be Approximated · Network Architecture · Forward-Pass Calculations · Back-Propagation Pass Calculations · Function Derivative and Function Divergent · Table of Most Commonly Used Function Derivatives · Summary

Chapter 3. Manual Neural Network Processing
  Example 1: Manual Approximation of a Function at a Single Point · Building the Neural Network · Forward-Pass Calculation · Backward-Pass Calculation · Calculating Weight Adjustments for the Output-Layer Neurons · Calculating Weight Adjustments for the Hidden-Layer Neurons · Updating Network Biases · Back to the Forward Pass · Matrix Form of Network Calculation · Digging Deeper · Mini-Batches and Stochastic Gradient · Summary

Part Two. Neural Network Java Development Environment

Chapter 4. Configuring Your Development Environment
  Installing the Java 8 Environment on Your Windows Machine · Installing the NetBeans IDE · Installing the Encog Java Framework · Installing the XChart Package · Summary

Chapter 5. Neural Network Development Using the Java Encog Framework
  Example 2: Function Approximation Using the Java Environment · Network Architecture · Normalizing the Input Datasets · Building the Java Program That Normalizes Both Datasets · Program Code · Debugging and Executing the Program · Processing Results for the Training Method · Testing the Network · Testing Results · Digging Deeper · Summary

Part Three. Developing Non-Trivial Neural Network Applications

Chapter 6. Neural Network Prediction Outside the Training Range
  Example 3a: Approximating Periodic Functions Outside the Training Range · Network Architecture for Example 3a · Program Code for Example 3a · Testing the Network · Example 3b: The Correct Way of Approximating Periodic Functions Outside the Training Range · Preparing the Training Data · Network Architecture for Example 3b · Program Code for Example 3b · Training Results for Example 3b · Testing Results for Example 3b · Summary

Chapter 7. Processing Complex Periodic Functions
  Example 4: Approximation of a Complex Periodic Function · Data Preparation · Reflecting Function Topology in the Data · Network Architecture · Program Code · Testing the Network · Digging Deeper · Summary

Chapter 8. Approximating Non-Continuous Functions
  Example 5: Approximating Non-Continuous Functions · Approximating a Non-Continuous Function Using the Conventional Network Process · Network Architecture · Program Code · Code Fragments for the Training Process · Unsatisfactory Training Results · Approximating the Non-Continuous Function Using the Micro-Batch Method · Program Code for Micro-Batch Processing · Program Code for the getChart() Method · Code Fragment 1 of the Training Method · Code Fragment 2 of the Training Method · Training Results for the Micro-Batch Method · Test Processing Logic · Testing Results for the Micro-Batch Method · Digging Deeper · Summary

Chapter 9. Approximating Continuous Functions with Complex Topology
  Example 5a: Approximation of a Continuous Function with Complex Topology · Network Architecture for Example 5a · Program Code for Example 5a · Training Processing Results for Example 5a · Approximation of a Continuous Function with Complex Topology Using the Micro-Batch Method · Program Code for Example 5a Using the Micro-Batch Method · Example 5b: Approximation of Spiral-Like Functions · Network Architecture for Example 5b · Program Code for Example 5b · Approximation of the Same Functions Using the Micro-Batch Method · Summary

Chapter 10. Using Neural Networks for Classification of Objects
  Example 6: Classification of Records · Training Dataset · Network Architecture · Testing Dataset · Program Code for Data Normalization · Program Code for Classification · Training Results · Testing Results · Summary

Chapter 11. Importance of Selecting a Correct Model
  Example 7: Predicting the Next-Month Stock Market Price · Data Preparation · Including Function Topology in the Dataset · Building Micro-Batch Files · Network Architecture · Program Code · Training Process · Training Results · Testing Process · Test Processing Logic · Testing Results · Analyzing Testing Results · Summary

Chapter 12. Approximation of Functions in 3-D Space
  Example 8: Approximation of Functions in 3-D Space · Data Preparation · Network Architecture · Program Code · Processing Results · Summary

Publication date 2019
Additional information 95 illustrations, black and white; XIX, 566 pages
Place of publication Berkeley
Language English
Dimensions 178 x 254 mm
Weight 1104 g
Subject area Computer Science > Programming Languages / Tools > Java
Computer Science > Theory / Studies > Artificial Intelligence / Robotics
Keywords AI • Artificial Intelligence • Code • computing • Data Preparation • Deep learning • Encog • Java • Methodology • Neural Network Architecture • Neural Network Processing • Neural networks • programming • source
ISBN-10 1-4842-4420-6 / 1484244206
ISBN-13 978-1-4842-4420-3 / 9781484244203
Condition New