Intro
Table of Contents
About the Author
About the Contributing Author
About the Technical Reviewer
Acknowledgments
Foreword
Introduction
Chapter 1: Optimization and Neural Networks
A Basic Understanding of Neural Networks
The Problem of Learning
A First Definition of Learning
[Advanced Section] Assumption in the Formulation
A Definition of Learning for Neural Networks
Constrained vs. Unconstrained Optimization
[Advanced Section] Reducing a Constrained Problem to an Unconstrained Optimization Problem
Absolute and Local Minima of a Function
Optimization Algorithms
Line Search and Trust Region
Steepest Descent
The Gradient Descent Algorithm
Choosing the Right Learning Rate
Variations of GD
Mini-Batch GD
Stochastic GD
How to Choose the Right Mini-Batch Size
[Advanced Section] SGD and Fractals
Exercises
Conclusion
Chapter 2: Hands-on with a Single Neuron
A Short Overview of a Neuron's Structure
A Short Introduction to Matrix Notation
An Overview of the Most Common Activation Functions
Identity Function
Sigmoid Function
Tanh (Hyperbolic Tangent) Activation Function
ReLU (Rectified Linear Unit) Activation Function
Leaky ReLU
The Swish Activation Function
Other Activation Functions
How to Implement a Neuron in Keras
Python Implementation Tips: Loops and NumPy
Linear Regression with a Single Neuron
The Dataset for the Real-World Example
Dataset Splitting
Linear Regression Model
Keras Implementation
The Model's Learning Phase
The Model's Performance Evaluation on Unseen Data
Logistic Regression with a Single Neuron
The Dataset for the Classification Problem
Dataset Splitting
The Logistic Regression Model
Keras Implementation
The Model's Learning Phase
The Model's Performance Evaluation
Conclusion
Exercises
References
Chapter 3: Feed-Forward Neural Networks
A Short Review of the Network's Architecture and Matrix Notation
Output of Neurons
A Short Summary of Matrix Dimensions
Example: Equations for a Network with Three Layers
Hyper-Parameters in Fully Connected Networks
A Short Review of the Softmax Activation Function for Multiclass Classifications
A Brief Digression: Overfitting
A Practical Example of Overfitting
Basic Error Analysis
Implementing a Feed-Forward Neural Network in Keras
Multiclass Classification with Feed-Forward Neural Networks
The Zalando Dataset for the Real-World Example
Modifying Labels for the Softmax Function: One-Hot Encoding
The Feed-Forward Network Model
Keras Implementation
Performance of Gradient Descent Variations
Comparing the Variations
Examples of Wrong Predictions
Weight Initialization
Adding Many Layers Efficiently
Advantages of Additional Hidden Layers
Comparing Different Networks
Tips for Choosing the Right Network