Table of Contents
Intro; Table of Contents; About the Author; About the Technical Reviewer; Acknowledgments; Introduction; Chapter 1: Introduction and Development Environment Setup; GitHub Repository and Companion Website; Mathematical Level Required; Python Development Environment; Google Colab; Benefits and Drawbacks to Google Colab; Anaconda; Installing TensorFlow the Anaconda Way; Local Jupyter Notebooks; Benefits and Drawbacks to Anaconda; Docker Image; Benefits and Drawbacks to a Docker Image; Which Option Should You Choose?; Chapter 2: TensorFlow: Advanced Topics; TensorFlow Eager Execution
Enabling Eager Execution; Polynomial Fitting with Eager Execution; MNIST Classification with Eager Execution; TensorFlow and NumPy Compatibility; Hardware Acceleration; Checking the Availability of the GPU; Device Names; Explicit Device Placement; GPU Acceleration Demonstration: Matrix Multiplication; Effect of GPU Acceleration on the MNIST Example; Training Only Specific Layers; Training Only Specific Layers: An Example; Removing Layers; Keras Callback Functions; Custom Callback Class; Example of a Custom Callback Class; Save and Load Models; Save Your Weights Manually; Saving the Entire Model
Dataset Abstraction; Iterating Over a Dataset; Simple Batching; Simple Batching with the MNIST Dataset; Using tf.data.Dataset in Eager Execution Mode; Conclusions; Chapter 3: Fundamentals of Convolutional Neural Networks; Kernels and Filters; Convolution; Examples of Convolution; Pooling; Padding; Building Blocks of a CNN; Convolutional Layers; Pooling Layers; Stacking Layers Together; Number of Weights in a CNN; Convolutional Layer; Pooling Layer; Dense Layer; Example of a CNN: MNIST Dataset; Visualization of CNN Learning; Brief Digression: keras.backend.function(); Effect of Kernels
Effect of Max-Pooling; Chapter 4: Advanced CNNs and Transfer Learning; Convolution with Multiple Channels; History and Basics of Inception Networks; Inception Module: Naïve Version; Number of Parameters in the Naïve Inception Module; Inception Module with Dimension Reduction; Multiple Cost Functions: GoogLeNet; Example of Inception Modules in Keras; Digression: Custom Losses in Keras; How To Use Pre-Trained Networks; Transfer Learning: An Introduction; A Dog and Cat Problem; Classical Approach to Transfer Learning; Experimentation with Transfer Learning
Chapter 5: Cost Functions and Style Transfer; Components of a Neural Network Model; Training Seen as an Optimization Problem; A Concrete Example: Linear Regression; The Cost Function; Mathematical Notation; Typical Cost Functions; Mean Square Error; Intuitive Explanation; MSE as the Second Moment of a Moment-Generating Function; Cross-Entropy; Self-Information or Surprisal of an Event; Surprisal Associated with an Event X; Cross-Entropy; Cross-Entropy for Binary Classification; Cost Functions: A Final Word; Neural Style Transfer; The Mathematics Behind NST; An Example of Style Transfer in Keras