001445676 000__ 06017cam\a2200589Ii\4500
001445676 001__ 1445676
001445676 003__ OCoLC
001445676 005__ 20230310003844.0
001445676 006__ m\\\\\o\\d\\\\\\\\
001445676 007__ cr\cn\nnnunnun
001445676 008__ 220405s2022\\\\nyua\\\\ob\\\\001\0\eng\d
001445676 019__ $$a1306528775$$a1308394592$$a1308796002$$a1312916752
001445676 020__ $$a9781484280201$$q(electronic bk.)
001445676 020__ $$a1484280202$$q(electronic bk.)
001445676 020__ $$z9781484280195
001445676 020__ $$z1484280199
001445676 0247_ $$a10.1007/978-1-4842-8020-1$$2doi
001445676 035__ $$aSP(OCoLC)1308983931
001445676 040__ $$aORMDA$$beng$$erda$$epn$$cORMDA$$dEBLCP$$dYDX$$dOCLCO$$dGZM$$dGW5XE$$dN$T$$dOCLCF$$dUKAHL$$dOCLCQ
001445676 049__ $$aISEA
001445676 050_4 $$aQA76.73.P98$$bM53 2022
001445676 08204 $$a006.31$$223
001445676 1001_ $$aMichelucci, Umberto,$$eauthor.
001445676 24510 $$aApplied deep learning with TensorFlow 2 :$$blearn to implement advanced deep learning techniques with Python /$$cUmberto Michelucci.
001445676 250__ $$a2nd ed.
001445676 264_1 $$aNew York, NY :$$bApress,$$c[2022]
001445676 300__ $$a1 online resource (397 pages) :$$billustrations
001445676 336__ $$atext$$btxt$$2rdacontent
001445676 337__ $$acomputer$$bc$$2rdamedia
001445676 338__ $$aonline resource$$bcr$$2rdacarrier
001445676 504__ $$aIncludes bibliographical references and index.
001445676 5050_ $$aIntro -- Table of Contents -- About the Author -- About the Contributing Author -- About the Technical Reviewer -- Acknowledgments -- Foreword -- Introduction -- Chapter 1: Optimization and Neural Networks -- A Basic Understanding of Neural Networks -- The Problem of Learning -- A First Definition of Learning -- [Advanced Section] Assumption in the Formulation -- A Definition of Learning for Neural Networks -- Constrained vs. Unconstrained Optimization -- [Advanced Section] Reducing a Constrained Problem to an Unconstrained Optimization Problem -- Absolute and Local Minima of a Function
001445676 5058_ $$aOptimization Algorithms -- Line Search and Trust Region -- Steepest Descent -- The Gradient Descent Algorithm -- Choosing the Right Learning Rate -- Variations of GD -- Mini-Batch GD -- Stochastic GD -- How to Choose the Right Mini-Batch Size -- [Advanced Section] SGD and Fractals -- Exercises -- Conclusion -- Chapter 2: Hands-on with a Single Neuron -- A Short Overview of a Neuron's Structure -- A Short Introduction to Matrix Notation -- An Overview of the Most Common Activation Functions -- Identity Function -- Sigmoid Function -- Tanh (Hyperbolic Tangent) Activation Function
001445676 5058_ $$aReLU (Rectified Linear Unit) Activation Function -- Leaky ReLU -- The Swish Activation Function -- Other Activation Functions -- How to Implement a Neuron in Keras -- Python Implementation Tips: Loops and NumPy -- Linear Regression with a Single Neuron -- The Dataset for the Real-World Example -- Dataset Splitting -- Linear Regression Model -- Keras Implementation -- The Model's Learning Phase -- Model's Performance Evaluation on Unseen Data -- Logistic Regression with a Single Neuron -- The Dataset for the Classification Problem -- Dataset Splitting -- The Logistic Regression Model
001445676 5058_ $$aKeras Implementation -- The Model's Learning Phase -- The Model's Performance Evaluation -- Conclusion -- Exercises -- References -- Chapter 3: Feed-Forward Neural Networks -- A Short Review of Network's Architecture and Matrix Notation -- Output of Neurons -- A Short Summary of Matrix Dimensions -- Example: Equations for a Network with Three Layers -- Hyper-Parameters in Fully Connected Networks -- A Short Review of the Softmax Activation Function for Multiclass Classifications -- A Brief Digression: Overfitting -- A Practical Example of Overfitting -- Basic Error Analysis
001445676 5058_ $$aImplementing a Feed-Forward Neural Network in Keras -- Multiclass Classification with Feed-Forward Neural Networks -- The Zalando Dataset for the Real-World Example -- Modifying Labels for the Softmax Function: One-Hot Encoding -- The Feed-Forward Network Model -- Keras Implementation -- Gradient Descent Variations Performances -- Comparing the Variations -- Examples of Wrong Predictions -- Weight Initialization -- Adding Many Layers Efficiently -- Advantages of Additional Hidden Layers -- Comparing Different Networks -- Tips for Choosing the Right Network
001445676 506__ $$aAccess limited to authorized users.
001445676 520__ $$aUnderstand how neural networks work and learn how to implement them using TensorFlow 2.0 and Keras. This new edition focuses on the fundamental concepts and, at the same time, on the practical aspects of implementing neural networks and deep learning for your research projects. The book is designed so that you can focus on the parts you are interested in. You will explore topics such as regularization, optimizers, optimization, metric analysis, and hyper-parameter tuning. In addition, you will learn the fundamental ideas behind autoencoders and generative adversarial networks. All the code presented in the book is available in the form of Jupyter notebooks, which allow you to try out all the examples and extend them in interesting ways. A companion online book is available with the complete code for all the examples discussed in the book, as well as additional material related to TensorFlow and Keras. All the code is available in Jupyter notebook format and can be opened directly in Google Colab (no need to install anything locally) or downloaded to your own machine and tested locally.
001445676 650_0 $$aPython (Computer program language)
001445676 650_0 $$aMachine learning.
001445676 650_0 $$aNeural networks (Computer science)
001445676 650_6 $$aPython (Langage de programmation)
001445676 650_6 $$aApprentissage automatique.
001445676 650_6 $$aRéseaux neuronaux (Informatique)
001445676 655_0 $$aElectronic books.
001445676 77608 $$iPrint version: $$z1484280199$$z9781484280195$$w(OCoLC)1289363782
001445676 852__ $$bebk
001445676 85640 $$3Springer Nature$$uhttps://univsouthin.idm.oclc.org/login?url=https://link.springer.com/10.1007/978-1-4842-8020-1$$zOnline Access$$91397441.1
001445676 909CO $$ooai:library.usi.edu:1445676$$pGLOBAL_SET
001445676 980__ $$aBIB
001445676 980__ $$aEBOOK
001445676 982__ $$aEbook
001445676 983__ $$aOnline
001445676 994__ $$a92$$bISE