Applied deep learning : a case-based approach to understanding deep neural networks /
Author / Creator: | Michelucci, Umberto, author. |
---|---|
Imprint: | [United States] : Apress, 2018; New York, NY : Distributed to the book trade worldwide by Springer, ©2018 |
Description: | 1 online resource |
Language: | English |
Subject: | Machine learning; Neural networks (Computer science) |
Format: | E-Resource Book |
URL for this record: | http://pi.lib.uchicago.edu/1001/cat/bib/11705969 |
MARC
LEADER | 00000cam a2200000Ii 4500 | ||
---|---|---|---|
001 | 11705969 | ||
006 | m o d | ||
007 | cr cnu|||unuuu | ||
008 | 180914s2018 xxu o 001 0 eng d | ||
005 | 20240509200316.6 | ||
016 | 7 | |a 019052491 |2 Uk | |
019 | |a 1052874565 |a 1060592622 |a 1081251957 |a 1086560493 |a 1103257493 |a 1105178023 |a 1105710161 | ||
020 | |a 9781484237908 |q (electronic bk.) | ||
020 | |a 1484237900 |q (electronic bk.) | ||
020 | |a 9781484237915 |q (print) | ||
020 | |a 1484237919 | ||
020 | |z 9781484237892 |q (print) | ||
020 | |z 1484237897 | ||
024 | 7 | |a 10.1007/978-1-4842-3790-8 |2 doi | |
024 | 8 | |a 10.1007/978-1-4842-3 | |
027 | |a SPRINTER | ||
035 | |a (OCoLC)1052566493 |z (OCoLC)1052874565 |z (OCoLC)1060592622 |z (OCoLC)1081251957 |z (OCoLC)1086560493 |z (OCoLC)1103257493 |z (OCoLC)1105178023 |z (OCoLC)1105710161 | ||
035 | 9 | |a (OCLCCM-CC)1052566493 | |
037 | |a com.springer.onix.9781484237908 |b Springer Nature | ||
040 | |a N$T |b eng |e rda |e pn |c N$T |d N$T |d GW5XE |d EBLCP |d NLE |d YDX |d OCLCF |d MOQ |d UAB |d UPM |d UKMGB |d OTZ |d LVT |d OCLCQ |d U3W |d VT2 |d CAUOI |d LEAUB |d MERER |d COO |d UKAHL |d LQU |d FVL |d OCLCQ |d SRU | ||
049 | |a MAIN | ||
050 | 4 | |a Q325.5 | |
072 | 7 | |a COM |x 000000 |2 bisacsh | |
072 | 7 | |a UMA |2 bicssc | |
072 | 7 | |a UMA |2 thema | |
100 | 1 | |a Michelucci, Umberto, |e author. | |
245 | 1 | 0 | |a Applied deep learning : |b a case-based approach to understanding deep neural networks / |c Umberto Michelucci. |
264 | 1 | |a [United States] : |b Apress, |c 2018. | |
264 | 2 | |a New York, NY : |b Distributed to the book trade worldwide by Springer | |
264 | 4 | |c ©2018 | |
300 | |a 1 online resource | ||
336 | |a text |b txt |2 rdacontent | ||
337 | |a computer |b c |2 rdamedia | ||
338 | |a online resource |b cr |2 rdacarrier | ||
347 | |a text file |b PDF |2 rda | ||
500 | |a Includes index. | ||
588 | 0 | |a Online resource; title from PDF title page (EBSCO, viewed September 17, 2018). | |
505 | 0 | |a Intro; Table of Contents; About the Author; About the Technical Reviewer; Acknowledgments; Introduction; Chapter 1: Computational Graphs and TensorFlow; How to Set Up Your Python Environment; Creating an Environment; Installing TensorFlow; Jupyter Notebooks; Basic Introduction to TensorFlow; Computational Graphs; Tensors; Creating and Running a Computational Graph; Computational Graph with tf.constant; Computational Graph with tf.Variable; Computational Graph with tf.placeholder; Differences Between run and eval; Dependencies Between Nodes; Tips on How to Create and Close a Session. | |
505 | 8 | |a Chapter 2: Single Neuron; The Structure of a Neuron; Matrix Notation; Python Implementation Tip: Loops and NumPy; Activation Functions; Identity Function; Sigmoid Function; Tanh (Hyperbolic Tangent Activation) Function; ReLU (Rectified Linear Unit) Activation Function; Leaky ReLU; Swish Activation Function; Other Activation Functions; Cost Function and Gradient Descent: The Quirks of the Learning Rate; Learning Rate in a Practical Example; Example of Linear Regression in tensorflow; Dataset for Our Linear Regression Model; Neuron and Cost Function for Linear Regression. | |
505 | 8 | |a Satisficing and Optimizing a Metric; Example of Logistic Regression; Cost Function; Activation Function; The Dataset; tensorflow Implementation; References; Chapter 3: Feedforward Neural Networks; Network Architecture; Output of Neurons; Summary of Matrix Dimensions; Example: Equations for a Network with Three Layers; Hyperparameters in Fully Connected Networks; softmax Function for Multiclass Classification; A Brief Digression: Overfitting; A Practical Example of Overfitting; Basic Error Analysis; The Zalando Dataset; Building a Model with tensorflow; Network Architecture. | |
505 | 8 | |a Modifying Labels for the softmax Function-One-Hot Encoding; The tensorflow Model; Gradient Descent Variations; Batch Gradient Descent; Stochastic Gradient Descent; Mini-Batch Gradient Descent; Comparison of the Variations; Examples of Wrong Predictions; Weight Initialization; Adding Many Layers Efficiently; Advantages of Additional Hidden Layers; Comparing Different Networks; Tips for Choosing the Right Network; Chapter 4: Training Neural Networks; Dynamic Learning Rate Decay; Iterations or Epochs?; Staircase Decay; Step Decay; Inverse Time Decay; Exponential Decay; Natural Exponential Decay. | |
505 | 8 | |a Tensorflow Implementation; Applying the Methods to the Zalando Dataset; Common Optimizers; Exponentially Weighted Averages; Momentum; RMSProp; Adam; Which Optimizer Should I Use?; Example of Self-Developed Optimizer; Chapter 5: Regularization; Complex Networks and Overfitting; What Is Regularization?; About Network Complexity; ℓp Norm; ℓ2 Regularization; Theory of ℓ2 Regularization; tensorflow Implementation; ℓ1 Regularization; Theory of ℓ1 Regularization and tensorflow Implementation; Are Weights Really Going to Zero?; Dropout; Early Stopping; Additional Methods; Chapter 6: Metric Analysis. | |
520 | |a Work with advanced topics in deep learning, such as optimization algorithms, hyper-parameter tuning, dropout, and error analysis, as well as strategies to address typical problems encountered when training deep neural networks. You'll begin by studying activation functions, mostly with a single neuron (ReLU, sigmoid, and Swish), seeing how to perform linear and logistic regression using TensorFlow, and choosing the right cost function. The next section covers more complicated neural network architectures with several layers and neurons and explores the problem of random initialization of weights. An entire chapter is dedicated to a complete overview of neural network error analysis, giving examples of solving problems originating from variance, bias, overfitting, and datasets coming from different distributions. Applied Deep Learning also discusses how to implement logistic regression completely from scratch without using any Python library except NumPy, to let you appreciate how libraries such as TensorFlow allow quick and efficient experiments. Case studies for each method are included to put all theoretical information into practice. You'll discover tips and tricks for writing optimized Python code (for example, vectorizing loops with NumPy). What You Will Learn: Implement advanced techniques in the right way in Python and TensorFlow; debug and optimize advanced methods (such as dropout and regularization); carry out error analysis (to recognize whether you have a bias problem, a variance problem, a data offset problem, and so on); set up a machine learning project focused on deep learning on a complex dataset. Who This Book Is For: Readers with a medium understanding of machine learning, linear algebra, calculus, and basic Python programming. | ||
650 | 0 | |a Machine learning. |0 http://id.loc.gov/authorities/subjects/sh85079324 | |
650 | 0 | |a Neural networks (Computer science) |0 http://id.loc.gov/authorities/subjects/sh90001937 | |
650 | 7 | |a Programming & scripting languages: general. |2 bicssc | |
650 | 7 | |a Computer programming |x software development. |2 bicssc | |
650 | 7 | |a Databases. |2 bicssc | |
650 | 7 | |a Program concepts |x learning to program. |2 bicssc | |
650 | 7 | |a COMPUTERS |x General. |2 bisacsh | |
650 | 7 | |a Machine learning. |2 fast |0 (OCoLC)fst01004795 | |
650 | 7 | |a Neural networks (Computer science) |2 fast |0 (OCoLC)fst01036260 | |
655 | 0 | |a Electronic books. | |
655 | 4 | |a Electronic books. | |
776 | 0 | 8 | |i Print version: |a Michelucci, Umberto. |t Applied deep learning. |d [United States] : Apress, 2018 |z 1484237897 |z 9781484237892 |w (OCoLC)1037808590 |
903 | |a HeVa | ||
929 | |a oclccm | ||
999 | f | f | |i dda2c180-db04-59e1-9318-ea5998d65541 |s 5d91e3fc-96a1-52d7-bb4f-1a1cba91053f |
928 | |t Library of Congress classification |a Q325.5 |l Online |c UC-FullText |u https://link.springer.com/10.1007/978-1-4842-3790-8 |z Springer Nature |g ebooks |i 12556769 |