Details

I Introduction
I.1 Who Should Read This Book?
I.2 Book Organization
II Health Data
II.1 The Growth of EHR Adoption
II.2 Health Data
II.2.1 Life Cycle of Health Data
II.2.2 Structured Health Data
II.2.3 Unstructured Clinical Notes
II.2.4 Continuous Signals
II.2.5 Medical Imaging Data
II.2.6 Biomedical Data for In Silico Drug Discovery
II.3 Health Data Standards
III Machine Learning Basics
III.1 Supervised Learning
III.1.1 Logistic Regression
III.1.2 Softmax Regression
III.1.3 Gradient Descent
III.1.4 Stochastic and Minibatch Gradient Descent
III.2 Unsupervised Learning
III.2.1 Principal Component Analysis
III.2.2 t-Distributed Stochastic Neighbor Embedding (t-SNE)
III.2.3 Clustering
III.3 Assessing Model Performance
III.3.1 Evaluation Metrics for Regression Tasks
III.3.2 Evaluation Metrics for Classification Tasks
III.3.3 Evaluation Metrics for Clustering Tasks
III.3.4 Evaluation Strategy
III.4 Modeling Exercise
III.5 Hands-On Practice
IV Deep Neural Networks (DNN)
IV.1 A Single Neuron
IV.1.1 Activation Function
IV.1.2 Loss Function
IV.1.3 Train a Single Neuron
IV.2 Multilayer Neural Network
IV.2.1 Network Representation
IV.2.2 Train a Multilayer Neural Network
IV.2.3 Summary of the Backpropagation Algorithm
IV.2.4 Parameters and Hyperparameters
IV.3 Readmission Prediction from EHR Data with DNN
IV.4 DNN for Drug Property Prediction
V Embedding
V.1 Overview
V.2 Word2Vec
V.2.1 Idea and Formulation of Word2Vec
V.2.2 Healthcare Application of Word2Vec
V.3 Med2Vec: Two-Level Embedding for EHR
V.3.1 Med2Vec Method
V.4 MiME: Embed Internal Structure
V.4.1 Notations of MiME
V.4.2 Description of MiME
V.4.3 Experiment Results of MiME
VI Convolutional Neural Networks (CNN)
VI.1 CNN Intuition
VI.2 Architecture of CNN
VI.2.1 Convolution Layer: 1D
VI.2.2 Convolution Layer: 2D
VI.2.3 Pooling Layer
VI.2.4 Fully Connected Layer
VI.3 Backpropagation Algorithm in CNN*
VI.3.1 Forward and Backward Computation for 1-D Data
VI.3.2 Forward Computation and Backpropagation for 2-D Convolution Layer
VI.3.3 Special CNN Architecture
VI.4 Healthcare Applications
VI.5 Automated Surveillance of Cranial Images for Acute Neurologic Events
VI.6 Detection of Lymph Node Metastases from Pathology Images
VI.7 Cardiologist-Level Arrhythmia Detection and Classification in Ambulatory ECG
VII Recurrent Neural Networks (RNN)
VII.1 Basic Concepts and Notations
VII.2 Backpropagation Through Time (BPTT) Algorithm
VII.2.1 Forward Pass
VII.2.2 Backward Pass
VII.3 RNN Variants
VII.3.1 Long Short-Term Memory (LSTM)
VII.3.2 Gated Recurrent Unit (GRU)
VII.3.3 Bidirectional RNN
VII.3.4 Encoder-Decoder Sequence-to-Sequence Models
VII.4 Case Study: Early Detection of Heart Failure
VII.5 Case Study: Sequential Clinical Event Prediction
VII.6 Case Study: De-identification of Clinical Notes
VII.7 Case Study: Automatic Detection of Heart Disease from Electrocardiography (ECG) Data
VIII Autoencoders (AE)
VIII.1 Overview
VIII.2 Autoencoders
VIII.3 Sparse Autoencoders
VIII.4 Stacked Autoencoders
VIII.5 Denoising Autoencoders
VIII.6 Case Study: Deep Patient via Stacked Denoising Autoencoders
VIII.7 Case Study: Learning from Noisy, Sparse, and Irregular Clinical Data
IX Attention Models
IX.1 Overview
IX.2 Attention Mechanism
IX.2.1 Attention Based on Encoder-Decoder RNN Models
IX.2.2 Case Study: Attention Model over Longitudinal EHR
IX.2.3 Case Study: Attention Model over a Medical Ontology
IX.2.4 Case Study: ICD Classification from Clinical Notes
X Memory Networks
X.1 Original Memory Networks
X.2 End-to-End Memory Networks
X.3 Case Study: Medication Recommendation
X.4 EEG-RelNet: Memory Derived from Data
X.5 Incorporating Memory from an Unstructured Knowledge Base
XI Graph Neural Networks
XI.1 Overview
XI.2 Graph Convolutional Networks
XI.2.1 Basic Setting of GCN
XI.2.2 Spatial Convolution on Graphs
XI.2.3 Spectral Convolution on Graphs
XI.2.4 Approximate Graph Convolution
XI.2.5 Neighborhood Aggregation
XI.3 Neural Fingerprinting: Drug Molecule Embedding with GCN
XI.4 Decagon: Modeling Polypharmacy Side Effects with GCN
XI.5 Case Study: Multiview Drug-Drug Interaction Prediction
XII Generative Models
XII.1 Generative Adversarial Networks (GAN)
XII.1.1 The GAN Framework
XII.1.2 The Cost Function of the Discriminator
XII.1.3 The Cost Function of the Generator
XII.2 Variational Autoencoders (VAE)
XII.2.1 Latent Variable Models
XII.2.2 Objective Formulation
XII.2.3 Objective Approximation
XII.2.4 Reparameterization Trick
XII.3 Case Study: Generating Patient Records
XII.4 Case Study: Small Molecule Generation for Drug Discovery
XIII Conclusion
XIII.1 Model Setup
XIII.2 Model Training
XIII.3 Testing and Performance Evaluation
XIII.4 Result Visualization
XIII.5 Case Studies
XIV Appendix
XIV.1 Regularization*
XIV.1.1 Vanishing or Exploding Gradient Problem
XIV.1.2 Dropout
XIV.1.3 Batch Normalization
XIV.2 Stochastic Gradient Descent and Minibatch Gradient Descent*
XIV.3 Advanced Optimization*
XIV.3.1 Momentum
XIV.3.2 Adagrad, Adadelta, and RMSprop
XIV.3.3 Adam
