Points/C-coin required: 50 · Uploaded 2017-07-11 22:34:54 · 14.54 MB · PDF

3. Who Should Read This Book?
   1. The Enterprise Machine Learning Practitioner
   2. The Enterprise Executive
   3. The Academic
4. Conventions Used in This Book
5. Using Code Examples
6. Administrative Notes
7. O'Reilly Safari
8. How to Contact Us
9. Acknowledgements
   1. Josh's Acknowledgements
   2. Adam's Acknowledgements

1. A Review of Machine Learning
   1. The Learning Machines
      1. How Can Machines Learn?
      2. Biological Inspiration
      3. What Is Deep Learning?
      4. Going Down the Rabbit Hole
   2. Framing the Questions
   3. The Math Behind Machine Learning: Linear Algebra
      1. Scalars
      2. Vectors
      3. Matrices
      4. Tensors
      5. Hyperplanes
      6. Relevant Mathematical Operations
      7. Converting Data into Vectors
      8. Solving Systems of Equations
   4. The Math Behind Machine Learning: Statistics
      1. Probability
      2. Conditional Probabilities
      3. Posterior Probability
      4. Distributions
      5. Samples Versus Population
      6. Resampling Methods
      7. Selection Bias
      8. Likelihood
   5. How Does Machine Learning Work?
      1. Regression
      2. Classification
      3. Clustering
      4. Underfitting and Overfitting
      5. Optimization
      6. Convex Optimization
      7. Gradient Descent
      8. Stochastic Gradient Descent
      9. Quasi-Newton Optimization Methods
      10. Generative Versus Discriminative Models
   6. Logistic Regression
      1. The Logistic Function
      2. Understanding Logistic Regression Output
   7. Evaluating Models
      1. The Confusion Matrix
   8. Building an Understanding of Machine Learning

2. Foundations of Neural Networks and Deep Learning
   1. Neural Networks
      1. The Biological Neuron
      2. The Perceptron
      3. Multilayer Feed-Forward Networks
   2. Training Neural Networks
      1. Backpropagation Learning
   3. Activation Functions
      1. Linear
      2. Sigmoid
      3. Tanh
      4. Hard Tanh
      5. Softmax
      6. Rectified Linear
   4. Loss Functions
      1. Loss Function Notation
      2. Loss Functions for Regression
      3. Loss Functions for Classification
      4. Loss Functions for Reconstruction
   5. Hyperparameters
      1. Learning Rate
      2. Regularization
      3. Momentum
      4. Sparsity

3. Fundamentals of Deep Networks
   1. Defining Deep Learning
      1. What Is Deep Learning?
      2. Organization of This Chapter
   2. Common Architectural Principles of Deep Networks
      1. Parameters
      2. Layers
      3. Activation Functions
      4. Loss Functions
      5. Optimization Algorithms
      6. Hyperparameters
      7. Summary
   3. Building Blocks of Deep Networks
      1. Restricted Boltzmann Machines
      2. Autoencoders
      3. Variational Autoencoders

4. Major Architectures of Deep Networks
   1. Unsupervised Pretrained Networks
      1. Deep Belief Networks
      2. Generative Adversarial Networks
   2. Convolutional Neural Networks
      1. Biological Inspiration
      2. Intuition
      3. Convolutional Network Architecture Overview
      4. Input Layers
      5. Convolutional Layers
      6. Pooling Layers
      7. Fully Connected Layers
      8. Other Applications of Convolutional Networks
      9. Convolutional Network Architectures of Note
      10. Summary
   3. Recurrent Neural Networks
      1. Modeling the Time Dimension
      2. 3D Volumetric Input
      3. Why Not Markov Models?
      4. General Recurrent Network Architecture
      5. Long Short-Term Memory (LSTM) Networks
      6. Domain-Specific Applications and Blended Networks
   4. Recursive Neural Networks
      1. Network Architecture
      2. Varieties of Recursive Neural Networks
      3. Applications of Recursive Neural Networks
   5. Summary and Discussion
      1. Will Deep Learning Make Other Algorithms Obsolete?
      2. Different Problems Have Different Best Methods
      3. When Do I Need Deep Learning?

5. Building Deep Networks
   1. Matching Deep Networks to the Right Problem
      1. Columnar Data and Multilayer Perceptrons
      2. Images and Convolutional Neural Networks
      3. Time-series Sequences and Recurrent Neural Networks
      4. Using Hybrid Networks
   2. The DL4J Suite of Tools
      1. Vectorization and DataVec
      2. Runtimes and ND4J
   3. Basic Concepts of the DL4J API
      1. Loading and Saving Models
      2. Getting Input for the Model
      3. Setting Up Model Architecture
      4. Training and Evaluation
   4. Modeling CSV Data with Multilayer Perceptron Networks
      1. Setting Up Input Data
      2. Determining Network Architecture
      3. Training the Model
      4. Evaluating the Model
   5. Modeling Handwritten Images with Convolutional Neural Networks
      1. Java Code Listing for the LeNet Convolutional Network
      2. Loading and Vectorizing the Input Images
      3. Network Architecture for LeNet in DL4J
      4. Training the Convolutional Network
   6. Modeling Sequence Data with Recurrent Neural Networks
      1. Generating Shakespeare with LSTMs
      2. Classifying Sensor Time-series Sequences with LSTMs
   7. Using Autoencoders for Anomaly Detection
      1. Java Code Listing for Autoencoder Example
      2. Setting Up Input Data
      3. Autoencoder Network Architecture and Training
      4. Evaluating the Model
   8. Using Variational Autoencoders to Reconstruct MNIST Digits
      1. Code Listing to Reconstruct MNIST Digits
      2. Examining the VAE Model
   9. Applications of Deep Learning in Natural Language Processing
      1. Learning Word Embeddings with Word2Vec
      2. Distributed Representations of Sentences with Paragraph Vectors
      3. Using Paragraph Vectors for Document Classification

6. Tuning Deep Networks
   1. Basic Concepts in Tuning Deep Networks
      1. An Intuition for Building Deep Networks
      2. Building the Intuition as a Step-by-Step Process
   2. Matching Input Data and Network Architectures
      1. Summary
   3. Relating Model Goal and Output Layers
      1. Regression Model Output Layer
      2. Classification Model Output Layer
   4. Working with Layer Count, Parameter Count, and Memory
      1. Feed-Forward Multilayer Networks
      2. Controlling Layer and Parameter Counts
      3. Estimating Network Memory Requirements
   5. Weight Initialization Strategies
   6. Using Activation Functions
      1. Summary Table for Activation Functions
   7. Applying Loss Functions
   8. Understanding Learning Rates
      1. Using the Ratio of Parameters to Updates
      2. Specific Recommendations for Learning Rates
   9. How Sparsity Affects Learning
   10. Applying Methods of Optimization
      1. Stochastic Gradient Descent Best Practices
   11. Leveraging Parallelization and GPUs for Faster Training
      1. Online Learning and Parallel Iterative Algorithms
      2. Parallelizing Stochastic Gradient Descent in DL4J
      3. GPUs
   12. Controlling Epochs and Mini-Batch Size
      1. Understanding Mini-Batch Size Trade-offs
   13. How to Use Regularization
      1. Priors as Regularizers
      2. Max-Norm Regularization
      3. Dropout
      4. Other Topics in Regularization
   14. Working with Class Imbalance
      1. Methods for Sampling Classes
      2. Weighted Loss Functions
   15. Dealing with Overfitting
   16. Using Network Statistics from the Tuning UI
      1. Detecting Poor Weight Initialization
      2. Detecting Non-shuffled Data
      3. Detecting Issues with Regularization

7. Tuning Specific Deep Network Architectures
   1. Convolutional Neural Networks
      1. Common Convolutional Architectural Patterns
      2. Configuring Convolutional Layers
      3. Configuring Pooling Layers
      4. Transfer Learning
   2. Recurrent Neural Networks
      1. Network Input Data and Input Layers
      2. Output Layers and RnnOutputLayer
      3. Training the Network
      4. Debugging Common Issues with LSTMs
      5. Padding and Masking
      6. Evaluation and Scoring with Masking
      7. Variants of Recurrent Network Architectures
   3. Restricted Boltzmann Machines
      1. Hidden Units and Modeling Available Information
      2. Leveraging Different Units
      3. Using Regularization with RBMs
   4. Deep Belief Networks
      1. Using Momentum
      2. Using Regularization
      3. Determining Hidden Unit Count

8. Vectorization
   1. Introduction to Vectorization in Machine Learning
      1. Why Do We Need to Vectorize Data?
      2. Strategies for Dealing with Columnar Raw Data Attributes
      3. Feature Engineering and Normalization Techniques
   2. Using DataVec for ETL and Vectorization
   3. Vectorizing Image Data
      1. Image Data Representation in DL4J
      2. Image Data and Vector Normalization with DataVec
   4. Working with Sequential Data in Vectorization
      1. Major Variations of Sequential Data Sources
      2. Vectorizing Sequential Data with DataVec
   5. Working with Text in Vectorization
      1. Bag of Words
      2. Term Frequency Inverse Document Frequency (TF-IDF)
      3. Word2Vec and Vector Space Model Comparison
   6. Working with Graphs

9. Using Deep Learning and DL4J on Spark
   1. Introduction to Using DL4J with Spark and Hadoop
      1. Operating Spark from the Command Line
   2. Configuring and Tuning Spark Execution
      1. Running Spark on Mesos
      2. Running Spark on YARN
      3. General Spark Tuning Guide
      4. Tuning DL4J Jobs on Spark
   3. Setting Up a Maven POM for Spark and DL4J
      1. A pom.xml File Dependency Template
      2. Setting Up a POM File for CDH 5.x
      3. Setting Up a POM File for HDP 2.4
   4. Troubleshooting Spark and Hadoop
      1. Common Issues with ND4J
   5. DL4J Parallel Execution on Spark
      1. A Minimal Spark Training Example
   6. DL4J API Best Practices for Spark
   7. Multilayer Perceptron Spark Example
      1. Setting Up MLP Network Architecture for Spark
      2. Distributed Training and Model Evaluation
      3. Building and Executing a DL4J Spark Job
   8. Generating Shakespeare Text with Spark and LSTMs
      1. Setting Up the LSTM Network Architecture
      2. Training, Tracking Progress, and Understanding Results
   9. Modeling MNIST with a Convolutional Neural Network on Spark
      1. Configuring the Spark Job and Loading MNIST Data
      2. Setting Up the LeNet CNN Architecture and Training

A. What Is Artificial Intelligence?
   1. The Story So Far
      1. Defining Deep Learning
      2. Defining Artificial Intelligence
   2. What Is Driving Interest Today in Artificial Intelligence?
   3. Winter Is Coming

B. RL4J and Reinforcement Learning
   1. Preliminaries
      1. Markov Decision Process
      2. Terminology
   2. Different Settings
      1. Model-Free
      2. Observation Setting
      3. Single-Player and Adversarial Games
   3. Q-Learning
      1. From Policy to Neural Network
      2. Policy Iteration
      3. Exploration Versus Exploitation
      4. Bellman Equation
      5. Initial State Sampling
      6. Q-Learning Implementation
      7. Modeling Q(s,a)
      8. Experience Replay

Preview: 127 pages — OReilly.Deep.Learning.2017
codemosi: Very expensive in points. Ugh, not worth it.
dzmne: This is a draft version; not worth downloading. Wasted my points.
