Translated with Google Translate (corrections welcome)
# 1. Copyright statement
Please respect the author's intellectual property rights and copyright; piracy will be investigated. Forwarding content without permission is strictly forbidden! Please help maintain and oversee the results of this work.
2018.6.27 TanJiyong
# 2. Overview
This project integrates relevant AI knowledge through collective brainstorming to form a comprehensive collection of articles.
# 3. Join and document specifications
1. We are seeking friends, editors, and writers willing to keep improving this book; if you are interested in cooperating, help improve it (and become a co-author).
2. All contributors who submit content will have their personal information reflected in the text (e.g., Daxie - West Lake University).
3. To make the content more complete and well-considered through brainstorming, you are welcome to fork the project and participate in writing. When modifying the MD files, please note your name and affiliation (e.g., Dayu - Stanford University), or send a direct message. Once adopted, the contributor's information will be shown in the text. Thank you!
4. We recommend the Typora Markdown editor: https://typora.io/
Settings:
File -> Preferences
- Syntax Support: check the following options:
  - Inline Math
  - Subscript
  - Superscript
  - Highlight
  - Diagrams
Example:
```markdown
### 3.3.2 How to find the optimal values of hyperparameters? (Contributor: Daxie - Stanford University)
When using machine learning algorithms there are always some hyperparameters that are difficult to tune, for example the weight decay coefficient, the Gaussian kernel width, and so on. The algorithm does not set these parameters itself; instead, it requires you to set their values, and the chosen values have a large effect on the results. Common practices for setting hyperparameters are:
1. Guess and check: select parameters based on experience or intuition, and iterate.
2. Grid search: have the computer try a set of values evenly distributed over a given range.
3. Random search: have the computer pick a set of values at random.
4. Bayesian optimization: use Bayesian optimization for the hyperparameters; the difficulty is that the Bayesian optimization algorithm itself has parameters of its own.
5. Local optimization with a good initial guess: this is the method used in MITIE, which applies the BOBYQA algorithm with a carefully chosen starting point. Since BOBYQA only finds the nearest local optimum, the success of this method depends largely on whether there is a good starting point. In MITIE's case a good starting point is known, but this is not a universal solution, because usually you will not know where a good starting point is. On the plus side, this approach is well suited to finding a local optimum. I will discuss this later.
6. LIPO, the latest global optimization method: it has no parameters and has been shown to outperform random search.
```
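Practices 2 and 3 above can be sketched in a few lines. This is a minimal illustration, not code from the book: `validation_error` is a hypothetical stand-in for training a model and measuring its validation error for a given hyperparameter pair.

```python
import itertools
import random

def validation_error(lr, width):
    # Hypothetical objective: stands in for training a model with
    # learning rate `lr` and kernel width `width`, then measuring
    # its validation error. The optimum here is (0.01, 2.0).
    return (lr - 0.01) ** 2 + (width - 2.0) ** 2

# 2. Grid search: try values evenly distributed over a range.
lrs = [0.001, 0.005, 0.01, 0.05, 0.1]
widths = [0.5, 1.0, 2.0, 4.0, 8.0]
grid_best = min(itertools.product(lrs, widths),
                key=lambda p: validation_error(*p))

# 3. Random search: sample the same number of points at random.
random.seed(0)
random_points = [(random.uniform(0.001, 0.1), random.uniform(0.5, 8.0))
                 for _ in range(25)]
random_best = min(random_points, key=lambda p: validation_error(*p))

print("grid search best:", grid_best)
print("random search best:", random_best)
```

Both strategies evaluate the same number of candidate points; random search often does better in practice when only a few of the hyperparameters actually matter.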
# 4. Contributions and Project Overview
Submitted MD-version chapters: please see MarkDown.
# 5. More
1. We are seeking friends, editors, and writers willing to keep improving the book; if you are interested in cooperating, help improve it (and become a co-author). All contributors who submit content will have their personal information reflected in the article (e.g., Dalong - West Lake University).
2. Contact: please email scutjy2015@163.com (the only official email); WeChat: Tan.
To join the "Deep Learning 500 Questions" WeChat group, please add WeChat Client 1: HQJ199508212176, Client 2: Xuwumin1203, or Client 3: tianyuzy. (After adding, improving, and submitting MD-version content, it is easier to join the group and share knowledge to help others.)
3. Recommended Markdown reader: https://typora.io/ (free, with good support for mathematical formulas).
4. Note: there are now scammers impersonating the promoters; please inform the other members!
5. Next, the MD version will be provided so that everyone can edit it together, so stay tuned! Suggestions and additions are welcome!
# 6. Contents
**Chapter 1 Mathematical Foundations**
1.1 The relationship between scalars, vectors, and tensors
1.2 What is the difference between a tensor and a matrix?
1.3 The result of matrix and vector multiplication
1.4 Summary of vector and matrix norms
1.5 How to determine whether a matrix is positive definite?
1.6 Calculating derivatives and partial derivatives
1.7 What is the difference between derivatives and partial derivatives?
1.8 Eigenvalue decomposition and eigenvectors
1.9 What is the relationship between singular values and eigenvalues?
1.10 Why does machine learning use probability?
1.11 What is the difference between a variable and a random variable?
1.12 Common probability distributions?
1.13 Understanding conditional probability by example
1.14 What is the difference between joint probability and marginal probability?
1.15 Chain rule of conditional probability
1.16 Independence and conditional independence
1.17 Summary of expectation, variance, covariance, and correlation coefficients
**Chapter 2 Fundamentals of Machine Learning**
2.1 Illustrations of various common algorithms
2.2 Supervised, unsupervised, semi-supervised, and weakly supervised learning?
2.3 What are the steps of supervised learning?
2.4 Multi-instance learning?
2.5 What is the difference between classification networks and regression?
2.6 What is a neural network?
2.7 Advantages and disadvantages of common classification algorithms?
2.8 Is accuracy a good measure for evaluating classification algorithms?
2.9 How to evaluate a classification algorithm?
2.10 What kind of classifier is best?
2.11 The relationship between big data and deep learning
2.12 Understanding local optimization and global optimization
2.13 Understanding logistic regression
2.14 What is the difference between logistic regression and naive Bayes?
2.15 Why do we need a cost function?
2.16 The working principle of the cost function
2.17 Why is the cost function non-negative?
2.18 Common cost functions?
2.19 Why use cross-entropy instead of the quadratic cost function?
2.20 What is a loss function?
2.21 Common loss functions
2.22 Why does logistic regression use a logarithmic loss function? How does the logarithmic loss function measure loss?
2.23 Why is gradient descent needed in machine learning?
2.24 What are the disadvantages of the gradient descent method?
2.25 An intuitive understanding of gradient descent?
2.26 How is the gradient descent algorithm described?
2.27 How to tune the gradient descent method?
2.28 What is the difference between stochastic gradient descent and batch gradient descent?
2.29 Performance comparison of various gradient descent methods
2.30 Computing derivatives with a computation graph?
2.31 Summary of the ideas behind linear discriminant analysis (LDA)
2.32 Graphical illustration of the core LDA ideas
2.33 Principles of the two-class LDA algorithm?
2.34 Summary of the LDA algorithm flow?
2.35 What is the difference between LDA and PCA?
2.36 Advantages and disadvantages of LDA?
2.37 Summary of the ideas behind principal component analysis (PCA)
2.38 Graphical illustration of the core PCA ideas
2.39 Derivation of the PCA algorithm
2.40 Summary of the PCA algorithm flow
2.41 Main advantages and disadvantages of the PCA algorithm
2.42 Necessity and purpose of dimensionality reduction
2.43 What is the difference between KPCA and PCA?
2.44 Model evaluation
2.44.1 Common methods for model evaluation?
2.44.2 Empirical error and generalization error
2.44.3 Illustrating under-fitting and over-fitting
2.44.4 How to solve over-fitting and under-fitting?